Jan 13 20:40:28.747434 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 18:58:40 -00 2025
Jan 13 20:40:28.747451 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 20:40:28.747457 kernel: Disabled fast string operations
Jan 13 20:40:28.747462 kernel: BIOS-provided physical RAM map:
Jan 13 20:40:28.747466 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 13 20:40:28.747470 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 13 20:40:28.747476 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 13 20:40:28.747480 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 13 20:40:28.747485 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 13 20:40:28.747489 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 13 20:40:28.747493 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 13 20:40:28.747497 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 13 20:40:28.747502 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 13 20:40:28.747506 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 13 20:40:28.747512 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 13 20:40:28.747517 kernel: NX (Execute Disable) protection: active
Jan 13 20:40:28.747522 kernel: APIC: Static calls initialized
Jan 13 20:40:28.747527 kernel: SMBIOS 2.7 present.
Jan 13 20:40:28.747531 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 13 20:40:28.747536 kernel: vmware: hypercall mode: 0x00
Jan 13 20:40:28.747541 kernel: Hypervisor detected: VMware
Jan 13 20:40:28.747546 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 13 20:40:28.747552 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 13 20:40:28.747557 kernel: vmware: using clock offset of 6416934363 ns
Jan 13 20:40:28.747562 kernel: tsc: Detected 3408.000 MHz processor
Jan 13 20:40:28.747567 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 20:40:28.747572 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 20:40:28.747577 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 13 20:40:28.747582 kernel: total RAM covered: 3072M
Jan 13 20:40:28.747587 kernel: Found optimal setting for mtrr clean up
Jan 13 20:40:28.747592 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 13 20:40:28.747597 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 13 20:40:28.747603 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 20:40:28.747608 kernel: Using GB pages for direct mapping
Jan 13 20:40:28.747613 kernel: ACPI: Early table checksum verification disabled
Jan 13 20:40:28.747618 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 13 20:40:28.747623 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 13 20:40:28.747628 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 13 20:40:28.747633 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 13 20:40:28.747638 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:40:28.747646 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:40:28.747651 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 13 20:40:28.747656 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 13 20:40:28.747662 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 13 20:40:28.747667 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 13 20:40:28.747672 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 13 20:40:28.747678 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 13 20:40:28.747684 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 13 20:40:28.747689 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 13 20:40:28.747694 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:40:28.747699 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:40:28.747704 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 13 20:40:28.747709 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 13 20:40:28.747714 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 13 20:40:28.747719 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 13 20:40:28.747726 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 13 20:40:28.747731 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 13 20:40:28.747736 kernel: system APIC only can use physical flat
Jan 13 20:40:28.747741 kernel: APIC: Switched APIC routing to: physical flat
Jan 13 20:40:28.747747 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 20:40:28.747752 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 20:40:28.747757 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 20:40:28.747762 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 20:40:28.747767 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 20:40:28.747772 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 20:40:28.747778 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 20:40:28.747783 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 20:40:28.747788 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 13 20:40:28.747793 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 13 20:40:28.747798 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 13 20:40:28.747803 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 13 20:40:28.747808 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 13 20:40:28.747813 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 13 20:40:28.747818 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 13 20:40:28.747823 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 13 20:40:28.747829 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 13 20:40:28.747834 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 13 20:40:28.747840 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 13 20:40:28.747844 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 13 20:40:28.747849 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 13 20:40:28.747855 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 13 20:40:28.747860 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 13 20:40:28.747865 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 13 20:40:28.747870 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 13 20:40:28.747875 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 13 20:40:28.747881 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 13 20:40:28.747886 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 13 20:40:28.747891 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 13 20:40:28.747896 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 13 20:40:28.747901 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 13 20:40:28.747906 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 13 20:40:28.747911 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 13 20:40:28.747916 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 13 20:40:28.747921 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 13 20:40:28.747926 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 13 20:40:28.747932 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 13 20:40:28.747937 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 13 20:40:28.747942 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 13 20:40:28.747947 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 13 20:40:28.747952 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 13 20:40:28.747957 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 13 20:40:28.747962 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 13 20:40:28.747967 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 13 20:40:28.747972 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 13 20:40:28.747977 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 13 20:40:28.747983 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 13 20:40:28.747989 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 13 20:40:28.747993 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 13 20:40:28.747999 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 13 20:40:28.748003 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 13 20:40:28.748008 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 13 20:40:28.748014 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 13 20:40:28.748018 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 13 20:40:28.748024 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 13 20:40:28.748028 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 13 20:40:28.748035 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 13 20:40:28.748040 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 13 20:40:28.748045 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 13 20:40:28.748055 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 13 20:40:28.748060 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 13 20:40:28.748066 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 13 20:40:28.748071 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 13 20:40:28.748076 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 13 20:40:28.748082 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 13 20:40:28.748088 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 13 20:40:28.748094 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 13 20:40:28.748099 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 13 20:40:28.748105 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 13 20:40:28.748111 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 13 20:40:28.748116 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 13 20:40:28.748121 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 13 20:40:28.748127 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 13 20:40:28.748132 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 13 20:40:28.748138 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 13 20:40:28.748144 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 13 20:40:28.748149 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 13 20:40:28.748155 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 13 20:40:28.748166 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 13 20:40:28.748172 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 13 20:40:28.748177 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 13 20:40:28.748183 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 13 20:40:28.748188 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 13 20:40:28.748193 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 13 20:40:28.748199 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 13 20:40:28.748205 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 13 20:40:28.748211 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 13 20:40:28.748216 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 13 20:40:28.748221 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 13 20:40:28.748227 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 13 20:40:28.748232 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 13 20:40:28.748237 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 13 20:40:28.748243 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 13 20:40:28.748248 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 13 20:40:28.748253 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 13 20:40:28.748260 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 13 20:40:28.748265 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 13 20:40:28.748271 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 13 20:40:28.748276 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 13 20:40:28.748281 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 13 20:40:28.748286 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 13 20:40:28.748291 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 13 20:40:28.748297 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 13 20:40:28.748302 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 13 20:40:28.748308 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 13 20:40:28.748313 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 13 20:40:28.748319 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 13 20:40:28.748325 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 13 20:40:28.748330 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 13 20:40:28.748336 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 13 20:40:28.748341 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 13 20:40:28.748346 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 13 20:40:28.748351 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 13 20:40:28.748357 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 13 20:40:28.748362 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 13 20:40:28.748368 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 13 20:40:28.748374 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 13 20:40:28.748380 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 13 20:40:28.748385 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 13 20:40:28.748390 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 13 20:40:28.748396 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 13 20:40:28.748401 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 13 20:40:28.748406 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 13 20:40:28.748411 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 13 20:40:28.748417 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 13 20:40:28.748422 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 13 20:40:28.748429 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 13 20:40:28.748434 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 13 20:40:28.748440 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 13 20:40:28.748445 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 13 20:40:28.748451 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 13 20:40:28.748456 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 13 20:40:28.748462 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 13 20:40:28.748467 kernel: Zone ranges:
Jan 13 20:40:28.748473 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 13 20:40:28.748479 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 13 20:40:28.748485 kernel: Normal empty
Jan 13 20:40:28.748490 kernel: Movable zone start for each node
Jan 13 20:40:28.748496 kernel: Early memory node ranges
Jan 13 20:40:28.748501 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 13 20:40:28.748507 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 13 20:40:28.748512 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 13 20:40:28.748518 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 13 20:40:28.748523 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 13 20:40:28.748529 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 13 20:40:28.748535 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 13 20:40:28.748541 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 13 20:40:28.748546 kernel: system APIC only can use physical flat
Jan 13 20:40:28.748552 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 13 20:40:28.748557 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 13 20:40:28.748562 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jan 13 20:40:28.748568 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Jan 13 20:40:28.748573 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Jan 13 20:40:28.748579 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Jan 13 20:40:28.748584 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Jan 13 20:40:28.748591 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Jan 13 20:40:28.748596 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Jan 13 20:40:28.748601 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Jan 13 20:40:28.748607 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Jan 13 20:40:28.748612 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Jan 13 20:40:28.748618 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Jan 13 20:40:28.748623 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Jan 13 20:40:28.748629 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Jan 13 20:40:28.748634 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Jan 13 20:40:28.748640 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Jan 13 20:40:28.748646 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Jan 13 20:40:28.748651 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Jan 13 20:40:28.748657 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Jan 13 20:40:28.748662 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Jan 13 20:40:28.748667 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Jan 13 20:40:28.748673 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Jan 13 20:40:28.748678 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Jan 13 20:40:28.748683 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Jan 13 20:40:28.748689 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Jan 13 20:40:28.748696 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Jan 13 20:40:28.748701 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Jan 13 20:40:28.748707 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Jan 13 20:40:28.748712 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Jan 13 20:40:28.748717 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Jan 13 20:40:28.748723 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Jan 13 20:40:28.748728 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Jan 13 20:40:28.748733 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Jan 13 20:40:28.748739 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Jan 13 20:40:28.748745 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Jan 13 20:40:28.748751 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Jan 13 20:40:28.748756 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Jan 13 20:40:28.748762 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Jan 13 20:40:28.748767 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Jan 13 20:40:28.748772 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Jan 13 20:40:28.748778 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Jan 13 20:40:28.748784 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Jan 13 20:40:28.748789 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Jan 13 20:40:28.748795 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Jan 13 20:40:28.748801 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Jan 13 20:40:28.748807 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Jan 13 20:40:28.748812 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Jan 13 20:40:28.748817 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Jan 13 20:40:28.748823 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Jan 13 20:40:28.748828 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Jan 13 20:40:28.748834 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Jan 13 20:40:28.748839 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Jan 13 20:40:28.748845 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Jan 13 20:40:28.748851 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Jan 13 20:40:28.748857 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Jan 13 20:40:28.748862 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Jan 13 20:40:28.748867 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Jan 13 20:40:28.748873 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Jan 13 20:40:28.748878 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Jan 13 20:40:28.748884 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Jan 13 20:40:28.748889 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Jan 13 20:40:28.748894 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Jan 13 20:40:28.748901 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Jan 13 20:40:28.748906 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Jan 13 20:40:28.748912 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Jan 13 20:40:28.748917 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Jan 13 20:40:28.748923 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Jan 13 20:40:28.748928 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Jan 13 20:40:28.748933 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Jan 13 20:40:28.748939 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Jan 13 20:40:28.748945 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Jan 13 20:40:28.748950 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Jan 13 20:40:28.748957 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Jan 13 20:40:28.748962 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Jan 13 20:40:28.748968 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Jan 13 20:40:28.748973 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Jan 13 20:40:28.748978 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Jan 13 20:40:28.748984 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Jan 13 20:40:28.748989 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Jan 13 20:40:28.748995 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Jan 13 20:40:28.749000 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Jan 13 20:40:28.749007 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Jan 13 20:40:28.749012 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Jan 13 20:40:28.749017 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Jan 13 20:40:28.749023 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Jan 13 20:40:28.749028 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Jan 13 20:40:28.749034 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Jan 13 20:40:28.749039 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Jan 13 20:40:28.749045 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Jan 13 20:40:28.749050 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Jan 13 20:40:28.749055 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Jan 13 20:40:28.749062 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Jan 13 20:40:28.749068 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Jan 13 20:40:28.749073 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Jan 13 20:40:28.749078 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Jan 13 20:40:28.749084 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Jan 13 20:40:28.749090 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Jan 13 20:40:28.749095 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Jan 13 20:40:28.749100 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Jan 13 20:40:28.749106 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Jan 13 20:40:28.749112 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Jan 13 20:40:28.749118 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Jan 13 20:40:28.749123 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Jan 13 20:40:28.749128 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Jan 13 20:40:28.749134 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Jan 13 20:40:28.749139 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Jan 13 20:40:28.749145 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Jan 13 20:40:28.749151 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Jan 13 20:40:28.749156 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Jan 13 20:40:28.749170 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Jan 13 20:40:28.749177 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Jan 13 20:40:28.749183 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Jan 13 20:40:28.749188 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Jan 13 20:40:28.749194 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Jan 13 20:40:28.749199 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Jan 13 20:40:28.749204 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Jan 13 20:40:28.749210 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Jan 13 20:40:28.749215 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Jan 13 20:40:28.749221 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Jan 13 20:40:28.749227 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Jan 13 20:40:28.749233 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Jan 13 20:40:28.749238 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Jan 13 20:40:28.749244 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Jan 13 20:40:28.749249 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Jan 13 20:40:28.749255 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Jan 13 20:40:28.749260 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Jan 13 20:40:28.749265 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Jan 13 20:40:28.749271 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Jan 13 20:40:28.749276 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Jan 13 20:40:28.749283 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 13 20:40:28.749289 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Jan 13 20:40:28.749295 kernel: TSC deadline timer available
Jan 13 20:40:28.749300 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Jan 13 20:40:28.749306 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Jan 13 20:40:28.749312 kernel: Booting paravirtualized kernel on VMware hypervisor
Jan 13 20:40:28.749317 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 13 20:40:28.749323 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Jan 13 20:40:28.749328 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 13 20:40:28.749335 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 13 20:40:28.749341 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Jan 13 20:40:28.749346 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Jan 13 20:40:28.749351 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Jan 13 20:40:28.749357 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Jan 13 20:40:28.749362 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Jan 13 20:40:28.749375 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Jan 13 20:40:28.749382 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Jan 13 20:40:28.749388 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Jan 13 20:40:28.749395 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Jan 13 20:40:28.749400 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Jan 13 20:40:28.749406 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Jan 13 20:40:28.749412 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Jan 13 20:40:28.749417 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Jan 13 20:40:28.749423 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Jan 13 20:40:28.749429 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Jan 13 20:40:28.749434 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Jan 13 20:40:28.749442 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 20:40:28.749448 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 13 20:40:28.749454 kernel: random: crng init done
Jan 13 20:40:28.749459 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Jan 13 20:40:28.749465 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Jan 13 20:40:28.749471 kernel: printk: log_buf_len min size: 262144 bytes
Jan 13 20:40:28.749477 kernel: printk: log_buf_len: 1048576 bytes
Jan 13 20:40:28.749483 kernel: printk: early log buf free: 239648(91%)
Jan 13 20:40:28.749490 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 13 20:40:28.749496 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 13 20:40:28.749501 kernel: Fallback order for Node 0: 0
Jan 13 20:40:28.749507 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Jan 13 20:40:28.749513 kernel: Policy zone: DMA32
Jan 13 20:40:28.749519 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 13 20:40:28.749525 kernel: Memory: 1934320K/2096628K available (14336K kernel code, 2299K rwdata, 22800K rodata, 43320K init, 1756K bss, 162048K reserved, 0K cma-reserved)
Jan 13 20:40:28.749532 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Jan 13 20:40:28.749538 kernel: ftrace: allocating 37890 entries in 149 pages
Jan 13 20:40:28.749543 kernel: ftrace: allocated 149 pages with 4 groups
Jan 13 20:40:28.749549 kernel: Dynamic Preempt: voluntary
Jan 13 20:40:28.749555 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 13 20:40:28.749561 kernel: rcu: RCU event tracing is enabled.
Jan 13 20:40:28.749567 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Jan 13 20:40:28.749573 kernel: Trampoline variant of Tasks RCU enabled.
Jan 13 20:40:28.749580 kernel: Rude variant of Tasks RCU enabled.
Jan 13 20:40:28.749586 kernel: Tracing variant of Tasks RCU enabled.
Jan 13 20:40:28.749592 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 13 20:40:28.749598 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Jan 13 20:40:28.749603 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Jan 13 20:40:28.749609 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Jan 13 20:40:28.749615 kernel: Console: colour VGA+ 80x25
Jan 13 20:40:28.749621 kernel: printk: console [tty0] enabled
Jan 13 20:40:28.749627 kernel: printk: console [ttyS0] enabled
Jan 13 20:40:28.749632 kernel: ACPI: Core revision 20230628
Jan 13 20:40:28.749640 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Jan 13 20:40:28.749646 kernel: APIC: Switch to symmetric I/O mode setup
Jan 13 20:40:28.749652 kernel: x2apic enabled
Jan 13 20:40:28.749657 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 13 20:40:28.749663 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jan 13 20:40:28.749669 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Jan 13 20:40:28.749675 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Jan 13 20:40:28.749681 kernel: Disabled fast string operations
Jan 13 20:40:28.749687 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Jan 13 20:40:28.749694 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Jan 13 20:40:28.749700 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 13 20:40:28.749706 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Jan 13 20:40:28.749712 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Jan 13 20:40:28.749719 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 13 20:40:28.749725 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 13 20:40:28.749731 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 13 20:40:28.749736 kernel: RETBleed: Mitigation: Enhanced IBRS
Jan 13 20:40:28.749742 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 13 20:40:28.749750 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 13 20:40:28.749756 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Jan 13 20:40:28.749762 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 13 20:40:28.749768 kernel: GDS: Unknown: Dependent on hypervisor status
Jan 13 20:40:28.749774 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 13 20:40:28.749779 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 13 20:40:28.749785 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 13 20:40:28.749791 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 13 20:40:28.749798 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 13 20:40:28.749804 kernel: Freeing SMP alternatives memory: 32K
Jan 13 20:40:28.749810 kernel: pid_max: default: 131072 minimum: 1024
Jan 13 20:40:28.749816 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 13 20:40:28.749822 kernel: landlock: Up and running.
Jan 13 20:40:28.749828 kernel: SELinux: Initializing.
Jan 13 20:40:28.749834 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 20:40:28.749840 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 20:40:28.749845 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Jan 13 20:40:28.749853 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:40:28.749859 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:40:28.749864 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:40:28.749870 kernel: Performance Events: Skylake events, core PMU driver.
Jan 13 20:40:28.749876 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Jan 13 20:40:28.749882 kernel: core: CPUID marked event: 'instructions' unavailable
Jan 13 20:40:28.749888 kernel: core: CPUID marked event: 'bus cycles' unavailable
Jan 13 20:40:28.749893 kernel: core: CPUID marked event: 'cache references' unavailable
Jan 13 20:40:28.749899 kernel: core: CPUID marked event: 'cache misses' unavailable
Jan 13 20:40:28.749906 kernel: core: CPUID marked event: 'branch instructions' unavailable
Jan 13 20:40:28.749912 kernel: core: CPUID marked event: 'branch misses' unavailable
Jan 13 20:40:28.749917 kernel: ... version: 1
Jan 13 20:40:28.749923 kernel: ... bit width: 48
Jan 13 20:40:28.749929 kernel: ... generic registers: 4
Jan 13 20:40:28.749935 kernel: ... value mask: 0000ffffffffffff
Jan 13 20:40:28.749940 kernel: ...
max period: 000000007fffffff Jan 13 20:40:28.749946 kernel: ... fixed-purpose events: 0 Jan 13 20:40:28.749952 kernel: ... event mask: 000000000000000f Jan 13 20:40:28.749959 kernel: signal: max sigframe size: 1776 Jan 13 20:40:28.749965 kernel: rcu: Hierarchical SRCU implementation. Jan 13 20:40:28.749971 kernel: rcu: Max phase no-delay instances is 400. Jan 13 20:40:28.749976 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 20:40:28.749982 kernel: smp: Bringing up secondary CPUs ... Jan 13 20:40:28.749988 kernel: smpboot: x86: Booting SMP configuration: Jan 13 20:40:28.749994 kernel: .... node #0, CPUs: #1 Jan 13 20:40:28.749999 kernel: Disabled fast string operations Jan 13 20:40:28.750005 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 20:40:28.750012 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 20:40:28.750018 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 20:40:28.750024 kernel: smpboot: Max logical packages: 128 Jan 13 20:40:28.750029 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 20:40:28.750035 kernel: devtmpfs: initialized Jan 13 20:40:28.750041 kernel: x86/mm: Memory block size: 128MB Jan 13 20:40:28.750047 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 20:40:28.750053 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 20:40:28.750059 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 20:40:28.750064 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 20:40:28.750071 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 20:40:28.750077 kernel: audit: initializing netlink subsys (disabled) Jan 13 20:40:28.750083 kernel: audit: type=2000 audit(1736800827.073:1): state=initialized audit_enabled=0 res=1 Jan 13 20:40:28.750089 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 20:40:28.750095 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 20:40:28.750100 kernel: cpuidle: using governor menu Jan 13 20:40:28.750106 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 20:40:28.750112 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 20:40:28.750118 kernel: dca service started, version 1.12.1 Jan 13 20:40:28.750125 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 20:40:28.750131 kernel: PCI: Using configuration type 1 for base access Jan 13 20:40:28.750137 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 20:40:28.750143 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 20:40:28.750148 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 20:40:28.751011 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 20:40:28.751022 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 20:40:28.751028 kernel: ACPI: Added _OSI(Module Device) Jan 13 20:40:28.751036 kernel: ACPI: Added _OSI(Processor Device) Jan 13 20:40:28.751042 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 20:40:28.751048 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 20:40:28.751054 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 20:40:28.751059 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 20:40:28.751065 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 20:40:28.751071 kernel: ACPI: Interpreter enabled Jan 13 20:40:28.751077 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 20:40:28.751083 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 20:40:28.751089 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 20:40:28.751096 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 20:40:28.751102 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 20:40:28.751108 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 20:40:28.751220 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 20:40:28.751303 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 20:40:28.751355 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 20:40:28.751364 kernel: PCI host bridge to bus 0000:00 Jan 13 20:40:28.751417 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 20:40:28.751463 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 20:40:28.751507 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 20:40:28.751551 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 20:40:28.751595 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 20:40:28.751639 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 20:40:28.751698 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 20:40:28.751757 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 20:40:28.751811 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 20:40:28.751868 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 20:40:28.751919 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 20:40:28.752031 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 20:40:28.752107 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 20:40:28.752220 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 20:40:28.752275 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 20:40:28.752329 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 20:40:28.752379 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 20:40:28.752429 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 20:40:28.752482 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 20:40:28.752535 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 20:40:28.752585 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 20:40:28.752638 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 20:40:28.752688 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 20:40:28.752737 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 20:40:28.752785 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 20:40:28.752859 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 20:40:28.752918 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 20:40:28.752972 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 20:40:28.753029 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753083 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753138 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753201 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753263 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753318 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753385 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753494 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753836 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753891 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753950 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754001 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 13 20:40:28.754056 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754107 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754197 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754255 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754310 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754365 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754419 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754469 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754523 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754576 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754649 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754699 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754752 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754802 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754854 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754904 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754961 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755010 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755062 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755112 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755217 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755274 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755346 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755395 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755448 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755516 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755585 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755635 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755690 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755741 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755795 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755846 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755899 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755949 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756001 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756054 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756107 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756175 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756263 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756315 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756370 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756424 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756479 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756531 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756586 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756637 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756690 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 13 
20:40:28.756744 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756799 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756849 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756903 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756954 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.757006 kernel: pci_bus 0000:01: extended config space not accessible Jan 13 20:40:28.757061 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:40:28.757113 kernel: pci_bus 0000:02: extended config space not accessible Jan 13 20:40:28.757123 kernel: acpiphp: Slot [32] registered Jan 13 20:40:28.757129 kernel: acpiphp: Slot [33] registered Jan 13 20:40:28.757135 kernel: acpiphp: Slot [34] registered Jan 13 20:40:28.757141 kernel: acpiphp: Slot [35] registered Jan 13 20:40:28.757147 kernel: acpiphp: Slot [36] registered Jan 13 20:40:28.757202 kernel: acpiphp: Slot [37] registered Jan 13 20:40:28.757208 kernel: acpiphp: Slot [38] registered Jan 13 20:40:28.757216 kernel: acpiphp: Slot [39] registered Jan 13 20:40:28.757222 kernel: acpiphp: Slot [40] registered Jan 13 20:40:28.757228 kernel: acpiphp: Slot [41] registered Jan 13 20:40:28.757234 kernel: acpiphp: Slot [42] registered Jan 13 20:40:28.757239 kernel: acpiphp: Slot [43] registered Jan 13 20:40:28.757245 kernel: acpiphp: Slot [44] registered Jan 13 20:40:28.757251 kernel: acpiphp: Slot [45] registered Jan 13 20:40:28.757257 kernel: acpiphp: Slot [46] registered Jan 13 20:40:28.757263 kernel: acpiphp: Slot [47] registered Jan 13 20:40:28.757270 kernel: acpiphp: Slot [48] registered Jan 13 20:40:28.757275 kernel: acpiphp: Slot [49] registered Jan 13 20:40:28.757281 kernel: acpiphp: Slot [50] registered Jan 13 20:40:28.757287 kernel: acpiphp: Slot [51] registered Jan 13 20:40:28.757292 kernel: acpiphp: Slot [52] registered Jan 13 20:40:28.757298 kernel: acpiphp: Slot [53] registered 
Jan 13 20:40:28.757304 kernel: acpiphp: Slot [54] registered Jan 13 20:40:28.757310 kernel: acpiphp: Slot [55] registered Jan 13 20:40:28.757315 kernel: acpiphp: Slot [56] registered Jan 13 20:40:28.757321 kernel: acpiphp: Slot [57] registered Jan 13 20:40:28.757328 kernel: acpiphp: Slot [58] registered Jan 13 20:40:28.757334 kernel: acpiphp: Slot [59] registered Jan 13 20:40:28.757340 kernel: acpiphp: Slot [60] registered Jan 13 20:40:28.757346 kernel: acpiphp: Slot [61] registered Jan 13 20:40:28.757352 kernel: acpiphp: Slot [62] registered Jan 13 20:40:28.757358 kernel: acpiphp: Slot [63] registered Jan 13 20:40:28.757410 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 13 20:40:28.757461 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:40:28.757509 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:40:28.757562 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:40:28.757611 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 13 20:40:28.757660 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 13 20:40:28.757709 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 13 20:40:28.757757 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 13 20:40:28.757806 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 13 20:40:28.757862 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 13 20:40:28.757917 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 13 20:40:28.757968 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 13 20:40:28.758019 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:40:28.758070 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 13 
20:40:28.758121 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 13 20:40:28.758184 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:40:28.758238 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:40:28.758292 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:40:28.758343 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:40:28.758393 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:40:28.758443 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:40:28.758493 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:40:28.758544 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:40:28.758593 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:40:28.758643 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:40:28.758695 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:40:28.758746 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:40:28.758795 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:40:28.758844 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:40:28.758896 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:40:28.758946 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:40:28.758996 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:40:28.759050 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:40:28.759100 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:40:28.759150 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:40:28.759226 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:40:28.759278 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 13 20:40:28.759331 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:40:28.759381 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:40:28.759431 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:40:28.759482 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:40:28.759538 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 13 20:40:28.759594 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 13 20:40:28.759645 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 13 20:40:28.759696 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 13 20:40:28.759751 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 13 20:40:28.759802 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:40:28.759855 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 13 20:40:28.759906 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 13 20:40:28.759957 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 13 20:40:28.760009 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:40:28.760060 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:40:28.760113 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 13 20:40:28.760617 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:40:28.760678 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:40:28.760732 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:40:28.760783 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:40:28.760837 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:40:28.760887 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:40:28.760937 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:40:28.760991 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:40:28.761044 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:40:28.761094 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:40:28.761143 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:40:28.762294 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:40:28.762355 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:40:28.762408 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:40:28.762462 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:40:28.762518 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:40:28.762568 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:40:28.762620 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:40:28.762671 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:40:28.762721 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:40:28.762772 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:40:28.762822 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:40:28.762871 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:40:28.762927 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:40:28.762977 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:40:28.763028 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:40:28.763077 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:40:28.763129 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:40:28.764210 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:40:28.764270 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:40:28.764323 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:40:28.764380 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:40:28.764432 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:40:28.764484 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:40:28.764534 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:40:28.764587 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:40:28.764638 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:40:28.764688 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:40:28.764743 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:40:28.764794 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:40:28.764844 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:40:28.764897 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:40:28.764947 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:40:28.764998 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:40:28.765051 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:40:28.765101 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:40:28.765153 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:40:28.766283 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:40:28.766341 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:40:28.766393 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:40:28.766447 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:40:28.766497 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:40:28.766546 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:40:28.766595 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:40:28.766651 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:40:28.766701 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:40:28.766750 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:40:28.766800 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:40:28.766851 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:40:28.766901 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:40:28.766951 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:40:28.767002 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:40:28.767054 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:40:28.767103 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:40:28.769738 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 
20:40:28.769808 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Jan 13 20:40:28.769863 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Jan 13 20:40:28.769917 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Jan 13 20:40:28.769968 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Jan 13 20:40:28.770018 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Jan 13 20:40:28.770075 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Jan 13 20:40:28.770126 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Jan 13 20:40:28.770598 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Jan 13 20:40:28.770656 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Jan 13 20:40:28.770707 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Jan 13 20:40:28.770757 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Jan 13 20:40:28.770766 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Jan 13 20:40:28.770772 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Jan 13 20:40:28.770781 kernel: ACPI: PCI: Interrupt link LNKB disabled
Jan 13 20:40:28.770787 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 13 20:40:28.770793 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Jan 13 20:40:28.770798 kernel: iommu: Default domain type: Translated
Jan 13 20:40:28.770805 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 13 20:40:28.770811 kernel: PCI: Using ACPI for IRQ routing
Jan 13 20:40:28.770817 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 13 20:40:28.770823 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Jan 13 20:40:28.770829 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Jan 13 20:40:28.770879 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Jan 13 20:40:28.770930 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Jan 13 20:40:28.770979 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 13 20:40:28.770988 kernel: vgaarb: loaded
Jan 13 20:40:28.770994 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Jan 13 20:40:28.771001 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Jan 13 20:40:28.771006 kernel: clocksource: Switched to clocksource tsc-early
Jan 13 20:40:28.771012 kernel: VFS: Disk quotas dquot_6.6.0
Jan 13 20:40:28.771018 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 13 20:40:28.771026 kernel: pnp: PnP ACPI init
Jan 13 20:40:28.773212 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Jan 13 20:40:28.773263 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Jan 13 20:40:28.773309 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Jan 13 20:40:28.773372 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Jan 13 20:40:28.773418 kernel: pnp 00:06: [dma 2]
Jan 13 20:40:28.773466 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Jan 13 20:40:28.773514 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Jan 13 20:40:28.773557 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Jan 13 20:40:28.773565 kernel: pnp: PnP ACPI: found 8 devices
Jan 13 20:40:28.773572 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 13 20:40:28.773578 kernel: NET: Registered PF_INET protocol family
Jan 13 20:40:28.773583 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 13 20:40:28.773589 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 13 20:40:28.773597 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 13 20:40:28.773603 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 13 20:40:28.773609 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 13 20:40:28.773615 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 13 20:40:28.773620 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 13 20:40:28.773626 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 13 20:40:28.773632 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 13 20:40:28.773638 kernel: NET: Registered PF_XDP protocol family
Jan 13 20:40:28.773691 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Jan 13 20:40:28.773745 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 13 20:40:28.773797 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 13 20:40:28.773848 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 13 20:40:28.773898 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 13 20:40:28.773948 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 13 20:40:28.773999 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 13 20:40:28.774052 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 13 20:40:28.774102 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 13 20:40:28.774151 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Jan 13 20:40:28.774236 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Jan 13 20:40:28.774312 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Jan 13 20:40:28.774425 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Jan 13 20:40:28.774483 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Jan 13 20:40:28.774536 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Jan 13 20:40:28.774589 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Jan 13 20:40:28.774641 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Jan 13 20:40:28.774693 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Jan 13 20:40:28.774748 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Jan 13 20:40:28.774801 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Jan 13 20:40:28.774853 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Jan 13 20:40:28.774905 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Jan 13 20:40:28.774957 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Jan 13 20:40:28.775008 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 13 20:40:28.775063 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 13 20:40:28.775115 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.775175 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.775228 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.775280 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.775332 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.775383 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.775435 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.775490 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.775543 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.775594 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.775646 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.775697 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.775750 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.775801 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.775853 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.775907 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.775958 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776009 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776059 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776111 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776190 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776261 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776313 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776367 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776417 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776467 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776516 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776566 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776616 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776666 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776716 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776768 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776817 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776867 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.776917 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.776966 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777016 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777066 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777115 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777254 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777310 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777359 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777409 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777458 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777507 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777556 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777604 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777653 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777702 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777754 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777803 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777851 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777900 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.777950 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.777999 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778047 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778097 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778146 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778246 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778296 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778345 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778394 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778443 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778491 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778540 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778588 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778637 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778690 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778740 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778789 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778837 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778887 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.778936 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.778985 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.779034 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.779084 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.779133 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.779207 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.779259 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.779308 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.779358 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.779408 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.779458 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:40:28.779507 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:40:28.779558 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 13 20:40:28.779610 kernel: pci 0000:00:11.0: PCI bridge to [bus 02]
Jan 13 20:40:28.779663 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Jan 13 20:40:28.779713 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Jan 13 20:40:28.779763 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jan 13 20:40:28.779818 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref]
Jan 13 20:40:28.779869 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Jan 13 20:40:28.779919 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Jan 13 20:40:28.779970 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Jan 13 20:40:28.780019 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 13 20:40:28.780073 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Jan 13 20:40:28.780124 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Jan 13 20:40:28.780232 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Jan 13 20:40:28.780294 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Jan 13 20:40:28.780346 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Jan 13 20:40:28.780395 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Jan 13 20:40:28.780444 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Jan 13 20:40:28.780494 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Jan 13 20:40:28.780543 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Jan 13 20:40:28.780592 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Jan 13 20:40:28.780645 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Jan 13 20:40:28.780694 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Jan 13 20:40:28.780743 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Jan 13 20:40:28.780793 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Jan 13 20:40:28.780845 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Jan 13 20:40:28.780894 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Jan 13 20:40:28.780946 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Jan 13 20:40:28.780996 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Jan 13 20:40:28.781045 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Jan 13 20:40:28.781094 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Jan 13 20:40:28.781144 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Jan 13 20:40:28.781245 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Jan 13 20:40:28.781295 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Jan 13 20:40:28.781349 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref]
Jan 13 20:40:28.781399 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Jan 13 20:40:28.781451 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Jan 13 20:40:28.781501 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Jan 13 20:40:28.781551 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 13 20:40:28.781624 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Jan 13 20:40:28.781677 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Jan 13 20:40:28.781726 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Jan 13 20:40:28.781776 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Jan 13 20:40:28.781826 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Jan 13 20:40:28.781877 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Jan 13 20:40:28.781930 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Jan 13 20:40:28.781980 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Jan 13 20:40:28.782029 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Jan 13 20:40:28.782080 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Jan 13 20:40:28.782130 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Jan 13 20:40:28.782230 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Jan 13 20:40:28.782280 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Jan 13 20:40:28.782329 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Jan 13 20:40:28.782378 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Jan 13 20:40:28.782428 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Jan 13 20:40:28.782481 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Jan 13 20:40:28.782530 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Jan 13 20:40:28.782580 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Jan 13 20:40:28.782630 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Jan 13 20:40:28.782680 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Jan 13 20:40:28.782729 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Jan 13 20:40:28.782779 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Jan 13 20:40:28.782831 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Jan 13 20:40:28.782881 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Jan 13 20:40:28.782934 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Jan 13 20:40:28.782983 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Jan 13 20:40:28.783034 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Jan 13 20:40:28.783083 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Jan 13 20:40:28.783133 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Jan 13 20:40:28.783193 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Jan 13 20:40:28.783245 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Jan 13 20:40:28.783299 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Jan 13 20:40:28.783349 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Jan 13 20:40:28.783399 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Jan 13 20:40:28.783452 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Jan 13 20:40:28.783502 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Jan 13 20:40:28.783551 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Jan 13 20:40:28.783601 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Jan 13 20:40:28.783651 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Jan 13 20:40:28.783700 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Jan 13 20:40:28.783764 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Jan 13 20:40:28.783812 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Jan 13 20:40:28.783861 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Jan 13 20:40:28.783912 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Jan 13 20:40:28.783961 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Jan 13 20:40:28.784009 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Jan 13 20:40:28.784058 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Jan 13 20:40:28.784107 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Jan 13 20:40:28.784155 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Jan 13 20:40:28.784213 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Jan 13 20:40:28.784262 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Jan 13 20:40:28.784319 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Jan 13 20:40:28.784370 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Jan 13 20:40:28.784423 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Jan 13 20:40:28.784471 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Jan 13 20:40:28.784520 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Jan 13 20:40:28.784570 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Jan 13 20:40:28.784619 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Jan 13 20:40:28.784667 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Jan 13 20:40:28.784716 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Jan 13 20:40:28.784765 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Jan 13 20:40:28.784813 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Jan 13 20:40:28.784865 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Jan 13 20:40:28.784914 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Jan 13 20:40:28.784962 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Jan 13 20:40:28.785011 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Jan 13 20:40:28.785096 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Jan 13 20:40:28.785145 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Jan 13 20:40:28.785218 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Jan 13 20:40:28.785268 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Jan 13 20:40:28.785317 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Jan 13 20:40:28.785366 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Jan 13 20:40:28.785418 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Jan 13 20:40:28.785466 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Jan 13 20:40:28.785515 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Jan 13 20:40:28.785564 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window]
Jan 13 20:40:28.785608 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window]
Jan 13 20:40:28.785651 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window]
Jan 13 20:40:28.785694 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window]
Jan 13 20:40:28.785737 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window]
Jan 13 20:40:28.785789 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff]
Jan 13 20:40:28.785835 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff]
Jan 13 20:40:28.785881 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jan 13 20:40:28.785925 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window]
Jan 13 20:40:28.785970 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window]
Jan 13 20:40:28.786015 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window]
Jan 13 20:40:28.786077 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window]
Jan 13 20:40:28.786139 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window]
Jan 13 20:40:28.786215 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff]
Jan 13 20:40:28.786264 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff]
Jan 13 20:40:28.786319 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 13 20:40:28.786369 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff]
Jan 13 20:40:28.786415 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff]
Jan 13 20:40:28.786459 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref]
Jan 13 20:40:28.786510 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff]
Jan 13 20:40:28.786556 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff]
Jan 13 20:40:28.786601 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref]
Jan 13 20:40:28.786650 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff]
Jan 13 20:40:28.786696 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref]
Jan 13 20:40:28.786745 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff]
Jan 13 20:40:28.786791 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref]
Jan 13 20:40:28.786842 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff]
Jan 13 20:40:28.786887 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref]
Jan 13 20:40:28.786936 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff]
Jan 13 20:40:28.786982 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref]
Jan 13 20:40:28.787034 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff]
Jan 13 20:40:28.787089 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref]
Jan 13 20:40:28.787142 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff]
Jan 13 20:40:28.787253 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff]
Jan 13 20:40:28.787300 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 13 20:40:28.787352 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff]
Jan 13 20:40:28.787398 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff]
Jan 13 20:40:28.787444 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref]
Jan 13 20:40:28.787496 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff]
Jan 13 20:40:28.787543 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff]
Jan 13 20:40:28.787591 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref]
Jan 13 20:40:28.787641 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff]
Jan 13 20:40:28.787697 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref]
Jan 13 20:40:28.787749 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff]
Jan 13 20:40:28.787799 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref]
Jan 13 20:40:28.787849 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff]
Jan 13 20:40:28.787896 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref]
Jan 13 20:40:28.787945 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff]
Jan 13 20:40:28.787992 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref]
Jan 13 20:40:28.788042 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff]
Jan 13 20:40:28.788089 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref]
Jan 13 20:40:28.788143 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff]
Jan 13 20:40:28.788251 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff]
Jan 13 20:40:28.788298 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref]
Jan 13 20:40:28.788350 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff]
Jan 13 20:40:28.788399 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff]
Jan 13 20:40:28.788445 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref]
Jan 13 20:40:28.788498 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff]
Jan 13 20:40:28.788544 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff]
Jan 13 20:40:28.788589 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref]
Jan 13 20:40:28.788639 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff]
Jan 13 20:40:28.788686 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref]
Jan 13 20:40:28.788735 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff]
Jan 13 20:40:28.788781 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref]
Jan 13 20:40:28.788833 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff]
Jan 13 20:40:28.788879 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref]
Jan 13 20:40:28.788929 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff]
Jan 13 20:40:28.788976 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref]
Jan 13 20:40:28.789026 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff]
Jan 13 20:40:28.789072 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref]
Jan 13 20:40:28.789126 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff]
Jan 13 20:40:28.789198 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff]
Jan 13 20:40:28.789247 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref]
Jan 13 20:40:28.789300 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff]
Jan 13 20:40:28.789347 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff]
Jan 13 20:40:28.789393 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref]
Jan 13 20:40:28.789446 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff]
Jan 13 20:40:28.789493 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref]
Jan 13 20:40:28.789556 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff]
Jan 13 20:40:28.789602 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref]
Jan 13 20:40:28.789651 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff]
Jan 13 20:40:28.789697 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref]
Jan 13 20:40:28.789748 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff]
Jan 13 20:40:28.789794 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref]
Jan 13 20:40:28.789842 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff]
Jan 13 20:40:28.789888 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref]
Jan 13 20:40:28.789936 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff]
Jan 13 20:40:28.789982 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref]
Jan 13 20:40:28.790037 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 13 20:40:28.790047 kernel: PCI: CLS 32 bytes, default 64
Jan 13 20:40:28.790054 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Jan 13 20:40:28.790060 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Jan 13 20:40:28.790067 kernel: clocksource: Switched to clocksource tsc
Jan 13 20:40:28.790073 kernel: Initialise system trusted keyrings
Jan 13 20:40:28.790080 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Jan 13 20:40:28.790086 kernel: Key type asymmetric registered
Jan 13 20:40:28.790092 kernel: Asymmetric key parser 'x509' registered
Jan 13 20:40:28.790100 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Jan 13 20:40:28.790106 kernel: io scheduler mq-deadline registered
Jan 13 20:40:28.790112 kernel: io scheduler kyber registered
Jan 13 20:40:28.790118 kernel: io scheduler bfq registered
Jan 13 20:40:28.790198 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24
Jan 13 20:40:28.790252 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.790303 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25
Jan 13 20:40:28.790353 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.790424 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26
Jan 13 20:40:28.790475 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.790540 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27
Jan 13 20:40:28.790591 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.790641 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28
Jan 13 20:40:28.790690 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.790743 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29
Jan 13 20:40:28.790793 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.790843 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30
Jan 13 20:40:28.790893 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.790942 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31
Jan 13 20:40:28.790994 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791044 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32
Jan 13 20:40:28.791094 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791144 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33
Jan 13 20:40:28.791225 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791276 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34
Jan 13 20:40:28.791326 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791379 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35
Jan 13 20:40:28.791428 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791496 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36
Jan 13 20:40:28.791546 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791595 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37
Jan 13 20:40:28.791646 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791699 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38
Jan 13 20:40:28.791749 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791800 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39
Jan 13 20:40:28.791851 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.791902 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40
Jan 13 20:40:28.791955 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792006 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41
Jan 13 20:40:28.792056 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792106 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42
Jan 13 20:40:28.792156 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792241 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43
Jan 13 20:40:28.792292 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792346 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44
Jan 13 20:40:28.792396 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792449 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45
Jan 13 20:40:28.792500 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792550 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46
Jan 13 20:40:28.792604 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792654 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47
Jan 13 20:40:28.792705 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792756 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48
Jan 13 20:40:28.792806 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.792857 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49
Jan 13 20:40:28.792940 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.793006 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50
Jan 13 20:40:28.793056 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.793106 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51
Jan 13 20:40:28.793188 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.793247 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52
Jan 13 20:40:28.793299 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.793348 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53
Jan 13 20:40:28.793399 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.793448 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54
Jan 13 20:40:28.793497 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.793548 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55
Jan 13 20:40:28.793597 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 13 20:40:28.793606 kernel: ioatdma: Intel(R) QuickData Technology Driver
5.00 Jan 13 20:40:28.793613 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 20:40:28.793619 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 20:40:28.793626 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 20:40:28.793632 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 20:40:28.793640 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 20:40:28.793691 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 20:40:28.793737 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T20:40:28 UTC (1736800828) Jan 13 20:40:28.793783 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 20:40:28.793792 kernel: intel_pstate: CPU model not supported Jan 13 20:40:28.793798 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 20:40:28.793804 kernel: NET: Registered PF_INET6 protocol family Jan 13 20:40:28.793811 kernel: Segment Routing with IPv6 Jan 13 20:40:28.793817 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 20:40:28.793825 kernel: NET: Registered PF_PACKET protocol family Jan 13 20:40:28.793831 kernel: Key type dns_resolver registered Jan 13 20:40:28.793837 kernel: IPI shorthand broadcast: enabled Jan 13 20:40:28.793844 kernel: sched_clock: Marking stable (923277504, 247162490)->(1236792580, -66352586) Jan 13 20:40:28.793850 kernel: registered taskstats version 1 Jan 13 20:40:28.793856 kernel: Loading compiled-in X.509 certificates Jan 13 20:40:28.793862 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: ede78b3e719729f95eaaf7cb6a5289b567f6ee3e' Jan 13 20:40:28.793868 kernel: Key type .fscrypt registered Jan 13 20:40:28.793874 kernel: Key type fscrypt-provisioning registered Jan 13 20:40:28.793881 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 20:40:28.793888 kernel: ima: Allocated hash algorithm: sha1 Jan 13 20:40:28.793894 kernel: ima: No architecture policies found Jan 13 20:40:28.793900 kernel: clk: Disabling unused clocks Jan 13 20:40:28.793906 kernel: Freeing unused kernel image (initmem) memory: 43320K Jan 13 20:40:28.793912 kernel: Write protecting the kernel read-only data: 38912k Jan 13 20:40:28.793918 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K Jan 13 20:40:28.793924 kernel: Run /init as init process Jan 13 20:40:28.793931 kernel: with arguments: Jan 13 20:40:28.793938 kernel: /init Jan 13 20:40:28.793944 kernel: with environment: Jan 13 20:40:28.793949 kernel: HOME=/ Jan 13 20:40:28.793955 kernel: TERM=linux Jan 13 20:40:28.793961 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 20:40:28.793969 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:40:28.793977 systemd[1]: Detected virtualization vmware. Jan 13 20:40:28.793986 systemd[1]: Detected architecture x86-64. Jan 13 20:40:28.793992 systemd[1]: Running in initrd. Jan 13 20:40:28.793998 systemd[1]: No hostname configured, using default hostname. Jan 13 20:40:28.794004 systemd[1]: Hostname set to . Jan 13 20:40:28.794010 systemd[1]: Initializing machine ID from random generator. Jan 13 20:40:28.794016 systemd[1]: Queued start job for default target initrd.target. Jan 13 20:40:28.794023 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:40:28.794029 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 20:40:28.794037 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 20:40:28.794044 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:40:28.794050 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 20:40:28.794057 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 20:40:28.794064 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 20:40:28.794071 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 20:40:28.794077 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:40:28.794085 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:40:28.794091 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:40:28.794097 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:40:28.794103 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:40:28.794110 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:40:28.794116 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:40:28.794122 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:40:28.794129 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 20:40:28.794135 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 20:40:28.794143 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:40:28.794149 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:40:28.794156 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 20:40:28.794184 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:40:28.794191 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 20:40:28.794197 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:40:28.794204 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 20:40:28.794210 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 20:40:28.794218 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:40:28.794224 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:40:28.794230 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:40:28.794248 systemd-journald[216]: Collecting audit messages is disabled. Jan 13 20:40:28.794265 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 20:40:28.794272 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:40:28.794278 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 20:40:28.794285 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 20:40:28.794292 kernel: Bridge firewalling registered Jan 13 20:40:28.794299 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:40:28.794306 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:40:28.794312 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:40:28.794319 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:40:28.794325 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:40:28.794332 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 13 20:40:28.794339 systemd-journald[216]: Journal started Jan 13 20:40:28.794355 systemd-journald[216]: Runtime Journal (/run/log/journal/b838563f16134b0b8965585a0146e2f7) is 4.8M, max 38.6M, 33.8M free. Jan 13 20:40:28.752129 systemd-modules-load[217]: Inserted module 'overlay' Jan 13 20:40:28.776229 systemd-modules-load[217]: Inserted module 'br_netfilter' Jan 13 20:40:28.798511 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:40:28.798530 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:40:28.801520 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 20:40:28.809282 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:40:28.809555 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:40:28.809787 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:40:28.810732 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 20:40:28.815601 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:40:28.818880 dracut-cmdline[248]: dracut-dracut-053 Jan 13 20:40:28.821327 dracut-cmdline[248]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5 Jan 13 20:40:28.822044 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 13 20:40:28.837223 systemd-resolved[254]: Positive Trust Anchors: Jan 13 20:40:28.837231 systemd-resolved[254]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 20:40:28.837255 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:40:28.838841 systemd-resolved[254]: Defaulting to hostname 'linux'. Jan 13 20:40:28.839450 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:40:28.839597 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:40:28.865185 kernel: SCSI subsystem initialized Jan 13 20:40:28.870191 kernel: Loading iSCSI transport class v2.0-870. Jan 13 20:40:28.877190 kernel: iscsi: registered transport (tcp) Jan 13 20:40:28.889499 kernel: iscsi: registered transport (qla4xxx) Jan 13 20:40:28.889530 kernel: QLogic iSCSI HBA Driver Jan 13 20:40:28.909031 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 20:40:28.911269 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 20:40:28.926513 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 13 20:40:28.926544 kernel: device-mapper: uevent: version 1.0.3 Jan 13 20:40:28.927630 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 13 20:40:28.960174 kernel: raid6: avx2x4 gen() 47531 MB/s Jan 13 20:40:28.975172 kernel: raid6: avx2x2 gen() 52973 MB/s Jan 13 20:40:28.992373 kernel: raid6: avx2x1 gen() 43643 MB/s Jan 13 20:40:28.992410 kernel: raid6: using algorithm avx2x2 gen() 52973 MB/s Jan 13 20:40:29.010426 kernel: raid6: .... xor() 31417 MB/s, rmw enabled Jan 13 20:40:29.010526 kernel: raid6: using avx2x2 recovery algorithm Jan 13 20:40:29.024176 kernel: xor: automatically using best checksumming function avx Jan 13 20:40:29.115183 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 20:40:29.120682 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:40:29.124252 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:40:29.132199 systemd-udevd[434]: Using default interface naming scheme 'v255'. Jan 13 20:40:29.134749 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:40:29.146314 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 20:40:29.153056 dracut-pre-trigger[439]: rd.md=0: removing MD RAID activation Jan 13 20:40:29.169049 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:40:29.172249 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 20:40:29.245312 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:40:29.252314 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 20:40:29.262982 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 20:40:29.263625 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 13 20:40:29.264379 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:40:29.264721 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 20:40:29.268287 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 13 20:40:29.277381 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:40:29.312190 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 13 20:40:29.322262 kernel: vmw_pvscsi: using 64bit dma Jan 13 20:40:29.326172 kernel: vmw_pvscsi: max_id: 16 Jan 13 20:40:29.326194 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 13 20:40:29.334766 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 13 20:40:29.334794 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 13 20:40:29.334803 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 13 20:40:29.334811 kernel: vmw_pvscsi: using MSI-X Jan 13 20:40:29.343111 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 13 20:40:29.343263 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 13 20:40:29.344576 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 13 20:40:29.354586 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 13 20:40:29.354685 kernel: cryptd: max_cpu_qlen set to 1000 Jan 13 20:40:29.354695 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 13 20:40:29.354770 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 13 20:40:29.349061 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:40:29.349098 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:40:29.349296 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:40:29.349388 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Jan 13 20:40:29.349414 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:40:29.349516 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:40:29.355275 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:40:29.362190 kernel: AVX2 version of gcm_enc/dec engaged. Jan 13 20:40:29.367216 kernel: AES CTR mode by8 optimization enabled Jan 13 20:40:29.367256 kernel: libata version 3.00 loaded. Jan 13 20:40:29.375217 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 13 20:40:29.425182 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 13 20:40:29.425268 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 13 20:40:29.425332 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 13 20:40:29.425400 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 13 20:40:29.425463 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 13 20:40:29.425538 kernel: scsi host1: ata_piix Jan 13 20:40:29.425603 kernel: scsi host2: ata_piix Jan 13 20:40:29.425663 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 13 20:40:29.425671 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 13 20:40:29.425679 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:40:29.425688 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 13 20:40:29.383886 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:40:29.388251 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:40:29.394749 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 13 20:40:29.553212 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 13 20:40:29.558174 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 13 20:40:29.585223 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 13 20:40:29.601375 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 13 20:40:29.601388 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 13 20:40:29.851194 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (486) Jan 13 20:40:29.854755 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 13 20:40:29.858526 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 13 20:40:29.861666 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 20:40:29.863455 kernel: BTRFS: device fsid 7f507843-6957-466b-8fb7-5bee228b170a devid 1 transid 44 /dev/sda3 scanned by (udev-worker) (481) Jan 13 20:40:29.867655 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 13 20:40:29.867772 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 13 20:40:29.878253 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 13 20:40:29.899185 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:40:29.905178 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:40:30.904182 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:40:30.904219 disk-uuid[595]: The operation has completed successfully. Jan 13 20:40:30.975642 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 20:40:30.975710 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 20:40:30.980263 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Jan 13 20:40:30.985943 sh[609]: Success Jan 13 20:40:31.000178 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 13 20:40:31.395350 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 20:40:31.406120 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 20:40:31.406651 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 20:40:31.439438 kernel: BTRFS info (device dm-0): first mount of filesystem 7f507843-6957-466b-8fb7-5bee228b170a Jan 13 20:40:31.439473 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:40:31.439488 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 20:40:31.440519 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 20:40:31.441313 kernel: BTRFS info (device dm-0): using free space tree Jan 13 20:40:31.526177 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 13 20:40:31.528787 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 13 20:40:31.537277 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 13 20:40:31.538564 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 13 20:40:31.614187 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:31.614236 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:40:31.616173 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:40:31.692178 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:40:31.707222 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 20:40:31.709192 kernel: BTRFS info (device sda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:31.724913 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Jan 13 20:40:31.728247 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 20:40:31.850122 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:40:31.857289 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 13 20:40:31.901382 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:40:31.906324 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 20:40:31.918467 systemd-networkd[799]: lo: Link UP Jan 13 20:40:31.918472 systemd-networkd[799]: lo: Gained carrier Jan 13 20:40:31.919393 systemd-networkd[799]: Enumeration completed Jan 13 20:40:31.919733 systemd-networkd[799]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jan 13 20:40:31.920000 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:40:31.920227 systemd[1]: Reached target network.target - Network. 
Jan 13 20:40:31.922207 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 20:40:31.922349 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 20:40:31.923366 systemd-networkd[799]: ens192: Link UP Jan 13 20:40:31.923372 systemd-networkd[799]: ens192: Gained carrier Jan 13 20:40:32.020716 ignition[671]: Ignition 2.20.0 Jan 13 20:40:32.020723 ignition[671]: Stage: fetch-offline Jan 13 20:40:32.020746 ignition[671]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:32.020752 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:32.020807 ignition[671]: parsed url from cmdline: "" Jan 13 20:40:32.020809 ignition[671]: no config URL provided Jan 13 20:40:32.020812 ignition[671]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 20:40:32.020816 ignition[671]: no config at "/usr/lib/ignition/user.ign" Jan 13 20:40:32.021249 ignition[671]: config successfully fetched Jan 13 20:40:32.021266 ignition[671]: parsing config with SHA512: 8f4577f0b94c1b1519d9a838ddd204133562f7b3d3034002ef3056cdc68b89395eb0a086d911ba8c930dec4216fd8106d97f38ddd3ea1b860e9d91974775ca2e Jan 13 20:40:32.024108 unknown[671]: fetched base config from "system" Jan 13 20:40:32.024117 unknown[671]: fetched user config from "vmware" Jan 13 20:40:32.024392 ignition[671]: fetch-offline: fetch-offline passed Jan 13 20:40:32.024438 ignition[671]: Ignition finished successfully Jan 13 20:40:32.025254 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:40:32.025464 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jan 13 20:40:32.030269 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Jan 13 20:40:32.039415 ignition[808]: Ignition 2.20.0 Jan 13 20:40:32.039425 ignition[808]: Stage: kargs Jan 13 20:40:32.039568 ignition[808]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:32.039577 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:32.040520 ignition[808]: kargs: kargs passed Jan 13 20:40:32.040561 ignition[808]: Ignition finished successfully Jan 13 20:40:32.041493 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 20:40:32.046357 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 20:40:32.055054 ignition[815]: Ignition 2.20.0 Jan 13 20:40:32.055065 ignition[815]: Stage: disks Jan 13 20:40:32.055228 ignition[815]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:32.055238 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:32.056060 ignition[815]: disks: disks passed Jan 13 20:40:32.056105 ignition[815]: Ignition finished successfully Jan 13 20:40:32.056801 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 20:40:32.057273 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 20:40:32.057380 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 20:40:32.057480 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:40:32.057564 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:40:32.057648 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:40:32.061295 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 20:40:32.223877 systemd-fsck[824]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 20:40:32.231736 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 20:40:32.239353 systemd[1]: Mounting sysroot.mount - /sysroot... 
Jan 13 20:40:32.430174 kernel: EXT4-fs (sda9): mounted filesystem 59ba8ffc-e6b0-4bb4-a36e-13a47bd6ad99 r/w with ordered data mode. Quota mode: none. Jan 13 20:40:32.430798 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 20:40:32.431313 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 20:40:32.439291 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:40:32.441240 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 20:40:32.442403 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 20:40:32.442441 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 20:40:32.442459 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:40:32.450567 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 20:40:32.451709 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 13 20:40:32.454179 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (832) Jan 13 20:40:32.457911 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:32.457956 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:40:32.457965 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:40:32.464390 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:40:32.465525 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 13 20:40:32.495512 initrd-setup-root[856]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 20:40:32.506650 initrd-setup-root[863]: cut: /sysroot/etc/group: No such file or directory Jan 13 20:40:32.513792 initrd-setup-root[870]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 20:40:32.527220 initrd-setup-root[877]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 20:40:33.010222 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 20:40:33.014243 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 20:40:33.016695 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 20:40:33.020759 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 13 20:40:33.022184 kernel: BTRFS info (device sda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:33.033644 ignition[944]: INFO : Ignition 2.20.0 Jan 13 20:40:33.033644 ignition[944]: INFO : Stage: mount Jan 13 20:40:33.034022 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:33.034022 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:33.034315 ignition[944]: INFO : mount: mount passed Jan 13 20:40:33.034315 ignition[944]: INFO : Ignition finished successfully Jan 13 20:40:33.034740 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 13 20:40:33.040265 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 13 20:40:33.045359 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:40:33.050871 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jan 13 20:40:33.059267 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (954)
Jan 13 20:40:33.061700 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968
Jan 13 20:40:33.061733 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:40:33.061742 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:40:33.068175 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:40:33.069366 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:40:33.089060 ignition[974]: INFO : Ignition 2.20.0
Jan 13 20:40:33.089060 ignition[974]: INFO : Stage: files
Jan 13 20:40:33.089797 ignition[974]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:40:33.089797 ignition[974]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:40:33.089797 ignition[974]: DEBUG : files: compiled without relabeling support, skipping
Jan 13 20:40:33.093010 ignition[974]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 13 20:40:33.093010 ignition[974]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 13 20:40:33.097284 ignition[974]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 13 20:40:33.097628 ignition[974]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 13 20:40:33.097942 ignition[974]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 13 20:40:33.097882 unknown[974]: wrote ssh authorized keys file for user: core
Jan 13 20:40:33.101137 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:40:33.101137 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jan 13 20:40:33.147991 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 13 20:40:33.222477 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:40:33.223044 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 13 20:40:33.223044 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 13 20:40:33.223044 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
Jan 13 20:40:33.538272 systemd-networkd[799]: ens192: Gained IPv6LL
Jan 13 20:40:33.742278 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 13 20:40:34.177321 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:40:34.397328 ignition[974]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:40:34.399850 ignition[974]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:40:34.399850 ignition[974]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:40:34.399850 ignition[974]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Jan 13 20:40:34.399850 ignition[974]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Jan 13 20:40:34.401079 ignition[974]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:40:34.401079 ignition[974]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:40:34.401079 ignition[974]: INFO : files: files passed
Jan 13 20:40:34.401079 ignition[974]: INFO : Ignition finished successfully
Jan 13 20:40:34.400983 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 13 20:40:34.406272 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
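Every Ignition action in the files stage is logged twice, as `op(N): [started]` and `op(N): [finished]`. A quick way to verify that each started operation also finished — a sketch written against the line format shown in this journal, not an official Ignition tool:

```python
import re

# Ignition op markers as they appear in this log, e.g.
#   op(3): [started] writing file "..."   /   op(3): [finished] writing file "..."
OP = re.compile(r"op\(([0-9a-f]+)\): \[(started|finished)\]")

def unfinished_ops(lines):
    """Return op ids that logged [started] but never [finished]."""
    started, finished = set(), set()
    for line in lines:
        for op_id, state in OP.findall(line):  # nested ops like op(c): op(d): both match
            (started if state == "started" else finished).add(op_id)
    return started - finished

# Illustrative lines in the journal's format (paths abbreviated)
lines = [
    'ignition[974]: INFO : files: op(3): [started] writing file "/sysroot/opt/helm.tar.gz"',
    'ignition[974]: INFO : files: op(3): [finished] writing file "/sysroot/opt/helm.tar.gz"',
    'ignition[974]: INFO : files: op(4): [started] writing file "/sysroot/home/core/install.sh"',
]
print(unfinished_ops(lines))  # {'4'}
```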
Jan 13 20:40:34.407396 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 13 20:40:34.408459 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 13 20:40:34.408649 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 13 20:40:34.415057 initrd-setup-root-after-ignition[1005]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:40:34.415057 initrd-setup-root-after-ignition[1005]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:40:34.416080 initrd-setup-root-after-ignition[1009]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:40:34.416871 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:40:34.417303 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 13 20:40:34.421269 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 13 20:40:34.440562 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 13 20:40:34.440824 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 13 20:40:34.441458 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 13 20:40:34.441709 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 13 20:40:34.441942 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 13 20:40:34.442620 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 13 20:40:34.452191 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:40:34.456263 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 13 20:40:34.462269 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:40:34.462466 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:40:34.462683 systemd[1]: Stopped target timers.target - Timer Units.
Jan 13 20:40:34.462871 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 13 20:40:34.462949 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:40:34.463326 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 13 20:40:34.463488 systemd[1]: Stopped target basic.target - Basic System.
Jan 13 20:40:34.463682 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 13 20:40:34.463887 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:40:34.464219 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 13 20:40:34.464614 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 13 20:40:34.464856 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:40:34.465099 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 13 20:40:34.465301 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 13 20:40:34.465518 systemd[1]: Stopped target swap.target - Swaps.
Jan 13 20:40:34.465662 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 13 20:40:34.465745 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:40:34.466120 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:40:34.466350 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:40:34.466543 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 13 20:40:34.466591 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:40:34.466766 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 13 20:40:34.466835 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:40:34.467134 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 13 20:40:34.467220 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:40:34.467494 systemd[1]: Stopped target paths.target - Path Units.
Jan 13 20:40:34.467631 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 13 20:40:34.471213 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:40:34.471461 systemd[1]: Stopped target slices.target - Slice Units.
Jan 13 20:40:34.471655 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 13 20:40:34.471827 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 13 20:40:34.471894 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 20:40:34.472092 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 13 20:40:34.472149 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 20:40:34.472352 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 13 20:40:34.472429 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:40:34.472705 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 13 20:40:34.472792 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 13 20:40:34.481601 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 13 20:40:34.481903 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 13 20:40:34.482266 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:40:34.484363 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 13 20:40:34.484613 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 13 20:40:34.484886 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:40:34.485299 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 13 20:40:34.485586 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:40:34.491210 ignition[1030]: INFO : Ignition 2.20.0
Jan 13 20:40:34.491210 ignition[1030]: INFO : Stage: umount
Jan 13 20:40:34.497735 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:40:34.497735 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:40:34.497735 ignition[1030]: INFO : umount: umount passed
Jan 13 20:40:34.497735 ignition[1030]: INFO : Ignition finished successfully
Jan 13 20:40:34.496351 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 13 20:40:34.496416 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 13 20:40:34.496804 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 13 20:40:34.496851 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 13 20:40:34.498342 systemd[1]: Stopped target network.target - Network.
Jan 13 20:40:34.499354 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 13 20:40:34.499414 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 13 20:40:34.499568 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 13 20:40:34.499594 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 13 20:40:34.499711 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 13 20:40:34.499742 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 13 20:40:34.499852 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 13 20:40:34.499873 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 13 20:40:34.500065 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 13 20:40:34.500231 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 13 20:40:34.506510 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 13 20:40:34.507963 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 13 20:40:34.508045 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 13 20:40:34.508936 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 13 20:40:34.508985 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:40:34.509975 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 13 20:40:34.510247 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 13 20:40:34.511013 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 13 20:40:34.511410 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:40:34.515308 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 13 20:40:34.515656 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 13 20:40:34.516030 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:40:34.516359 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jan 13 20:40:34.516392 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 20:40:34.516527 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 13 20:40:34.516557 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:40:34.516675 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 13 20:40:34.516704 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:40:34.516883 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
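The journal timestamps make it straightforward to measure how long each phase of this teardown took (for example, Ignition's files stage runs from 20:40:33.089060 to 20:40:34.401079). A sketch for diffing two timestamps in this log's short format — the year is omitted by the format, so both stamps are assumed to fall in the same year:

```python
from datetime import datetime

# journald short timestamp format as seen in this log: "Jan 13 20:40:34.407396"
FMT = "%b %d %H:%M:%S.%f"

def elapsed(start: str, end: str) -> float:
    """Seconds between two journal timestamps (same year assumed)."""
    t0 = datetime.strptime(start, FMT)
    t1 = datetime.strptime(end, FMT)
    return (t1 - t0).total_seconds()

# Duration of the Ignition files stage, per the records above
print(elapsed("Jan 13 20:40:33.089060", "Jan 13 20:40:34.401079"))  # 1.312019
```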
Jan 13 20:40:34.525448 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 13 20:40:34.525702 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 13 20:40:34.530771 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 13 20:40:34.530884 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:40:34.531348 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 13 20:40:34.531385 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:40:34.531617 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 13 20:40:34.531640 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:40:34.531795 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 13 20:40:34.531827 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:40:34.532112 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 13 20:40:34.532141 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:40:34.532434 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:40:34.532463 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:40:34.538310 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 13 20:40:34.538426 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 13 20:40:34.538461 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:40:34.538597 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:40:34.538619 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:40:34.542033 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 13 20:40:34.542105 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 13 20:40:34.622631 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 13 20:40:34.622698 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 13 20:40:34.623139 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 13 20:40:34.623280 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 13 20:40:34.623310 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 13 20:40:34.625298 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 13 20:40:34.633363 systemd[1]: Switching root.
Jan 13 20:40:34.668250 systemd-journald[216]: Journal stopped
Jan 13 20:40:28.747434 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 18:58:40 -00 2025
Jan 13 20:40:28.747451 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 20:40:28.747457 kernel: Disabled fast string operations
Jan 13 20:40:28.747462 kernel: BIOS-provided physical RAM map:
Jan 13 20:40:28.747466 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 13 20:40:28.747470 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 13 20:40:28.747476 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 13 20:40:28.747480 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 13 20:40:28.747485 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 13 20:40:28.747489 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 13 20:40:28.747493 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 13 20:40:28.747497 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 13 20:40:28.747502 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 13 20:40:28.747506 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 13 20:40:28.747512 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 13 20:40:28.747517 kernel: NX (Execute Disable) protection: active
Jan 13 20:40:28.747522 kernel: APIC: Static calls initialized
Jan 13 20:40:28.747527 kernel: SMBIOS 2.7 present.
Jan 13 20:40:28.747531 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 13 20:40:28.747536 kernel: vmware: hypercall mode: 0x00
Jan 13 20:40:28.747541 kernel: Hypervisor detected: VMware
Jan 13 20:40:28.747546 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 13 20:40:28.747552 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 13 20:40:28.747557 kernel: vmware: using clock offset of 6416934363 ns
Jan 13 20:40:28.747562 kernel: tsc: Detected 3408.000 MHz processor
Jan 13 20:40:28.747567 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 20:40:28.747572 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 20:40:28.747577 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 13 20:40:28.747582 kernel: total RAM covered: 3072M
Jan 13 20:40:28.747587 kernel: Found optimal setting for mtrr clean up
Jan 13 20:40:28.747592 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 13 20:40:28.747597 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 13 20:40:28.747603 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 20:40:28.747608 kernel: Using GB pages for direct mapping
Jan 13 20:40:28.747613 kernel: ACPI: Early table checksum verification disabled
Jan 13 20:40:28.747618 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 13 20:40:28.747623 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 13 20:40:28.747628 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 13 20:40:28.747633 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 13 20:40:28.747638 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:40:28.747646 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:40:28.747651 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 13 20:40:28.747656 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 13 20:40:28.747662 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 13 20:40:28.747667 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 13 20:40:28.747672 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 13 20:40:28.747678 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 13 20:40:28.747684 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 13 20:40:28.747689 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 13 20:40:28.747694 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:40:28.747699 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:40:28.747704 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 13 20:40:28.747709 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 13 20:40:28.747714 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 13 20:40:28.747719 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 13 20:40:28.747726 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 13 20:40:28.747731 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 13 20:40:28.747736 kernel: system APIC only can use physical flat
Jan 13 20:40:28.747741 kernel: APIC: Switched APIC routing to: physical flat
Jan 13 20:40:28.747747 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 20:40:28.747752 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 20:40:28.747757 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 20:40:28.747762 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 20:40:28.747767 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 20:40:28.747772 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 20:40:28.747778 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 20:40:28.747783 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 20:40:28.747788 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 13 20:40:28.747793 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 13 20:40:28.747798 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 13 20:40:28.747803 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 13 20:40:28.747808 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 13 20:40:28.747813 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 13 20:40:28.747818 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 13 20:40:28.747823 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 13 20:40:28.747829 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 13 20:40:28.747834 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 13 20:40:28.747840 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 13 20:40:28.747844 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 13 20:40:28.747849 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 13 20:40:28.747855 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 13 20:40:28.747860 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 13 20:40:28.747865 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 13 20:40:28.747870 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 13 20:40:28.747875 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 13 20:40:28.747881 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 13 20:40:28.747886 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 13 20:40:28.747891 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 13 20:40:28.747896 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 13 20:40:28.747901 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 13 20:40:28.747906 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 13 20:40:28.747911 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 13 20:40:28.747916 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 13 20:40:28.747921 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 13 20:40:28.747926 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 13 20:40:28.747932 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 13 20:40:28.747937 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 13 20:40:28.747942 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 13 20:40:28.747947 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 13 20:40:28.747952 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 13 20:40:28.747957 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 13 20:40:28.747962 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 13 20:40:28.747967 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 13 20:40:28.747972 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 13 20:40:28.747977 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 13 20:40:28.747983 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 13 20:40:28.747989 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 13 20:40:28.747993 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 13 20:40:28.747999 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 13 20:40:28.748003 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 13 20:40:28.748008 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 13 20:40:28.748014 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 13 20:40:28.748018 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 13 20:40:28.748024 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 13 20:40:28.748028 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 13 20:40:28.748035 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 13 20:40:28.748040 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 13 20:40:28.748045 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 13 20:40:28.748055 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 13 20:40:28.748060 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 13 20:40:28.748066 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 13 20:40:28.748071 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 13 20:40:28.748076 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 13 20:40:28.748082 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 13 20:40:28.748088 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 13 20:40:28.748094 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 13 20:40:28.748099 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 13 20:40:28.748105 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 13 20:40:28.748111 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 13 20:40:28.748116 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 13 20:40:28.748121 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 13 20:40:28.748127 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 13 20:40:28.748132 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 13 20:40:28.748138 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 13 20:40:28.748144 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 13 20:40:28.748149 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 13 20:40:28.748155 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 13 20:40:28.748166 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 13 20:40:28.748172 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 13 20:40:28.748177 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 13 20:40:28.748183 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 13 20:40:28.748188 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 13 20:40:28.748193 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 13 20:40:28.748199 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 13 20:40:28.748205 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 13 20:40:28.748211 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 13 20:40:28.748216 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 13 20:40:28.748221 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 13 20:40:28.748227 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 13 20:40:28.748232 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 13 20:40:28.748237 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 13 20:40:28.748243 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 13 20:40:28.748248 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 13 20:40:28.748253 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 13 20:40:28.748260 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 13 20:40:28.748265 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 13 20:40:28.748271 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 13 20:40:28.748276 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 13 20:40:28.748281 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 13 20:40:28.748286 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 13 20:40:28.748291 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 13 20:40:28.748297 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 13 20:40:28.748302 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 13 20:40:28.748308 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 13 20:40:28.748313 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 13 20:40:28.748319 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 13 20:40:28.748325 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 13 20:40:28.748330 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 13 20:40:28.748336 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 13 20:40:28.748341 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 13 20:40:28.748346 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 13 20:40:28.748351 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 13 20:40:28.748357 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 13 20:40:28.748362 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 13 20:40:28.748368 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 13 20:40:28.748374 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 13 20:40:28.748380 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 13 20:40:28.748385 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 13 20:40:28.748390 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 13 20:40:28.748396 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 13 20:40:28.748401 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 13 20:40:28.748406 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 13 20:40:28.748411 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 13 20:40:28.748417 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 13 20:40:28.748422 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 13 20:40:28.748429 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 13 20:40:28.748434 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 13 20:40:28.748440 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 13 20:40:28.748445 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 13 20:40:28.748451 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 13 20:40:28.748456 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 13 20:40:28.748462 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 13 20:40:28.748467 kernel: Zone ranges:
Jan 13 20:40:28.748473 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 13 20:40:28.748479 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 13 20:40:28.748485 kernel: Normal empty
Jan 13 20:40:28.748490 kernel: Movable zone start for each node
Jan 13 20:40:28.748496 kernel: Early memory node ranges
Jan 13 20:40:28.748501 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 13 20:40:28.748507 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 13 20:40:28.748512 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 13 20:40:28.748518 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 13 20:40:28.748523 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 13 20:40:28.748529 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 13 20:40:28.748535 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 13 20:40:28.748541 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 13 20:40:28.748546 kernel: system APIC only can use physical flat
Jan 13 20:40:28.748552 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 13 20:40:28.748557 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 13 20:40:28.748562 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jan 13 20:40:28.748568 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Jan 13 20:40:28.748573 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Jan 13 20:40:28.748579 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Jan 13 20:40:28.748584 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Jan 13 20:40:28.748591 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Jan 13 20:40:28.748596 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Jan 13 20:40:28.748601 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Jan 13 20:40:28.748607 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Jan 13 20:40:28.748612 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Jan 13 20:40:28.748618 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Jan 13 20:40:28.748623 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Jan 13 20:40:28.748629 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Jan 13 20:40:28.748634 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Jan 13 20:40:28.748640 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Jan 13 20:40:28.748646 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge
lint[0x1]) Jan 13 20:40:28.748651 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Jan 13 20:40:28.748657 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Jan 13 20:40:28.748662 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Jan 13 20:40:28.748667 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Jan 13 20:40:28.748673 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Jan 13 20:40:28.748678 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Jan 13 20:40:28.748683 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Jan 13 20:40:28.748689 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Jan 13 20:40:28.748696 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Jan 13 20:40:28.748701 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Jan 13 20:40:28.748707 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Jan 13 20:40:28.748712 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Jan 13 20:40:28.748717 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Jan 13 20:40:28.748723 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Jan 13 20:40:28.748728 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Jan 13 20:40:28.748733 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Jan 13 20:40:28.748739 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Jan 13 20:40:28.748745 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Jan 13 20:40:28.748751 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Jan 13 20:40:28.748756 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Jan 13 20:40:28.748762 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Jan 13 20:40:28.748767 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Jan 13 20:40:28.748772 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Jan 13 20:40:28.748778 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge 
lint[0x1]) Jan 13 20:40:28.748784 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Jan 13 20:40:28.748789 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Jan 13 20:40:28.748795 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Jan 13 20:40:28.748801 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Jan 13 20:40:28.748807 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Jan 13 20:40:28.748812 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Jan 13 20:40:28.748817 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Jan 13 20:40:28.748823 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Jan 13 20:40:28.748828 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Jan 13 20:40:28.748834 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Jan 13 20:40:28.748839 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Jan 13 20:40:28.748845 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Jan 13 20:40:28.748851 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Jan 13 20:40:28.748857 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Jan 13 20:40:28.748862 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Jan 13 20:40:28.748867 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Jan 13 20:40:28.748873 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Jan 13 20:40:28.748878 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Jan 13 20:40:28.748884 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Jan 13 20:40:28.748889 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Jan 13 20:40:28.748894 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Jan 13 20:40:28.748901 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Jan 13 20:40:28.748906 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Jan 13 20:40:28.748912 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge 
lint[0x1]) Jan 13 20:40:28.748917 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Jan 13 20:40:28.748923 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Jan 13 20:40:28.748928 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Jan 13 20:40:28.748933 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Jan 13 20:40:28.748939 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Jan 13 20:40:28.748945 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Jan 13 20:40:28.748950 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Jan 13 20:40:28.748957 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Jan 13 20:40:28.748962 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Jan 13 20:40:28.748968 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Jan 13 20:40:28.748973 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Jan 13 20:40:28.748978 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Jan 13 20:40:28.748984 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Jan 13 20:40:28.748989 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Jan 13 20:40:28.748995 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Jan 13 20:40:28.749000 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Jan 13 20:40:28.749007 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Jan 13 20:40:28.749012 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Jan 13 20:40:28.749017 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Jan 13 20:40:28.749023 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Jan 13 20:40:28.749028 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Jan 13 20:40:28.749034 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Jan 13 20:40:28.749039 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Jan 13 20:40:28.749045 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge 
lint[0x1]) Jan 13 20:40:28.749050 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Jan 13 20:40:28.749055 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Jan 13 20:40:28.749062 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Jan 13 20:40:28.749068 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Jan 13 20:40:28.749073 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Jan 13 20:40:28.749078 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Jan 13 20:40:28.749084 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Jan 13 20:40:28.749090 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Jan 13 20:40:28.749095 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Jan 13 20:40:28.749100 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Jan 13 20:40:28.749106 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Jan 13 20:40:28.749112 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Jan 13 20:40:28.749118 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Jan 13 20:40:28.749123 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Jan 13 20:40:28.749128 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Jan 13 20:40:28.749134 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Jan 13 20:40:28.749139 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Jan 13 20:40:28.749145 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Jan 13 20:40:28.749151 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Jan 13 20:40:28.749156 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Jan 13 20:40:28.749170 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Jan 13 20:40:28.749177 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Jan 13 20:40:28.749183 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Jan 13 20:40:28.749188 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge 
lint[0x1]) Jan 13 20:40:28.749194 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Jan 13 20:40:28.749199 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Jan 13 20:40:28.749204 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Jan 13 20:40:28.749210 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Jan 13 20:40:28.749215 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Jan 13 20:40:28.749221 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Jan 13 20:40:28.749227 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Jan 13 20:40:28.749233 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Jan 13 20:40:28.749238 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Jan 13 20:40:28.749244 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Jan 13 20:40:28.749249 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Jan 13 20:40:28.749255 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Jan 13 20:40:28.749260 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Jan 13 20:40:28.749265 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Jan 13 20:40:28.749271 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Jan 13 20:40:28.749276 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Jan 13 20:40:28.749283 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 13 20:40:28.749289 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Jan 13 20:40:28.749295 kernel: TSC deadline timer available Jan 13 20:40:28.749300 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Jan 13 20:40:28.749306 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Jan 13 20:40:28.749312 kernel: Booting paravirtualized kernel on VMware hypervisor Jan 13 20:40:28.749317 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 13 20:40:28.749323 kernel: 
setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Jan 13 20:40:28.749328 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Jan 13 20:40:28.749335 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Jan 13 20:40:28.749341 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Jan 13 20:40:28.749346 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Jan 13 20:40:28.749351 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Jan 13 20:40:28.749357 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Jan 13 20:40:28.749362 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Jan 13 20:40:28.749375 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Jan 13 20:40:28.749382 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Jan 13 20:40:28.749388 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Jan 13 20:40:28.749395 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Jan 13 20:40:28.749400 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Jan 13 20:40:28.749406 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Jan 13 20:40:28.749412 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Jan 13 20:40:28.749417 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Jan 13 20:40:28.749423 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Jan 13 20:40:28.749429 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Jan 13 20:40:28.749434 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Jan 13 20:40:28.749442 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5 Jan 13 20:40:28.749448 kernel: Unknown 
kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Jan 13 20:40:28.749454 kernel: random: crng init done Jan 13 20:40:28.749459 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Jan 13 20:40:28.749465 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Jan 13 20:40:28.749471 kernel: printk: log_buf_len min size: 262144 bytes Jan 13 20:40:28.749477 kernel: printk: log_buf_len: 1048576 bytes Jan 13 20:40:28.749483 kernel: printk: early log buf free: 239648(91%) Jan 13 20:40:28.749490 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 13 20:40:28.749496 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 13 20:40:28.749501 kernel: Fallback order for Node 0: 0 Jan 13 20:40:28.749507 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Jan 13 20:40:28.749513 kernel: Policy zone: DMA32 Jan 13 20:40:28.749519 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 13 20:40:28.749525 kernel: Memory: 1934320K/2096628K available (14336K kernel code, 2299K rwdata, 22800K rodata, 43320K init, 1756K bss, 162048K reserved, 0K cma-reserved) Jan 13 20:40:28.749532 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Jan 13 20:40:28.749538 kernel: ftrace: allocating 37890 entries in 149 pages Jan 13 20:40:28.749543 kernel: ftrace: allocated 149 pages with 4 groups Jan 13 20:40:28.749549 kernel: Dynamic Preempt: voluntary Jan 13 20:40:28.749555 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 13 20:40:28.749561 kernel: rcu: RCU event tracing is enabled. Jan 13 20:40:28.749567 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Jan 13 20:40:28.749573 kernel: Trampoline variant of Tasks RCU enabled. Jan 13 20:40:28.749580 kernel: Rude variant of Tasks RCU enabled. Jan 13 20:40:28.749586 kernel: Tracing variant of Tasks RCU enabled. 
Jan 13 20:40:28.749592 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 13 20:40:28.749598 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Jan 13 20:40:28.749603 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Jan 13 20:40:28.749609 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Jan 13 20:40:28.749615 kernel: Console: colour VGA+ 80x25 Jan 13 20:40:28.749621 kernel: printk: console [tty0] enabled Jan 13 20:40:28.749627 kernel: printk: console [ttyS0] enabled Jan 13 20:40:28.749632 kernel: ACPI: Core revision 20230628 Jan 13 20:40:28.749640 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Jan 13 20:40:28.749646 kernel: APIC: Switch to symmetric I/O mode setup Jan 13 20:40:28.749652 kernel: x2apic enabled Jan 13 20:40:28.749657 kernel: APIC: Switched APIC routing to: physical x2apic Jan 13 20:40:28.749663 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 13 20:40:28.749669 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:40:28.749675 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Jan 13 20:40:28.749681 kernel: Disabled fast string operations Jan 13 20:40:28.749687 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Jan 13 20:40:28.749694 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Jan 13 20:40:28.749700 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 13 20:40:28.749706 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Jan 13 20:40:28.749712 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Jan 13 20:40:28.749719 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 13 20:40:28.749725 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Jan 13 20:40:28.749731 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 13 20:40:28.749736 kernel: RETBleed: Mitigation: Enhanced IBRS Jan 13 20:40:28.749742 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 13 20:40:28.749750 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 13 20:40:28.749756 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Jan 13 20:40:28.749762 kernel: SRBDS: Unknown: Dependent on hypervisor status Jan 13 20:40:28.749768 kernel: GDS: Unknown: Dependent on hypervisor status Jan 13 20:40:28.749774 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 13 20:40:28.749779 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 13 20:40:28.749785 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 13 20:40:28.749791 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 13 20:40:28.749798 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
Jan 13 20:40:28.749804 kernel: Freeing SMP alternatives memory: 32K Jan 13 20:40:28.749810 kernel: pid_max: default: 131072 minimum: 1024 Jan 13 20:40:28.749816 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Jan 13 20:40:28.749822 kernel: landlock: Up and running. Jan 13 20:40:28.749828 kernel: SELinux: Initializing. Jan 13 20:40:28.749834 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:40:28.749840 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:40:28.749845 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Jan 13 20:40:28.749853 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:40:28.749859 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:40:28.749864 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:40:28.749870 kernel: Performance Events: Skylake events, core PMU driver. Jan 13 20:40:28.749876 kernel: core: CPUID marked event: 'cpu cycles' unavailable Jan 13 20:40:28.749882 kernel: core: CPUID marked event: 'instructions' unavailable Jan 13 20:40:28.749888 kernel: core: CPUID marked event: 'bus cycles' unavailable Jan 13 20:40:28.749893 kernel: core: CPUID marked event: 'cache references' unavailable Jan 13 20:40:28.749899 kernel: core: CPUID marked event: 'cache misses' unavailable Jan 13 20:40:28.749906 kernel: core: CPUID marked event: 'branch instructions' unavailable Jan 13 20:40:28.749912 kernel: core: CPUID marked event: 'branch misses' unavailable Jan 13 20:40:28.749917 kernel: ... version: 1 Jan 13 20:40:28.749923 kernel: ... bit width: 48 Jan 13 20:40:28.749929 kernel: ... generic registers: 4 Jan 13 20:40:28.749935 kernel: ... value mask: 0000ffffffffffff Jan 13 20:40:28.749940 kernel: ... 
max period: 000000007fffffff Jan 13 20:40:28.749946 kernel: ... fixed-purpose events: 0 Jan 13 20:40:28.749952 kernel: ... event mask: 000000000000000f Jan 13 20:40:28.749959 kernel: signal: max sigframe size: 1776 Jan 13 20:40:28.749965 kernel: rcu: Hierarchical SRCU implementation. Jan 13 20:40:28.749971 kernel: rcu: Max phase no-delay instances is 400. Jan 13 20:40:28.749976 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 20:40:28.749982 kernel: smp: Bringing up secondary CPUs ... Jan 13 20:40:28.749988 kernel: smpboot: x86: Booting SMP configuration: Jan 13 20:40:28.749994 kernel: .... node #0, CPUs: #1 Jan 13 20:40:28.749999 kernel: Disabled fast string operations Jan 13 20:40:28.750005 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 20:40:28.750012 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 20:40:28.750018 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 20:40:28.750024 kernel: smpboot: Max logical packages: 128 Jan 13 20:40:28.750029 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 20:40:28.750035 kernel: devtmpfs: initialized Jan 13 20:40:28.750041 kernel: x86/mm: Memory block size: 128MB Jan 13 20:40:28.750047 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 20:40:28.750053 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 20:40:28.750059 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 20:40:28.750064 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 20:40:28.750071 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 20:40:28.750077 kernel: audit: initializing netlink subsys (disabled) Jan 13 20:40:28.750083 kernel: audit: type=2000 audit(1736800827.073:1): state=initialized audit_enabled=0 res=1 Jan 13 20:40:28.750089 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 20:40:28.750095 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 20:40:28.750100 kernel: cpuidle: using governor menu Jan 13 20:40:28.750106 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 20:40:28.750112 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 20:40:28.750118 kernel: dca service started, version 1.12.1 Jan 13 20:40:28.750125 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 20:40:28.750131 kernel: PCI: Using configuration type 1 for base access Jan 13 20:40:28.750137 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 20:40:28.750143 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 20:40:28.750148 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 20:40:28.751011 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 20:40:28.751022 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 20:40:28.751028 kernel: ACPI: Added _OSI(Module Device) Jan 13 20:40:28.751036 kernel: ACPI: Added _OSI(Processor Device) Jan 13 20:40:28.751042 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 20:40:28.751048 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 20:40:28.751054 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 20:40:28.751059 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 20:40:28.751065 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 20:40:28.751071 kernel: ACPI: Interpreter enabled Jan 13 20:40:28.751077 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 20:40:28.751083 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 20:40:28.751089 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 20:40:28.751096 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 20:40:28.751102 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 20:40:28.751108 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 20:40:28.751220 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 20:40:28.751303 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 20:40:28.751355 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 20:40:28.751364 kernel: PCI host bridge to bus 0000:00 Jan 13 20:40:28.751417 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 20:40:28.751463 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 20:40:28.751507 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 20:40:28.751551 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 20:40:28.751595 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 20:40:28.751639 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 20:40:28.751698 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 20:40:28.751757 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 20:40:28.751811 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 20:40:28.751868 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 20:40:28.751919 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 20:40:28.752031 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 20:40:28.752107 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 20:40:28.752220 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 20:40:28.752275 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 20:40:28.752329 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 20:40:28.752379 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 20:40:28.752429 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 20:40:28.752482 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 20:40:28.752535 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 20:40:28.752585 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 20:40:28.752638 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 20:40:28.752688 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 20:40:28.752737 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 20:40:28.752785 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 20:40:28.752859 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 20:40:28.752918 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 20:40:28.752972 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 20:40:28.753029 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753083 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753138 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753201 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753263 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753318 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753385 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753494 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753836 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.753891 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.753950 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754001 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 13 20:40:28.754056 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754107 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754197 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754255 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754310 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754365 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754419 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754469 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754523 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754576 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754649 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754699 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754752 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754802 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754854 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.754904 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.754961 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755010 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755062 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755112 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755217 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755274 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755346 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755395 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755448 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755516 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755585 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755635 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755690 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755741 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755795 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755846 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.755899 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.755949 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756001 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756054 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756107 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756175 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756263 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756315 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756370 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756424 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756479 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756531 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756586 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756637 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756690 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 13 
20:40:28.756744 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756799 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756849 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.756903 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:40:28.756954 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 13 20:40:28.757006 kernel: pci_bus 0000:01: extended config space not accessible Jan 13 20:40:28.757061 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:40:28.757113 kernel: pci_bus 0000:02: extended config space not accessible Jan 13 20:40:28.757123 kernel: acpiphp: Slot [32] registered Jan 13 20:40:28.757129 kernel: acpiphp: Slot [33] registered Jan 13 20:40:28.757135 kernel: acpiphp: Slot [34] registered Jan 13 20:40:28.757141 kernel: acpiphp: Slot [35] registered Jan 13 20:40:28.757147 kernel: acpiphp: Slot [36] registered Jan 13 20:40:28.757202 kernel: acpiphp: Slot [37] registered Jan 13 20:40:28.757208 kernel: acpiphp: Slot [38] registered Jan 13 20:40:28.757216 kernel: acpiphp: Slot [39] registered Jan 13 20:40:28.757222 kernel: acpiphp: Slot [40] registered Jan 13 20:40:28.757228 kernel: acpiphp: Slot [41] registered Jan 13 20:40:28.757234 kernel: acpiphp: Slot [42] registered Jan 13 20:40:28.757239 kernel: acpiphp: Slot [43] registered Jan 13 20:40:28.757245 kernel: acpiphp: Slot [44] registered Jan 13 20:40:28.757251 kernel: acpiphp: Slot [45] registered Jan 13 20:40:28.757257 kernel: acpiphp: Slot [46] registered Jan 13 20:40:28.757263 kernel: acpiphp: Slot [47] registered Jan 13 20:40:28.757270 kernel: acpiphp: Slot [48] registered Jan 13 20:40:28.757275 kernel: acpiphp: Slot [49] registered Jan 13 20:40:28.757281 kernel: acpiphp: Slot [50] registered Jan 13 20:40:28.757287 kernel: acpiphp: Slot [51] registered Jan 13 20:40:28.757292 kernel: acpiphp: Slot [52] registered Jan 13 20:40:28.757298 kernel: acpiphp: Slot [53] registered 
Jan 13 20:40:28.757304 kernel: acpiphp: Slot [54] registered Jan 13 20:40:28.757310 kernel: acpiphp: Slot [55] registered Jan 13 20:40:28.757315 kernel: acpiphp: Slot [56] registered Jan 13 20:40:28.757321 kernel: acpiphp: Slot [57] registered Jan 13 20:40:28.757328 kernel: acpiphp: Slot [58] registered Jan 13 20:40:28.757334 kernel: acpiphp: Slot [59] registered Jan 13 20:40:28.757340 kernel: acpiphp: Slot [60] registered Jan 13 20:40:28.757346 kernel: acpiphp: Slot [61] registered Jan 13 20:40:28.757352 kernel: acpiphp: Slot [62] registered Jan 13 20:40:28.757358 kernel: acpiphp: Slot [63] registered Jan 13 20:40:28.757410 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 13 20:40:28.757461 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:40:28.757509 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:40:28.757562 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:40:28.757611 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 13 20:40:28.757660 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 13 20:40:28.757709 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 13 20:40:28.757757 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 13 20:40:28.757806 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 13 20:40:28.757862 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 13 20:40:28.757917 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 13 20:40:28.757968 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 13 20:40:28.758019 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:40:28.758070 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 13 
20:40:28.758121 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 13 20:40:28.758184 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:40:28.758238 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:40:28.758292 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:40:28.758343 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:40:28.758393 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:40:28.758443 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:40:28.758493 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:40:28.758544 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:40:28.758593 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:40:28.758643 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:40:28.758695 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:40:28.758746 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:40:28.758795 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:40:28.758844 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:40:28.758896 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:40:28.758946 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:40:28.758996 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:40:28.759050 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:40:28.759100 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:40:28.759150 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:40:28.759226 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:40:28.759278 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 13 20:40:28.759331 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:40:28.759381 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:40:28.759431 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:40:28.759482 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:40:28.759538 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 13 20:40:28.759594 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 13 20:40:28.759645 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 13 20:40:28.759696 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 13 20:40:28.759751 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 13 20:40:28.759802 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:40:28.759855 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 13 20:40:28.759906 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 13 20:40:28.759957 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 13 20:40:28.760009 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:40:28.760060 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:40:28.760113 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 13 20:40:28.760617 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:40:28.760678 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:40:28.760732 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:40:28.760783 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:40:28.760837 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:40:28.760887 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:40:28.760937 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:40:28.760991 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:40:28.761044 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:40:28.761094 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:40:28.761143 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:40:28.762294 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:40:28.762355 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:40:28.762408 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:40:28.762462 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:40:28.762518 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:40:28.762568 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:40:28.762620 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:40:28.762671 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:40:28.762721 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:40:28.762772 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:40:28.762822 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:40:28.762871 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:40:28.762927 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:40:28.762977 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:40:28.763028 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:40:28.763077 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:40:28.763129 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:40:28.764210 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:40:28.764270 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:40:28.764323 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:40:28.764380 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:40:28.764432 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:40:28.764484 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:40:28.764534 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:40:28.764587 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:40:28.764638 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:40:28.764688 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:40:28.764743 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:40:28.764794 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:40:28.764844 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:40:28.764897 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:40:28.764947 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:40:28.764998 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:40:28.765051 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:40:28.765101 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:40:28.765153 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:40:28.766283 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:40:28.766341 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:40:28.766393 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:40:28.766447 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:40:28.766497 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:40:28.766546 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:40:28.766595 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:40:28.766651 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:40:28.766701 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:40:28.766750 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:40:28.766800 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:40:28.766851 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:40:28.766901 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:40:28.766951 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:40:28.767002 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:40:28.767054 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:40:28.767103 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:40:28.769738 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 
20:40:28.769808 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:40:28.769863 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:40:28.769917 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:40:28.769968 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:40:28.770018 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:40:28.770075 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:40:28.770126 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:40:28.770598 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:40:28.770656 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:40:28.770707 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:40:28.770757 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:40:28.770766 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jan 13 20:40:28.770772 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jan 13 20:40:28.770781 kernel: ACPI: PCI: Interrupt link LNKB disabled Jan 13 20:40:28.770787 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 13 20:40:28.770793 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jan 13 20:40:28.770798 kernel: iommu: Default domain type: Translated Jan 13 20:40:28.770805 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 13 20:40:28.770811 kernel: PCI: Using ACPI for IRQ routing Jan 13 20:40:28.770817 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 13 20:40:28.770823 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jan 13 20:40:28.770829 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jan 13 20:40:28.770879 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jan 13 20:40:28.770930 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Jan 13 20:40:28.770979 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 13 20:40:28.770988 kernel: vgaarb: loaded Jan 13 20:40:28.770994 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jan 13 20:40:28.771001 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jan 13 20:40:28.771006 kernel: clocksource: Switched to clocksource tsc-early Jan 13 20:40:28.771012 kernel: VFS: Disk quotas dquot_6.6.0 Jan 13 20:40:28.771018 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 13 20:40:28.771026 kernel: pnp: PnP ACPI init Jan 13 20:40:28.773212 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jan 13 20:40:28.773263 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jan 13 20:40:28.773309 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jan 13 20:40:28.773372 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jan 13 20:40:28.773418 kernel: pnp 00:06: [dma 2] Jan 13 20:40:28.773466 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jan 13 20:40:28.773514 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jan 13 20:40:28.773557 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jan 13 20:40:28.773565 kernel: pnp: PnP ACPI: found 8 devices Jan 13 20:40:28.773572 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 13 20:40:28.773578 kernel: NET: Registered PF_INET protocol family Jan 13 20:40:28.773583 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 13 20:40:28.773589 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jan 13 20:40:28.773597 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 13 20:40:28.773603 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 13 20:40:28.773609 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jan 13 20:40:28.773615 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jan 13 20:40:28.773620 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:40:28.773626 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:40:28.773632 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 13 20:40:28.773638 kernel: NET: Registered PF_XDP protocol family Jan 13 20:40:28.773691 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jan 13 20:40:28.773745 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 13 20:40:28.773797 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 13 20:40:28.773848 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 13 20:40:28.773898 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 13 20:40:28.773948 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 13 20:40:28.773999 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 13 20:40:28.774052 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 13 20:40:28.774102 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 13 20:40:28.774151 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 13 20:40:28.774236 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 13 20:40:28.774312 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 13 20:40:28.774425 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 13 
20:40:28.774483 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 13 20:40:28.774536 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 13 20:40:28.774589 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 13 20:40:28.774641 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 13 20:40:28.774693 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 13 20:40:28.774748 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 13 20:40:28.774801 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 13 20:40:28.774853 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 13 20:40:28.774905 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 13 20:40:28.774957 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jan 13 20:40:28.775008 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:40:28.775063 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:40:28.775115 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.775175 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.775228 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.775280 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.775332 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.775383 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.775435 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.775490 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 
13 20:40:28.775543 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.775594 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.775646 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.775697 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.775750 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.775801 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.775853 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.775907 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.775958 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776009 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776059 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776111 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776190 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776261 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776313 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776367 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776417 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776467 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776516 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776566 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776616 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776666 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776716 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776768 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776817 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776867 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.776917 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.776966 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777016 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.777066 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777115 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.777254 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777310 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.777359 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777409 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.777458 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777507 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.777556 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777604 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.777653 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777702 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.777754 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777803 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.777851 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777900 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Jan 13 20:40:28.777950 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.777999 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778047 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778097 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778146 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778246 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778296 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778345 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778394 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778443 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778491 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778540 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778588 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778637 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778690 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778740 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778789 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778837 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778887 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.778936 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.778985 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.779034 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.779084 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.779133 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.779207 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.779259 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.779308 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.779358 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.779408 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.779458 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:40:28.779507 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:40:28.779558 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:40:28.779610 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jan 13 20:40:28.779663 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:40:28.779713 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:40:28.779763 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:40:28.779818 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Jan 13 20:40:28.779869 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:40:28.779919 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:40:28.779970 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:40:28.780019 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:40:28.780073 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:40:28.780124 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:40:28.780232 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:40:28.780294 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 
20:40:28.780346 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:40:28.780395 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:40:28.780444 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:40:28.780494 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:40:28.780543 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:40:28.780592 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:40:28.780645 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:40:28.780694 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:40:28.780743 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:40:28.780793 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:40:28.780845 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:40:28.780894 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:40:28.780946 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:40:28.780996 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:40:28.781045 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 13 20:40:28.781094 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:40:28.781144 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:40:28.781245 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:40:28.781295 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:40:28.781349 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 13 20:40:28.781399 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:40:28.781451 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:40:28.781501 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 13 20:40:28.781551 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:40:28.781624 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:40:28.781677 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:40:28.781726 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:40:28.781776 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:40:28.781826 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:40:28.781877 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:40:28.781930 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:40:28.781980 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:40:28.782029 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:40:28.782080 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:40:28.782130 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:40:28.782230 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:40:28.782280 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:40:28.782329 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:40:28.782378 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:40:28.782428 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:40:28.782481 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:40:28.782530 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:40:28.782580 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:40:28.782630 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:40:28.782680 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:40:28.782729 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:40:28.782779 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:40:28.782831 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:40:28.782881 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:40:28.782934 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:40:28.782983 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:40:28.783034 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:40:28.783083 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:40:28.783133 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:40:28.783193 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:40:28.783245 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:40:28.783299 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:40:28.783349 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:40:28.783399 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:40:28.783452 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:40:28.783502 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:40:28.783551 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:40:28.783601 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:40:28.783651 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:40:28.783700 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:40:28.783764 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:40:28.783812 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:40:28.783861 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
20:40:28.783912 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:40:28.783961 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:40:28.784009 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:40:28.784058 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:40:28.784107 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:40:28.784155 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:40:28.784213 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:40:28.784262 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:40:28.784319 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:40:28.784370 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:40:28.784423 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:40:28.784471 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:40:28.784520 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:40:28.784570 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:40:28.784619 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:40:28.784667 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:40:28.784716 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:40:28.784765 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:40:28.784813 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:40:28.784865 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:40:28.784914 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 20:40:28.784962 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:40:28.785011 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 20:40:28.785096 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:40:28.785145 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:40:28.785218 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:40:28.785268 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:40:28.785317 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:40:28.785366 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:40:28.785418 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:40:28.785466 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:40:28.785515 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:40:28.785564 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:40:28.785608 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:40:28.785651 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:40:28.785694 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:40:28.785737 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:40:28.785789 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 20:40:28.785835 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 20:40:28.785881 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:40:28.785925 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:40:28.785970 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:40:28.786015 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:40:28.786077 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:40:28.786139 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:40:28.786215 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 20:40:28.786264 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 20:40:28.786319 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:40:28.786369 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 20:40:28.786415 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 20:40:28.786459 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:40:28.786510 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 20:40:28.786556 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 20:40:28.786601 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:40:28.786650 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 20:40:28.786696 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:40:28.786745 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 20:40:28.786791 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:40:28.786842 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 20:40:28.786887 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:40:28.786936 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 20:40:28.786982 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:40:28.787034 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 20:40:28.787089 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:40:28.787142 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 20:40:28.787253 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 20:40:28.787300 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:40:28.787352 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 20:40:28.787398 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 20:40:28.787444 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:40:28.787496 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 20:40:28.787543 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 20:40:28.787591 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:40:28.787641 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 20:40:28.787697 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:40:28.787749 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 20:40:28.787799 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:40:28.787849 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 20:40:28.787896 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:40:28.787945 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 20:40:28.787992 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:40:28.788042 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 20:40:28.788089 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:40:28.788143 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 20:40:28.788251 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 20:40:28.788298 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:40:28.788350 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 20:40:28.788399 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 20:40:28.788445 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:40:28.788498 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 20:40:28.788544 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 20:40:28.788589 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:40:28.788639 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 20:40:28.788686 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:40:28.788735 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 20:40:28.788781 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:40:28.788833 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 20:40:28.788879 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:40:28.788929 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 20:40:28.788976 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:40:28.789026 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 20:40:28.789072 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:40:28.789126 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 20:40:28.789198 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 20:40:28.789247 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:40:28.789300 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 20:40:28.789347 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 20:40:28.789393 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:40:28.789446 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 20:40:28.789493 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:40:28.789556 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 20:40:28.789602 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:40:28.789651 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 20:40:28.789697 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:40:28.789748 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 20:40:28.789794 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:40:28.789842 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 20:40:28.789888 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:40:28.789936 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 20:40:28.789982 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:40:28.790037 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 20:40:28.790047 kernel: PCI: CLS 32 bytes, default 64 Jan 13 20:40:28.790054 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 20:40:28.790060 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:40:28.790067 kernel: clocksource: Switched to clocksource tsc Jan 13 20:40:28.790073 kernel: Initialise system trusted keyrings Jan 13 20:40:28.790080 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 20:40:28.790086 kernel: Key type asymmetric registered Jan 13 20:40:28.790092 kernel: Asymmetric key parser 'x509' registered Jan 13 20:40:28.790100 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 20:40:28.790106 kernel: io scheduler mq-deadline registered Jan 13 20:40:28.790112 kernel: io scheduler kyber registered Jan 13 20:40:28.790118 kernel: io scheduler bfq registered Jan 13 20:40:28.790198 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 20:40:28.790252 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.790303 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 20:40:28.790353 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.790424 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 20:40:28.790475 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.790540 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 20:40:28.790591 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.790641 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 20:40:28.790690 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.790743 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 20:40:28.790793 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.790843 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 20:40:28.790893 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.790942 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 20:40:28.790994 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791044 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 20:40:28.791094 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791144 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 20:40:28.791225 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791276 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 20:40:28.791326 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791379 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 20:40:28.791428 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791496 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 20:40:28.791546 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791595 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 20:40:28.791646 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791699 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 20:40:28.791749 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791800 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 20:40:28.791851 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.791902 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 20:40:28.791955 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792006 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 20:40:28.792056 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792106 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 20:40:28.792156 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792241 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 20:40:28.792292 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792346 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 20:40:28.792396 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792449 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 20:40:28.792500 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792550 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 20:40:28.792604 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792654 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 20:40:28.792705 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792756 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 20:40:28.792806 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.792857 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 20:40:28.792940 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.793006 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 20:40:28.793056 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.793106 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 20:40:28.793188 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.793247 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 20:40:28.793299 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.793348 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 20:40:28.793399 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.793448 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 20:40:28.793497 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.793548 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 20:40:28.793597 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:40:28.793606 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 20:40:28.793613 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 20:40:28.793619 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 20:40:28.793626 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 20:40:28.793632 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 20:40:28.793640 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 20:40:28.793691 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 20:40:28.793737 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T20:40:28 UTC (1736800828) Jan 13 20:40:28.793783 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 20:40:28.793792 kernel: intel_pstate: CPU model not supported Jan 13 20:40:28.793798 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 20:40:28.793804 kernel: NET: Registered PF_INET6 protocol family Jan 13 20:40:28.793811 kernel: Segment Routing with IPv6 Jan 13 20:40:28.793817 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 20:40:28.793825 kernel: NET: Registered PF_PACKET protocol family Jan 13 20:40:28.793831 kernel: Key type dns_resolver registered Jan 13 20:40:28.793837 kernel: IPI shorthand broadcast: enabled Jan 13 20:40:28.793844 kernel: sched_clock: Marking stable (923277504, 247162490)->(1236792580, -66352586) Jan 13 20:40:28.793850 kernel: registered taskstats version 1 Jan 13 20:40:28.793856 kernel: Loading compiled-in X.509 certificates Jan 13 20:40:28.793862 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: ede78b3e719729f95eaaf7cb6a5289b567f6ee3e' Jan 13 20:40:28.793868 kernel: Key type .fscrypt registered Jan 13 20:40:28.793874 kernel: Key type fscrypt-provisioning registered Jan 13 20:40:28.793881 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 20:40:28.793888 kernel: ima: Allocated hash algorithm: sha1 Jan 13 20:40:28.793894 kernel: ima: No architecture policies found Jan 13 20:40:28.793900 kernel: clk: Disabling unused clocks Jan 13 20:40:28.793906 kernel: Freeing unused kernel image (initmem) memory: 43320K Jan 13 20:40:28.793912 kernel: Write protecting the kernel read-only data: 38912k Jan 13 20:40:28.793918 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K Jan 13 20:40:28.793924 kernel: Run /init as init process Jan 13 20:40:28.793931 kernel: with arguments: Jan 13 20:40:28.793938 kernel: /init Jan 13 20:40:28.793944 kernel: with environment: Jan 13 20:40:28.793949 kernel: HOME=/ Jan 13 20:40:28.793955 kernel: TERM=linux Jan 13 20:40:28.793961 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 20:40:28.793969 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:40:28.793977 systemd[1]: Detected virtualization vmware. Jan 13 20:40:28.793986 systemd[1]: Detected architecture x86-64. Jan 13 20:40:28.793992 systemd[1]: Running in initrd. Jan 13 20:40:28.793998 systemd[1]: No hostname configured, using default hostname. Jan 13 20:40:28.794004 systemd[1]: Hostname set to . Jan 13 20:40:28.794010 systemd[1]: Initializing machine ID from random generator. Jan 13 20:40:28.794016 systemd[1]: Queued start job for default target initrd.target. Jan 13 20:40:28.794023 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:40:28.794029 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 20:40:28.794037 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 20:40:28.794044 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:40:28.794050 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 20:40:28.794057 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 20:40:28.794064 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 20:40:28.794071 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 20:40:28.794077 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:40:28.794085 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:40:28.794091 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:40:28.794097 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:40:28.794103 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:40:28.794110 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:40:28.794116 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:40:28.794122 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:40:28.794129 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 20:40:28.794135 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 20:40:28.794143 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:40:28.794149 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:40:28.794156 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 20:40:28.794184 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:40:28.794191 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 20:40:28.794197 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:40:28.794204 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 20:40:28.794210 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 20:40:28.794218 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:40:28.794224 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:40:28.794230 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:40:28.794248 systemd-journald[216]: Collecting audit messages is disabled. Jan 13 20:40:28.794265 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 20:40:28.794272 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:40:28.794278 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 20:40:28.794285 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 20:40:28.794292 kernel: Bridge firewalling registered Jan 13 20:40:28.794299 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:40:28.794306 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:40:28.794312 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:40:28.794319 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:40:28.794325 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:40:28.794332 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 13 20:40:28.794339 systemd-journald[216]: Journal started Jan 13 20:40:28.794355 systemd-journald[216]: Runtime Journal (/run/log/journal/b838563f16134b0b8965585a0146e2f7) is 4.8M, max 38.6M, 33.8M free. Jan 13 20:40:28.752129 systemd-modules-load[217]: Inserted module 'overlay' Jan 13 20:40:28.776229 systemd-modules-load[217]: Inserted module 'br_netfilter' Jan 13 20:40:28.798511 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:40:28.798530 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:40:28.801520 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 20:40:28.809282 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:40:28.809555 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:40:28.809787 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:40:28.810732 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 20:40:28.815601 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:40:28.818880 dracut-cmdline[248]: dracut-dracut-053 Jan 13 20:40:28.821327 dracut-cmdline[248]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5 Jan 13 20:40:28.822044 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 13 20:40:28.837223 systemd-resolved[254]: Positive Trust Anchors: Jan 13 20:40:28.837231 systemd-resolved[254]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 20:40:28.837255 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:40:28.838841 systemd-resolved[254]: Defaulting to hostname 'linux'. Jan 13 20:40:28.839450 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:40:28.839597 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:40:28.865185 kernel: SCSI subsystem initialized Jan 13 20:40:28.870191 kernel: Loading iSCSI transport class v2.0-870. Jan 13 20:40:28.877190 kernel: iscsi: registered transport (tcp) Jan 13 20:40:28.889499 kernel: iscsi: registered transport (qla4xxx) Jan 13 20:40:28.889530 kernel: QLogic iSCSI HBA Driver Jan 13 20:40:28.909031 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 20:40:28.911269 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 20:40:28.926513 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 13 20:40:28.926544 kernel: device-mapper: uevent: version 1.0.3 Jan 13 20:40:28.927630 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 13 20:40:28.960174 kernel: raid6: avx2x4 gen() 47531 MB/s Jan 13 20:40:28.975172 kernel: raid6: avx2x2 gen() 52973 MB/s Jan 13 20:40:28.992373 kernel: raid6: avx2x1 gen() 43643 MB/s Jan 13 20:40:28.992410 kernel: raid6: using algorithm avx2x2 gen() 52973 MB/s Jan 13 20:40:29.010426 kernel: raid6: .... xor() 31417 MB/s, rmw enabled Jan 13 20:40:29.010526 kernel: raid6: using avx2x2 recovery algorithm Jan 13 20:40:29.024176 kernel: xor: automatically using best checksumming function avx Jan 13 20:40:29.115183 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 20:40:29.120682 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:40:29.124252 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:40:29.132199 systemd-udevd[434]: Using default interface naming scheme 'v255'. Jan 13 20:40:29.134749 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:40:29.146314 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 20:40:29.153056 dracut-pre-trigger[439]: rd.md=0: removing MD RAID activation Jan 13 20:40:29.169049 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:40:29.172249 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 20:40:29.245312 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:40:29.252314 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 20:40:29.262982 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 20:40:29.263625 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 13 20:40:29.264379 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:40:29.264721 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 20:40:29.268287 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 13 20:40:29.277381 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:40:29.312190 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 13 20:40:29.322262 kernel: vmw_pvscsi: using 64bit dma Jan 13 20:40:29.326172 kernel: vmw_pvscsi: max_id: 16 Jan 13 20:40:29.326194 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 13 20:40:29.334766 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 13 20:40:29.334794 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 13 20:40:29.334803 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 13 20:40:29.334811 kernel: vmw_pvscsi: using MSI-X Jan 13 20:40:29.343111 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 13 20:40:29.343263 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 13 20:40:29.344576 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 13 20:40:29.354586 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 13 20:40:29.354685 kernel: cryptd: max_cpu_qlen set to 1000 Jan 13 20:40:29.354695 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 13 20:40:29.354770 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 13 20:40:29.349061 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:40:29.349098 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:40:29.349296 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:40:29.349388 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Jan 13 20:40:29.349414 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:40:29.349516 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:40:29.355275 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:40:29.362190 kernel: AVX2 version of gcm_enc/dec engaged. Jan 13 20:40:29.367216 kernel: AES CTR mode by8 optimization enabled Jan 13 20:40:29.367256 kernel: libata version 3.00 loaded. Jan 13 20:40:29.375217 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 13 20:40:29.425182 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 13 20:40:29.425268 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 13 20:40:29.425332 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 13 20:40:29.425400 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 13 20:40:29.425463 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 13 20:40:29.425538 kernel: scsi host1: ata_piix Jan 13 20:40:29.425603 kernel: scsi host2: ata_piix Jan 13 20:40:29.425663 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 13 20:40:29.425671 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 13 20:40:29.425679 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:40:29.425688 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 13 20:40:29.383886 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:40:29.388251 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:40:29.394749 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 13 20:40:29.553212 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 13 20:40:29.558174 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 13 20:40:29.585223 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 13 20:40:29.601375 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 13 20:40:29.601388 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 13 20:40:29.851194 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (486) Jan 13 20:40:29.854755 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 13 20:40:29.858526 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 13 20:40:29.861666 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 20:40:29.863455 kernel: BTRFS: device fsid 7f507843-6957-466b-8fb7-5bee228b170a devid 1 transid 44 /dev/sda3 scanned by (udev-worker) (481) Jan 13 20:40:29.867655 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 13 20:40:29.867772 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 13 20:40:29.878253 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 13 20:40:29.899185 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:40:29.905178 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:40:30.904182 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:40:30.904219 disk-uuid[595]: The operation has completed successfully. Jan 13 20:40:30.975642 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 20:40:30.975710 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 20:40:30.980263 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Jan 13 20:40:30.985943 sh[609]: Success Jan 13 20:40:31.000178 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 13 20:40:31.395350 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 20:40:31.406120 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 20:40:31.406651 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 20:40:31.439438 kernel: BTRFS info (device dm-0): first mount of filesystem 7f507843-6957-466b-8fb7-5bee228b170a Jan 13 20:40:31.439473 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:40:31.439488 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 20:40:31.440519 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 20:40:31.441313 kernel: BTRFS info (device dm-0): using free space tree Jan 13 20:40:31.526177 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 13 20:40:31.528787 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 13 20:40:31.537277 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 13 20:40:31.538564 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 13 20:40:31.614187 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:31.614236 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:40:31.616173 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:40:31.692178 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:40:31.707222 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 20:40:31.709192 kernel: BTRFS info (device sda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:31.724913 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Jan 13 20:40:31.728247 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 20:40:31.850122 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:40:31.857289 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 13 20:40:31.901382 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:40:31.906324 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 20:40:31.918467 systemd-networkd[799]: lo: Link UP Jan 13 20:40:31.918472 systemd-networkd[799]: lo: Gained carrier Jan 13 20:40:31.919393 systemd-networkd[799]: Enumeration completed Jan 13 20:40:31.919733 systemd-networkd[799]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jan 13 20:40:31.920000 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:40:31.920227 systemd[1]: Reached target network.target - Network. 
Jan 13 20:40:31.922207 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 20:40:31.922349 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 20:40:31.923366 systemd-networkd[799]: ens192: Link UP Jan 13 20:40:31.923372 systemd-networkd[799]: ens192: Gained carrier Jan 13 20:40:32.020716 ignition[671]: Ignition 2.20.0 Jan 13 20:40:32.020723 ignition[671]: Stage: fetch-offline Jan 13 20:40:32.020746 ignition[671]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:32.020752 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:32.020807 ignition[671]: parsed url from cmdline: "" Jan 13 20:40:32.020809 ignition[671]: no config URL provided Jan 13 20:40:32.020812 ignition[671]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 20:40:32.020816 ignition[671]: no config at "/usr/lib/ignition/user.ign" Jan 13 20:40:32.021249 ignition[671]: config successfully fetched Jan 13 20:40:32.021266 ignition[671]: parsing config with SHA512: 8f4577f0b94c1b1519d9a838ddd204133562f7b3d3034002ef3056cdc68b89395eb0a086d911ba8c930dec4216fd8106d97f38ddd3ea1b860e9d91974775ca2e Jan 13 20:40:32.024108 unknown[671]: fetched base config from "system" Jan 13 20:40:32.024117 unknown[671]: fetched user config from "vmware" Jan 13 20:40:32.024392 ignition[671]: fetch-offline: fetch-offline passed Jan 13 20:40:32.024438 ignition[671]: Ignition finished successfully Jan 13 20:40:32.025254 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:40:32.025464 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jan 13 20:40:32.030269 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Jan 13 20:40:32.039415 ignition[808]: Ignition 2.20.0 Jan 13 20:40:32.039425 ignition[808]: Stage: kargs Jan 13 20:40:32.039568 ignition[808]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:32.039577 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:32.040520 ignition[808]: kargs: kargs passed Jan 13 20:40:32.040561 ignition[808]: Ignition finished successfully Jan 13 20:40:32.041493 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 20:40:32.046357 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 20:40:32.055054 ignition[815]: Ignition 2.20.0 Jan 13 20:40:32.055065 ignition[815]: Stage: disks Jan 13 20:40:32.055228 ignition[815]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:32.055238 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:32.056060 ignition[815]: disks: disks passed Jan 13 20:40:32.056105 ignition[815]: Ignition finished successfully Jan 13 20:40:32.056801 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 20:40:32.057273 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 20:40:32.057380 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 20:40:32.057480 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:40:32.057564 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:40:32.057648 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:40:32.061295 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 20:40:32.223877 systemd-fsck[824]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 20:40:32.231736 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 20:40:32.239353 systemd[1]: Mounting sysroot.mount - /sysroot... 
Jan 13 20:40:32.430174 kernel: EXT4-fs (sda9): mounted filesystem 59ba8ffc-e6b0-4bb4-a36e-13a47bd6ad99 r/w with ordered data mode. Quota mode: none. Jan 13 20:40:32.430798 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 20:40:32.431313 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 20:40:32.439291 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:40:32.441240 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 20:40:32.442403 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 20:40:32.442441 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 20:40:32.442459 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:40:32.450567 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 20:40:32.451709 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 13 20:40:32.454179 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (832) Jan 13 20:40:32.457911 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:32.457956 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:40:32.457965 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:40:32.464390 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:40:32.465525 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 13 20:40:32.495512 initrd-setup-root[856]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 20:40:32.506650 initrd-setup-root[863]: cut: /sysroot/etc/group: No such file or directory Jan 13 20:40:32.513792 initrd-setup-root[870]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 20:40:32.527220 initrd-setup-root[877]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 20:40:33.010222 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 20:40:33.014243 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 20:40:33.016695 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 20:40:33.020759 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 13 20:40:33.022184 kernel: BTRFS info (device sda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:33.033644 ignition[944]: INFO : Ignition 2.20.0 Jan 13 20:40:33.033644 ignition[944]: INFO : Stage: mount Jan 13 20:40:33.034022 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:33.034022 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:33.034315 ignition[944]: INFO : mount: mount passed Jan 13 20:40:33.034315 ignition[944]: INFO : Ignition finished successfully Jan 13 20:40:33.034740 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 13 20:40:33.040265 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 13 20:40:33.045359 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:40:33.050871 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jan 13 20:40:33.059267 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (954) Jan 13 20:40:33.061700 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:40:33.061733 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:40:33.061742 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:40:33.068175 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:40:33.069366 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 20:40:33.089060 ignition[974]: INFO : Ignition 2.20.0 Jan 13 20:40:33.089060 ignition[974]: INFO : Stage: files Jan 13 20:40:33.089797 ignition[974]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:33.089797 ignition[974]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:33.089797 ignition[974]: DEBUG : files: compiled without relabeling support, skipping Jan 13 20:40:33.093010 ignition[974]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 13 20:40:33.093010 ignition[974]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 13 20:40:33.097284 ignition[974]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 13 20:40:33.097628 ignition[974]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 13 20:40:33.097942 ignition[974]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 13 20:40:33.097882 unknown[974]: wrote ssh authorized keys file for user: core Jan 13 20:40:33.101137 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 13 20:40:33.101137 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jan 13 20:40:33.147991 
ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 13 20:40:33.222477 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 13 20:40:33.223044 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 13 20:40:33.223044 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 13 20:40:33.223044 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 13 20:40:33.224223 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Jan 13 20:40:33.538272 systemd-networkd[799]: ens192: Gained IPv6LL Jan 13 20:40:33.742278 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 13 20:40:34.177321 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 13 20:40:34.177747 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 13 20:40:34.177747 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(e): [started] 
processing unit "coreos-metadata.service" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Jan 13 20:40:34.177747 ignition[974]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Jan 13 20:40:34.397328 ignition[974]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Jan 13 20:40:34.399850 ignition[974]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Jan 13 20:40:34.399850 ignition[974]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Jan 13 20:40:34.399850 ignition[974]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Jan 13 20:40:34.399850 ignition[974]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Jan 13 20:40:34.401079 ignition[974]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 13 20:40:34.401079 ignition[974]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 13 20:40:34.401079 ignition[974]: INFO : files: files passed Jan 13 20:40:34.401079 ignition[974]: INFO : Ignition finished successfully Jan 13 20:40:34.400983 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 13 20:40:34.406272 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... 
Jan 13 20:40:34.407396 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 13 20:40:34.408459 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 13 20:40:34.408649 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 13 20:40:34.415057 initrd-setup-root-after-ignition[1005]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:40:34.415057 initrd-setup-root-after-ignition[1005]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:40:34.416080 initrd-setup-root-after-ignition[1009]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:40:34.416871 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 20:40:34.417303 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 13 20:40:34.421269 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 13 20:40:34.440562 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 13 20:40:34.440824 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 13 20:40:34.441458 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 13 20:40:34.441709 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 13 20:40:34.441942 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 13 20:40:34.442620 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 13 20:40:34.452191 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 20:40:34.456263 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 13 20:40:34.462269 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. 
Jan 13 20:40:34.462466 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:40:34.462683 systemd[1]: Stopped target timers.target - Timer Units. Jan 13 20:40:34.462871 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 13 20:40:34.462949 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 20:40:34.463326 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 13 20:40:34.463488 systemd[1]: Stopped target basic.target - Basic System. Jan 13 20:40:34.463682 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 13 20:40:34.463887 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:40:34.464219 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 13 20:40:34.464614 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 13 20:40:34.464856 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 13 20:40:34.465099 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 13 20:40:34.465301 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 13 20:40:34.465518 systemd[1]: Stopped target swap.target - Swaps. Jan 13 20:40:34.465662 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 13 20:40:34.465745 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:40:34.466120 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:40:34.466350 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:40:34.466543 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 13 20:40:34.466591 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:40:34.466766 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Jan 13 20:40:34.466835 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 13 20:40:34.467134 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 13 20:40:34.467220 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:40:34.467494 systemd[1]: Stopped target paths.target - Path Units. Jan 13 20:40:34.467631 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 13 20:40:34.471213 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 20:40:34.471461 systemd[1]: Stopped target slices.target - Slice Units. Jan 13 20:40:34.471655 systemd[1]: Stopped target sockets.target - Socket Units. Jan 13 20:40:34.471827 systemd[1]: iscsid.socket: Deactivated successfully. Jan 13 20:40:34.471894 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:40:34.472092 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 13 20:40:34.472149 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:40:34.472352 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 13 20:40:34.472429 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 20:40:34.472705 systemd[1]: ignition-files.service: Deactivated successfully. Jan 13 20:40:34.472792 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 13 20:40:34.481601 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 13 20:40:34.481903 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 13 20:40:34.482266 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:40:34.484363 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 13 20:40:34.484613 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Jan 13 20:40:34.484886 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:40:34.485299 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 13 20:40:34.485586 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:40:34.491210 ignition[1030]: INFO : Ignition 2.20.0 Jan 13 20:40:34.491210 ignition[1030]: INFO : Stage: umount Jan 13 20:40:34.497735 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:40:34.497735 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:40:34.497735 ignition[1030]: INFO : umount: umount passed Jan 13 20:40:34.497735 ignition[1030]: INFO : Ignition finished successfully Jan 13 20:40:34.496351 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 13 20:40:34.496416 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 13 20:40:34.496804 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 13 20:40:34.496851 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 13 20:40:34.498342 systemd[1]: Stopped target network.target - Network. Jan 13 20:40:34.499354 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 13 20:40:34.499414 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 13 20:40:34.499568 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 13 20:40:34.499594 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 13 20:40:34.499711 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 13 20:40:34.499742 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 13 20:40:34.499852 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 13 20:40:34.499873 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 13 20:40:34.500065 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Jan 13 20:40:34.500231 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 13 20:40:34.506510 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 13 20:40:34.507963 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 13 20:40:34.508045 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 13 20:40:34.508936 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 13 20:40:34.508985 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:40:34.509975 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 13 20:40:34.510247 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 13 20:40:34.511013 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 13 20:40:34.511410 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:40:34.515308 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 13 20:40:34.515656 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 13 20:40:34.516030 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:40:34.516359 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Jan 13 20:40:34.516392 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:40:34.516527 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 13 20:40:34.516557 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:40:34.516675 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 13 20:40:34.516704 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 13 20:40:34.516883 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 13 20:40:34.525448 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 13 20:40:34.525702 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 13 20:40:34.530771 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 13 20:40:34.530884 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:40:34.531348 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 13 20:40:34.531385 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 13 20:40:34.531617 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 13 20:40:34.531640 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 20:40:34.531795 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 13 20:40:34.531827 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:40:34.532112 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 13 20:40:34.532141 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 13 20:40:34.532434 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:40:34.532463 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:40:34.538310 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 13 20:40:34.538426 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 13 20:40:34.538461 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:40:34.538597 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 20:40:34.538619 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:40:34.542033 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. 
Jan 13 20:40:34.542105 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 13 20:40:34.622631 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 13 20:40:34.622698 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 13 20:40:34.623139 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 13 20:40:34.623280 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 13 20:40:34.623310 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 13 20:40:34.625298 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 13 20:40:34.633363 systemd[1]: Switching root. Jan 13 20:40:34.668250 systemd-journald[216]: Journal stopped Jan 13 20:40:35.700705 systemd-journald[216]: Received SIGTERM from PID 1 (systemd). Jan 13 20:40:35.700728 kernel: SELinux: policy capability network_peer_controls=1 Jan 13 20:40:35.700737 kernel: SELinux: policy capability open_perms=1 Jan 13 20:40:35.700743 kernel: SELinux: policy capability extended_socket_class=1 Jan 13 20:40:35.700748 kernel: SELinux: policy capability always_check_network=0 Jan 13 20:40:35.700754 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 13 20:40:35.700762 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 13 20:40:35.700768 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 13 20:40:35.700774 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 13 20:40:35.700779 kernel: audit: type=1403 audit(1736800835.204:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 13 20:40:35.700786 systemd[1]: Successfully loaded SELinux policy in 33.760ms. Jan 13 20:40:35.700793 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 6.936ms. 
Jan 13 20:40:35.700801 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:40:35.700812 systemd[1]: Detected virtualization vmware. Jan 13 20:40:35.700823 systemd[1]: Detected architecture x86-64. Jan 13 20:40:35.700834 systemd[1]: Detected first boot. Jan 13 20:40:35.700847 systemd[1]: Initializing machine ID from random generator. Jan 13 20:40:35.700857 zram_generator::config[1072]: No configuration found. Jan 13 20:40:35.700865 systemd[1]: Populated /etc with preset unit settings. Jan 13 20:40:35.700872 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:40:35.700879 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Jan 13 20:40:35.700886 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 13 20:40:35.700893 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 13 20:40:35.700900 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 13 20:40:35.700906 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 13 20:40:35.700915 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 13 20:40:35.700925 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 13 20:40:35.700937 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 13 20:40:35.700948 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. 
Jan 13 20:40:35.700959 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 13 20:40:35.700971 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 13 20:40:35.700981 systemd[1]: Created slice user.slice - User and Session Slice. Jan 13 20:40:35.700991 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:40:35.700998 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 20:40:35.701005 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 13 20:40:35.701012 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 13 20:40:35.701019 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 13 20:40:35.701026 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:40:35.701033 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 13 20:40:35.701039 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:40:35.701052 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 13 20:40:35.701065 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 13 20:40:35.701076 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 13 20:40:35.701086 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 13 20:40:35.701096 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:40:35.701107 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 20:40:35.701114 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:40:35.701123 systemd[1]: Reached target swap.target - Swaps. 
Jan 13 20:40:35.701130 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 13 20:40:35.701137 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 13 20:40:35.701144 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:40:35.701151 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:40:35.705962 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 20:40:35.705984 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 13 20:40:35.705998 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 13 20:40:35.706012 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 13 20:40:35.706024 systemd[1]: Mounting media.mount - External Media Directory... Jan 13 20:40:35.706037 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:40:35.706049 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 13 20:40:35.706062 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 13 20:40:35.706077 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 13 20:40:35.706087 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 13 20:40:35.706097 systemd[1]: Reached target machines.target - Containers. Jan 13 20:40:35.706107 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 13 20:40:35.706118 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Jan 13 20:40:35.706130 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jan 13 20:40:35.706142 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 13 20:40:35.706154 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:40:35.706238 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 13 20:40:35.706254 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:40:35.706266 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 13 20:40:35.706278 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:40:35.706291 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 13 20:40:35.706304 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 13 20:40:35.706317 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 13 20:40:35.706329 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 13 20:40:35.706340 systemd[1]: Stopped systemd-fsck-usr.service. Jan 13 20:40:35.706355 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:40:35.706367 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:40:35.706380 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 13 20:40:35.706393 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 13 20:40:35.706405 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 20:40:35.706417 systemd[1]: verity-setup.service: Deactivated successfully. Jan 13 20:40:35.706428 systemd[1]: Stopped verity-setup.service. Jan 13 20:40:35.706440 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Jan 13 20:40:35.706454 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 13 20:40:35.706465 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 13 20:40:35.706476 systemd[1]: Mounted media.mount - External Media Directory. Jan 13 20:40:35.706489 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 13 20:40:35.706501 kernel: fuse: init (API version 7.39) Jan 13 20:40:35.706512 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 13 20:40:35.706525 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 13 20:40:35.706561 systemd-journald[1170]: Collecting audit messages is disabled. Jan 13 20:40:35.706592 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 13 20:40:35.706606 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:40:35.706618 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 13 20:40:35.706630 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 13 20:40:35.706641 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:40:35.706656 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:40:35.706669 systemd-journald[1170]: Journal started Jan 13 20:40:35.706692 systemd-journald[1170]: Runtime Journal (/run/log/journal/5a2e2e195e7343d09f2f9696c952b15c) is 4.8M, max 38.6M, 33.8M free. Jan 13 20:40:35.522974 systemd[1]: Queued start job for default target multi-user.target. Jan 13 20:40:35.541973 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 13 20:40:35.542200 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 13 20:40:35.707444 jq[1139]: true Jan 13 20:40:35.712224 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:40:35.708589 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Jan 13 20:40:35.708678 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 20:40:35.708915 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 13 20:40:35.708993 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 13 20:40:35.709271 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:40:35.709502 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 13 20:40:35.709734 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 13 20:40:35.720716 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 13 20:40:35.722170 kernel: loop: module loaded Jan 13 20:40:35.727186 jq[1181]: true Jan 13 20:40:35.727233 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 13 20:40:35.729302 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 13 20:40:35.729448 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 13 20:40:35.729470 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:40:35.730217 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 13 20:40:35.739014 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 13 20:40:35.741512 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 13 20:40:35.741704 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:40:35.753617 kernel: ACPI: bus type drm_connector registered Jan 13 20:40:35.763278 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Jan 13 20:40:35.771428 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 13 20:40:35.771597 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 20:40:35.772525 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 13 20:40:35.773966 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 20:40:35.777703 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 13 20:40:35.779332 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 13 20:40:35.780925 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 20:40:35.781723 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 20:40:35.781978 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:40:35.782150 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:40:35.782863 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 13 20:40:35.783024 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 13 20:40:35.783592 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 13 20:40:35.786050 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 20:40:35.792488 systemd-journald[1170]: Time spent on flushing to /var/log/journal/5a2e2e195e7343d09f2f9696c952b15c is 60.280ms for 1833 entries. Jan 13 20:40:35.792488 systemd-journald[1170]: System Journal (/var/log/journal/5a2e2e195e7343d09f2f9696c952b15c) is 8.0M, max 584.8M, 576.8M free. Jan 13 20:40:35.870520 systemd-journald[1170]: Received client request to flush runtime journal. 
Jan 13 20:40:35.870545 kernel: loop0: detected capacity change from 0 to 138184 Jan 13 20:40:35.819237 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 13 20:40:35.819444 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 13 20:40:35.828379 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 13 20:40:35.860120 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:40:35.879977 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 13 20:40:35.893369 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 13 20:40:35.899490 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 13 20:40:35.899960 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 13 20:40:35.919287 kernel: loop1: detected capacity change from 0 to 141000 Jan 13 20:40:35.921818 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 13 20:40:35.926319 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:40:35.927214 ignition[1208]: Ignition 2.20.0 Jan 13 20:40:35.927899 ignition[1208]: deleting config from guestinfo properties Jan 13 20:40:35.931407 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:40:35.933427 ignition[1208]: Successfully deleted config Jan 13 20:40:35.938577 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 13 20:40:35.939194 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Jan 13 20:40:35.954798 udevadm[1233]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Jan 13 20:40:35.959276 systemd-tmpfiles[1231]: ACLs are not supported, ignoring. 
Jan 13 20:40:35.959288 systemd-tmpfiles[1231]: ACLs are not supported, ignoring. Jan 13 20:40:35.965006 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:40:35.969179 kernel: loop2: detected capacity change from 0 to 2960 Jan 13 20:40:36.019184 kernel: loop3: detected capacity change from 0 to 210664 Jan 13 20:40:36.066245 kernel: loop4: detected capacity change from 0 to 138184 Jan 13 20:40:36.103236 kernel: loop5: detected capacity change from 0 to 141000 Jan 13 20:40:36.129184 kernel: loop6: detected capacity change from 0 to 2960 Jan 13 20:40:36.147724 kernel: loop7: detected capacity change from 0 to 210664 Jan 13 20:40:36.183798 (sd-merge)[1240]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Jan 13 20:40:36.184092 (sd-merge)[1240]: Merged extensions into '/usr'. Jan 13 20:40:36.188238 systemd[1]: Reloading requested from client PID 1206 ('systemd-sysext') (unit systemd-sysext.service)... Jan 13 20:40:36.188326 systemd[1]: Reloading... Jan 13 20:40:36.252180 zram_generator::config[1268]: No configuration found. Jan 13 20:40:36.334230 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:40:36.351742 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:40:36.381695 systemd[1]: Reloading finished in 193 ms. Jan 13 20:40:36.413606 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 13 20:40:36.420312 systemd[1]: Starting ensure-sysext.service... Jan 13 20:40:36.421850 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 13 20:40:36.434097 systemd-tmpfiles[1323]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 13 20:40:36.434283 systemd-tmpfiles[1323]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 13 20:40:36.434768 systemd-tmpfiles[1323]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 13 20:40:36.434937 systemd-tmpfiles[1323]: ACLs are not supported, ignoring. Jan 13 20:40:36.434975 systemd-tmpfiles[1323]: ACLs are not supported, ignoring. Jan 13 20:40:36.474076 systemd-tmpfiles[1323]: Detected autofs mount point /boot during canonicalization of boot. Jan 13 20:40:36.474082 systemd-tmpfiles[1323]: Skipping /boot Jan 13 20:40:36.479460 systemd[1]: Reloading requested from client PID 1322 ('systemctl') (unit ensure-sysext.service)... Jan 13 20:40:36.479539 systemd[1]: Reloading... Jan 13 20:40:36.480015 systemd-tmpfiles[1323]: Detected autofs mount point /boot during canonicalization of boot. Jan 13 20:40:36.480018 systemd-tmpfiles[1323]: Skipping /boot Jan 13 20:40:36.506170 ldconfig[1201]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 13 20:40:36.529254 zram_generator::config[1351]: No configuration found. Jan 13 20:40:36.589240 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:40:36.605752 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:40:36.635095 systemd[1]: Reloading finished in 155 ms. Jan 13 20:40:36.647355 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. 
Jan 13 20:40:36.647782 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 13 20:40:36.653782 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:40:36.659897 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 20:40:36.663361 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 13 20:40:36.665356 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 13 20:40:36.674481 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 13 20:40:36.677609 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:40:36.680581 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 13 20:40:36.683218 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:40:36.689776 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:40:36.692964 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:40:36.695090 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:40:36.695309 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:40:36.703360 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 13 20:40:36.703463 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:40:36.703992 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:40:36.705205 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
Jan 13 20:40:36.708746 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:40:36.708847 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:40:36.712986 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:40:36.720421 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:40:36.722147 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:40:36.722334 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:40:36.722412 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:40:36.723308 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 13 20:40:36.724193 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 13 20:40:36.724561 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 20:40:36.724738 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 20:40:36.726074 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:40:36.726308 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:40:36.727924 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:40:36.728012 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:40:36.733080 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:40:36.740429 systemd-udevd[1420]: Using default interface naming scheme 'v255'. Jan 13 20:40:36.740590 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Jan 13 20:40:36.743983 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 13 20:40:36.747657 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:40:36.750209 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:40:36.750411 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:40:36.752948 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 13 20:40:36.753802 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:40:36.754448 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 13 20:40:36.756671 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:40:36.756784 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:40:36.758008 systemd[1]: Finished ensure-sysext.service. Jan 13 20:40:36.760569 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 20:40:36.761067 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 20:40:36.765363 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 20:40:36.765486 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 20:40:36.766683 augenrules[1448]: No rules Jan 13 20:40:36.766782 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 20:40:36.775355 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 13 20:40:36.775716 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 20:40:36.775856 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Jan 13 20:40:36.777102 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jan 13 20:40:36.779058 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 20:40:36.779281 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 20:40:36.779925 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 13 20:40:36.804167 systemd-resolved[1413]: Positive Trust Anchors:
Jan 13 20:40:36.804178 systemd-resolved[1413]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 20:40:36.804202 systemd-resolved[1413]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 13 20:40:36.807489 systemd-resolved[1413]: Defaulting to hostname 'linux'.
Jan 13 20:40:36.809322 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 13 20:40:36.809580 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:40:36.815289 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:40:36.823708 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 13 20:40:36.823915 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jan 13 20:40:36.824690 systemd[1]: Reached target time-set.target - System Time Set.
Jan 13 20:40:36.856154 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jan 13 20:40:36.857426 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jan 13 20:40:36.867296 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Jan 13 20:40:36.874475 systemd-networkd[1475]: lo: Link UP
Jan 13 20:40:36.874665 systemd-networkd[1475]: lo: Gained carrier
Jan 13 20:40:36.876363 systemd-networkd[1475]: Enumeration completed
Jan 13 20:40:36.876694 systemd-networkd[1475]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Jan 13 20:40:36.877223 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 13 20:40:36.877388 systemd[1]: Reached target network.target - Network.
Jan 13 20:40:36.883647 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Jan 13 20:40:36.883815 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Jan 13 20:40:36.883350 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jan 13 20:40:36.887028 systemd-networkd[1475]: ens192: Link UP
Jan 13 20:40:36.887419 systemd-networkd[1475]: ens192: Gained carrier
Jan 13 20:40:36.893353 systemd-timesyncd[1466]: Network configuration changed, trying to establish connection.
Jan 13 20:40:36.928172 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Jan 13 20:40:36.937183 kernel: ACPI: button: Power Button [PWRF]
Jan 13 20:40:36.953222 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 44 scanned by (udev-worker) (1480)
Jan 13 20:40:36.977193 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Jan 13 20:40:36.995637 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Jan 13 20:40:37.001913 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input4
Jan 13 20:40:37.002554 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jan 13 20:40:37.004179 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Jan 13 20:40:37.004664 kernel: Guest personality initialized and is active
Jan 13 20:40:37.006240 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jan 13 20:40:37.006275 kernel: Initialized host personality
Jan 13 20:40:37.013763 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jan 13 20:40:37.032178 kernel: mousedev: PS/2 mouse device common for all mice
Jan 13 20:40:37.035446 (udev-worker)[1473]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Jan 13 20:40:37.047273 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:40:37.048118 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Jan 13 20:40:37.050001 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Jan 13 20:40:37.062307 lvm[1512]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 13 20:40:37.086612 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Jan 13 20:40:37.087231 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:40:37.092876 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Jan 13 20:40:37.095937 lvm[1517]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 13 20:40:37.107120 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:40:37.107446 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 13 20:40:37.107632 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jan 13 20:40:37.107774 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jan 13 20:40:37.107995 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jan 13 20:40:37.108229 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jan 13 20:40:37.108357 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jan 13 20:40:37.108478 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jan 13 20:40:37.108501 systemd[1]: Reached target paths.target - Path Units.
Jan 13 20:40:37.108595 systemd[1]: Reached target timers.target - Timer Units.
Jan 13 20:40:37.109482 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jan 13 20:40:37.110832 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jan 13 20:40:37.114478 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jan 13 20:40:37.115151 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Jan 13 20:40:37.115370 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jan 13 20:40:37.115803 systemd[1]: Reached target sockets.target - Socket Units.
Jan 13 20:40:37.115925 systemd[1]: Reached target basic.target - Basic System.
Jan 13 20:40:37.116064 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jan 13 20:40:37.116080 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jan 13 20:40:37.117021 systemd[1]: Starting containerd.service - containerd container runtime...
Jan 13 20:40:37.120449 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jan 13 20:40:37.122879 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jan 13 20:40:37.126341 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jan 13 20:40:37.127311 jq[1525]: false
Jan 13 20:40:37.127246 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jan 13 20:40:37.129284 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jan 13 20:40:37.131221 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jan 13 20:40:37.136274 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jan 13 20:40:37.138456 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jan 13 20:40:37.146907 systemd[1]: Starting systemd-logind.service - User Login Management...
Jan 13 20:40:37.147330 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jan 13 20:40:37.147833 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jan 13 20:40:37.148461 systemd[1]: Starting update-engine.service - Update Engine...
Jan 13 20:40:37.151267 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jan 13 20:40:37.153943 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Jan 13 20:40:37.161429 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jan 13 20:40:37.161552 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jan 13 20:40:37.163362 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jan 13 20:40:37.163466 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found loop4
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found loop5
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found loop6
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found loop7
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found sda
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found sda1
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found sda2
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found sda3
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found usr
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found sda4
Jan 13 20:40:37.172747 extend-filesystems[1526]: Found sda6
Jan 13 20:40:37.177104 extend-filesystems[1526]: Found sda7
Jan 13 20:40:37.177104 extend-filesystems[1526]: Found sda9
Jan 13 20:40:37.177104 extend-filesystems[1526]: Checking size of /dev/sda9
Jan 13 20:40:37.179251 systemd[1]: motdgen.service: Deactivated successfully.
Jan 13 20:40:37.180766 jq[1536]: true
Jan 13 20:40:37.179563 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jan 13 20:40:37.190674 (ntainerd)[1551]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Jan 13 20:40:37.199902 update_engine[1535]: I20250113 20:40:37.199153 1535 main.cc:92] Flatcar Update Engine starting
Jan 13 20:40:37.204924 extend-filesystems[1526]: Old size kept for /dev/sda9
Jan 13 20:40:37.204924 extend-filesystems[1526]: Found sr0
Jan 13 20:40:37.205403 tar[1545]: linux-amd64/helm
Jan 13 20:40:37.202305 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Jan 13 20:40:37.202619 systemd[1]: extend-filesystems.service: Deactivated successfully.
Jan 13 20:40:37.202740 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Jan 13 20:40:37.206649 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Jan 13 20:40:37.218970 jq[1554]: true
Jan 13 20:40:37.219469 dbus-daemon[1524]: [system] SELinux support is enabled
Jan 13 20:40:37.219579 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jan 13 20:40:37.232971 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jan 13 20:40:37.234346 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jan 13 20:40:37.234549 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jan 13 20:40:37.234561 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jan 13 20:40:37.242086 systemd[1]: Started update-engine.service - Update Engine.
Jan 13 20:40:37.247383 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jan 13 20:40:37.247718 update_engine[1535]: I20250113 20:40:37.242692 1535 update_check_scheduler.cc:74] Next update check in 9m35s
Jan 13 20:40:37.261969 systemd-logind[1532]: Watching system buttons on /dev/input/event1 (Power Button)
Jan 13 20:40:37.264040 systemd-logind[1532]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Jan 13 20:40:37.266019 systemd-logind[1532]: New seat seat0.
Jan 13 20:40:37.272209 unknown[1560]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Jan 13 20:40:37.277857 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Jan 13 20:40:37.278071 systemd[1]: Started systemd-logind.service - User Login Management.
Jan 13 20:40:37.283462 unknown[1560]: Core dump limit set to -1
Jan 13 20:40:37.289669 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 44 scanned by (udev-worker) (1484)
Jan 13 20:40:37.296198 kernel: NET: Registered PF_VSOCK protocol family
Jan 13 20:40:37.364116 locksmithd[1569]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jan 13 20:40:37.416335 bash[1587]: Updated "/home/core/.ssh/authorized_keys"
Jan 13 20:40:37.418626 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jan 13 20:40:37.419197 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Jan 13 20:40:37.461849 sshd_keygen[1555]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Jan 13 20:40:37.476277 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Jan 13 20:40:37.481354 systemd[1]: Starting issuegen.service - Generate /run/issue...
Jan 13 20:40:37.485660 systemd[1]: issuegen.service: Deactivated successfully.
Jan 13 20:40:37.485786 systemd[1]: Finished issuegen.service - Generate /run/issue.
Jan 13 20:40:37.490409 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Jan 13 20:40:37.500641 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Jan 13 20:40:37.505467 systemd[1]: Started getty@tty1.service - Getty on tty1.
Jan 13 20:40:37.508406 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Jan 13 20:40:37.509682 systemd[1]: Reached target getty.target - Login Prompts.
Jan 13 20:40:37.549284 containerd[1551]: time="2025-01-13T20:40:37.547818789Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Jan 13 20:40:37.567624 containerd[1551]: time="2025-01-13T20:40:37.567593220Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Jan 13 20:40:37.568688 containerd[1551]: time="2025-01-13T20:40:37.568665218Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:40:37.568688 containerd[1551]: time="2025-01-13T20:40:37.568683354Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Jan 13 20:40:37.568744 containerd[1551]: time="2025-01-13T20:40:37.568693412Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Jan 13 20:40:37.568791 containerd[1551]: time="2025-01-13T20:40:37.568780475Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Jan 13 20:40:37.568808 containerd[1551]: time="2025-01-13T20:40:37.568792093Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Jan 13 20:40:37.568841 containerd[1551]: time="2025-01-13T20:40:37.568830127Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:40:37.568857 containerd[1551]: time="2025-01-13T20:40:37.568840350Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Jan 13 20:40:37.568947 containerd[1551]: time="2025-01-13T20:40:37.568936065Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:40:37.568963 containerd[1551]: time="2025-01-13T20:40:37.568946376Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Jan 13 20:40:37.568963 containerd[1551]: time="2025-01-13T20:40:37.568953984Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:40:37.568963 containerd[1551]: time="2025-01-13T20:40:37.568959207Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Jan 13 20:40:37.569014 containerd[1551]: time="2025-01-13T20:40:37.569002512Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Jan 13 20:40:37.569129 containerd[1551]: time="2025-01-13T20:40:37.569116183Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Jan 13 20:40:37.569190 containerd[1551]: time="2025-01-13T20:40:37.569178960Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:40:37.569190 containerd[1551]: time="2025-01-13T20:40:37.569188710Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Jan 13 20:40:37.569240 containerd[1551]: time="2025-01-13T20:40:37.569229792Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Jan 13 20:40:37.569269 containerd[1551]: time="2025-01-13T20:40:37.569258411Z" level=info msg="metadata content store policy set" policy=shared
Jan 13 20:40:37.575685 containerd[1551]: time="2025-01-13T20:40:37.575660928Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Jan 13 20:40:37.575746 containerd[1551]: time="2025-01-13T20:40:37.575697458Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Jan 13 20:40:37.575746 containerd[1551]: time="2025-01-13T20:40:37.575710039Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Jan 13 20:40:37.575746 containerd[1551]: time="2025-01-13T20:40:37.575720101Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Jan 13 20:40:37.575746 containerd[1551]: time="2025-01-13T20:40:37.575728250Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Jan 13 20:40:37.575826 containerd[1551]: time="2025-01-13T20:40:37.575815351Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Jan 13 20:40:37.575961 containerd[1551]: time="2025-01-13T20:40:37.575950862Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Jan 13 20:40:37.576019 containerd[1551]: time="2025-01-13T20:40:37.576007703Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Jan 13 20:40:37.576038 containerd[1551]: time="2025-01-13T20:40:37.576018513Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Jan 13 20:40:37.576038 containerd[1551]: time="2025-01-13T20:40:37.576027420Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Jan 13 20:40:37.576038 containerd[1551]: time="2025-01-13T20:40:37.576035005Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Jan 13 20:40:37.576075 containerd[1551]: time="2025-01-13T20:40:37.576042126Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Jan 13 20:40:37.576075 containerd[1551]: time="2025-01-13T20:40:37.576049130Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Jan 13 20:40:37.576075 containerd[1551]: time="2025-01-13T20:40:37.576056741Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Jan 13 20:40:37.576075 containerd[1551]: time="2025-01-13T20:40:37.576068551Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Jan 13 20:40:37.576127 containerd[1551]: time="2025-01-13T20:40:37.576076846Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Jan 13 20:40:37.576127 containerd[1551]: time="2025-01-13T20:40:37.576084745Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Jan 13 20:40:37.576127 containerd[1551]: time="2025-01-13T20:40:37.576090552Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Jan 13 20:40:37.576127 containerd[1551]: time="2025-01-13T20:40:37.576101546Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576127 containerd[1551]: time="2025-01-13T20:40:37.576109227Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576127 containerd[1551]: time="2025-01-13T20:40:37.576116122Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576127 containerd[1551]: time="2025-01-13T20:40:37.576123545Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576133048Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576140721Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576146809Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576168646Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576180735Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576189655Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576196152Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576205105Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576212600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576220998Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576232383Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576242788Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576249 containerd[1551]: time="2025-01-13T20:40:37.576249063Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Jan 13 20:40:37.576416 containerd[1551]: time="2025-01-13T20:40:37.576272632Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Jan 13 20:40:37.576416 containerd[1551]: time="2025-01-13T20:40:37.576282873Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Jan 13 20:40:37.576416 containerd[1551]: time="2025-01-13T20:40:37.576288853Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Jan 13 20:40:37.576416 containerd[1551]: time="2025-01-13T20:40:37.576295045Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Jan 13 20:40:37.576416 containerd[1551]: time="2025-01-13T20:40:37.576300194Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576416 containerd[1551]: time="2025-01-13T20:40:37.576306892Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Jan 13 20:40:37.576416 containerd[1551]: time="2025-01-13T20:40:37.576312291Z" level=info msg="NRI interface is disabled by configuration."
Jan 13 20:40:37.576416 containerd[1551]: time="2025-01-13T20:40:37.576318036Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Jan 13 20:40:37.576523 containerd[1551]: time="2025-01-13T20:40:37.576493622Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Jan 13 20:40:37.576523 containerd[1551]: time="2025-01-13T20:40:37.576523522Z" level=info msg="Connect containerd service"
Jan 13 20:40:37.576618 containerd[1551]: time="2025-01-13T20:40:37.576544982Z" level=info msg="using legacy CRI server"
Jan 13 20:40:37.576618 containerd[1551]: time="2025-01-13T20:40:37.576549773Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Jan 13 20:40:37.576618 containerd[1551]: time="2025-01-13T20:40:37.576610002Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.576954236Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577091295Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577115805Z" level=info msg=serving... address=/run/containerd/containerd.sock
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577254139Z" level=info msg="Start subscribing containerd event"
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577397484Z" level=info msg="Start recovering state"
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577442528Z" level=info msg="Start event monitor"
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577450213Z" level=info msg="Start snapshots syncer"
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577455941Z" level=info msg="Start cni network conf syncer for default"
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577465021Z" level=info msg="Start streaming server"
Jan 13 20:40:37.578035 containerd[1551]: time="2025-01-13T20:40:37.577502683Z" level=info msg="containerd successfully booted in 0.030171s"
Jan 13 20:40:37.577552 systemd[1]: Started containerd.service - containerd container runtime.
Jan 13 20:40:37.681115 tar[1545]: linux-amd64/LICENSE
Jan 13 20:40:37.681115 tar[1545]: linux-amd64/README.md
Jan 13 20:40:37.698657 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Jan 13 20:40:38.274284 systemd-networkd[1475]: ens192: Gained IPv6LL
Jan 13 20:40:38.274663 systemd-timesyncd[1466]: Network configuration changed, trying to establish connection.
Jan 13 20:40:38.275855 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Jan 13 20:40:38.276451 systemd[1]: Reached target network-online.target - Network is Online.
Jan 13 20:40:38.280411 systemd[1]: Starting coreos-metadata.service - VMware metadata agent...
Jan 13 20:40:38.284104 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:40:38.287483 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 13 20:40:38.367525 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 13 20:40:38.369060 systemd[1]: coreos-metadata.service: Deactivated successfully. Jan 13 20:40:38.369196 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Jan 13 20:40:38.369728 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 13 20:40:38.827830 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 13 20:40:38.828897 systemd[1]: Started sshd@0-139.178.70.106:22-194.0.234.37:26840.service - OpenSSH per-connection server daemon (194.0.234.37:26840). Jan 13 20:40:39.779123 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:40:39.779531 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 13 20:40:39.780176 systemd[1]: Startup finished in 1.006s (kernel) + 6.574s (initrd) + 4.608s (userspace) = 12.188s. Jan 13 20:40:39.784772 (kubelet)[1705]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:40:39.788430 agetty[1617]: failed to open credentials directory Jan 13 20:40:39.789024 agetty[1616]: failed to open credentials directory Jan 13 20:40:39.811913 login[1616]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 20:40:39.812368 login[1617]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 20:40:39.818013 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 13 20:40:39.823411 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 13 20:40:39.825102 systemd-logind[1532]: New session 2 of user core. Jan 13 20:40:39.828058 systemd-logind[1532]: New session 1 of user core. 
Jan 13 20:40:39.831583 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 13 20:40:39.838398 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 13 20:40:39.846183 (systemd)[1712]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 13 20:40:40.006973 systemd[1712]: Queued start job for default target default.target. Jan 13 20:40:40.017422 systemd[1712]: Created slice app.slice - User Application Slice. Jan 13 20:40:40.017447 systemd[1712]: Reached target paths.target - Paths. Jan 13 20:40:40.017457 systemd[1712]: Reached target timers.target - Timers. Jan 13 20:40:40.018216 systemd[1712]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 13 20:40:40.025332 systemd[1712]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 13 20:40:40.025370 systemd[1712]: Reached target sockets.target - Sockets. Jan 13 20:40:40.025380 systemd[1712]: Reached target basic.target - Basic System. Jan 13 20:40:40.025408 systemd[1712]: Reached target default.target - Main User Target. Jan 13 20:40:40.025426 systemd[1712]: Startup finished in 175ms. Jan 13 20:40:40.025438 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 13 20:40:40.026496 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 13 20:40:40.027090 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 13 20:40:40.963014 kubelet[1705]: E0113 20:40:40.962962 1705 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:40:40.964704 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:40:40.964832 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 13 20:40:42.264653 sshd[1698]: Invalid user ubnt from 194.0.234.37 port 26840 Jan 13 20:40:42.498805 sshd[1698]: Connection closed by invalid user ubnt 194.0.234.37 port 26840 [preauth] Jan 13 20:40:42.499984 systemd[1]: sshd@0-139.178.70.106:22-194.0.234.37:26840.service: Deactivated successfully. Jan 13 20:40:51.065568 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 13 20:40:51.076408 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:40:51.137001 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:40:51.139781 (kubelet)[1758]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:40:51.233550 kubelet[1758]: E0113 20:40:51.233512 1758 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:40:51.235767 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:40:51.235846 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:41:01.315678 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 13 20:41:01.324397 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:41:01.561147 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:41:01.564010 (kubelet)[1774]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:41:01.591565 kubelet[1774]: E0113 20:41:01.591497 1774 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:41:01.593102 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:41:01.593250 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:42:21.293428 systemd-resolved[1413]: Clock change detected. Flushing caches. Jan 13 20:42:21.293501 systemd-timesyncd[1466]: Contacted time server 74.6.168.72:123 (2.flatcar.pool.ntp.org). Jan 13 20:42:21.293544 systemd-timesyncd[1466]: Initial clock synchronization to Mon 2025-01-13 20:42:21.293383 UTC. Jan 13 20:42:24.622100 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 13 20:42:24.632909 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:42:24.865626 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:42:24.869063 (kubelet)[1790]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:42:24.916475 kubelet[1790]: E0113 20:42:24.916400 1790 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:42:24.917944 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:42:24.918029 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:42:30.284564 systemd[1]: Started sshd@1-139.178.70.106:22-147.75.109.163:51688.service - OpenSSH per-connection server daemon (147.75.109.163:51688). Jan 13 20:42:30.313941 sshd[1800]: Accepted publickey for core from 147.75.109.163 port 51688 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:42:30.314682 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:42:30.317213 systemd-logind[1532]: New session 3 of user core. Jan 13 20:42:30.326848 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 13 20:42:30.379895 systemd[1]: Started sshd@2-139.178.70.106:22-147.75.109.163:51704.service - OpenSSH per-connection server daemon (147.75.109.163:51704). Jan 13 20:42:30.411292 sshd[1805]: Accepted publickey for core from 147.75.109.163 port 51704 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:42:30.412482 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:42:30.417009 systemd-logind[1532]: New session 4 of user core. Jan 13 20:42:30.419828 systemd[1]: Started session-4.scope - Session 4 of User core. 
Jan 13 20:42:30.468750 sshd[1807]: Connection closed by 147.75.109.163 port 51704 Jan 13 20:42:30.469097 sshd-session[1805]: pam_unix(sshd:session): session closed for user core Jan 13 20:42:30.477188 systemd[1]: sshd@2-139.178.70.106:22-147.75.109.163:51704.service: Deactivated successfully. Jan 13 20:42:30.477936 systemd[1]: session-4.scope: Deactivated successfully. Jan 13 20:42:30.478632 systemd-logind[1532]: Session 4 logged out. Waiting for processes to exit. Jan 13 20:42:30.479360 systemd[1]: Started sshd@3-139.178.70.106:22-147.75.109.163:51712.service - OpenSSH per-connection server daemon (147.75.109.163:51712). Jan 13 20:42:30.481965 systemd-logind[1532]: Removed session 4. Jan 13 20:42:30.509379 sshd[1812]: Accepted publickey for core from 147.75.109.163 port 51712 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:42:30.510309 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:42:30.512975 systemd-logind[1532]: New session 5 of user core. Jan 13 20:42:30.524896 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 13 20:42:30.571301 sshd[1814]: Connection closed by 147.75.109.163 port 51712 Jan 13 20:42:30.571788 sshd-session[1812]: pam_unix(sshd:session): session closed for user core Jan 13 20:42:30.580190 systemd[1]: sshd@3-139.178.70.106:22-147.75.109.163:51712.service: Deactivated successfully. Jan 13 20:42:30.580985 systemd[1]: session-5.scope: Deactivated successfully. Jan 13 20:42:30.581702 systemd-logind[1532]: Session 5 logged out. Waiting for processes to exit. Jan 13 20:42:30.582395 systemd[1]: Started sshd@4-139.178.70.106:22-147.75.109.163:51720.service - OpenSSH per-connection server daemon (147.75.109.163:51720). Jan 13 20:42:30.584011 systemd-logind[1532]: Removed session 5. 
Jan 13 20:42:30.612327 sshd[1819]: Accepted publickey for core from 147.75.109.163 port 51720 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:42:30.613042 sshd-session[1819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:42:30.615564 systemd-logind[1532]: New session 6 of user core. Jan 13 20:42:30.625819 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 13 20:42:30.674478 sshd[1821]: Connection closed by 147.75.109.163 port 51720 Jan 13 20:42:30.674820 sshd-session[1819]: pam_unix(sshd:session): session closed for user core Jan 13 20:42:30.686100 systemd[1]: sshd@4-139.178.70.106:22-147.75.109.163:51720.service: Deactivated successfully. Jan 13 20:42:30.686829 systemd[1]: session-6.scope: Deactivated successfully. Jan 13 20:42:30.687507 systemd-logind[1532]: Session 6 logged out. Waiting for processes to exit. Jan 13 20:42:30.688204 systemd[1]: Started sshd@5-139.178.70.106:22-147.75.109.163:51736.service - OpenSSH per-connection server daemon (147.75.109.163:51736). Jan 13 20:42:30.689288 systemd-logind[1532]: Removed session 6. Jan 13 20:42:30.724325 sshd[1826]: Accepted publickey for core from 147.75.109.163 port 51736 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:42:30.725042 sshd-session[1826]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:42:30.727525 systemd-logind[1532]: New session 7 of user core. Jan 13 20:42:30.735835 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 13 20:42:30.841980 sudo[1829]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 13 20:42:30.842197 sudo[1829]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:42:30.858689 sudo[1829]: pam_unix(sudo:session): session closed for user root Jan 13 20:42:30.859580 sshd[1828]: Connection closed by 147.75.109.163 port 51736 Jan 13 20:42:30.860606 sshd-session[1826]: pam_unix(sshd:session): session closed for user core Jan 13 20:42:30.865497 systemd[1]: sshd@5-139.178.70.106:22-147.75.109.163:51736.service: Deactivated successfully. Jan 13 20:42:30.866540 systemd[1]: session-7.scope: Deactivated successfully. Jan 13 20:42:30.867571 systemd-logind[1532]: Session 7 logged out. Waiting for processes to exit. Jan 13 20:42:30.868365 systemd[1]: Started sshd@6-139.178.70.106:22-147.75.109.163:51748.service - OpenSSH per-connection server daemon (147.75.109.163:51748). Jan 13 20:42:30.870117 systemd-logind[1532]: Removed session 7. Jan 13 20:42:30.911405 sshd[1834]: Accepted publickey for core from 147.75.109.163 port 51748 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:42:30.912345 sshd-session[1834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:42:30.915433 systemd-logind[1532]: New session 8 of user core. Jan 13 20:42:30.924859 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 13 20:42:30.974828 sudo[1838]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 13 20:42:30.975038 sudo[1838]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:42:30.977545 sudo[1838]: pam_unix(sudo:session): session closed for user root Jan 13 20:42:30.981564 sudo[1837]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 13 20:42:30.981960 sudo[1837]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:42:30.993002 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 20:42:31.011356 augenrules[1860]: No rules Jan 13 20:42:31.012039 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 20:42:31.012165 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 20:42:31.013704 sudo[1837]: pam_unix(sudo:session): session closed for user root Jan 13 20:42:31.015185 sshd[1836]: Connection closed by 147.75.109.163 port 51748 Jan 13 20:42:31.015042 sshd-session[1834]: pam_unix(sshd:session): session closed for user core Jan 13 20:42:31.019021 systemd[1]: sshd@6-139.178.70.106:22-147.75.109.163:51748.service: Deactivated successfully. Jan 13 20:42:31.019804 systemd[1]: session-8.scope: Deactivated successfully. Jan 13 20:42:31.020168 systemd-logind[1532]: Session 8 logged out. Waiting for processes to exit. Jan 13 20:42:31.021102 systemd[1]: Started sshd@7-139.178.70.106:22-147.75.109.163:51764.service - OpenSSH per-connection server daemon (147.75.109.163:51764). Jan 13 20:42:31.022912 systemd-logind[1532]: Removed session 8. 
Jan 13 20:42:31.059920 sshd[1868]: Accepted publickey for core from 147.75.109.163 port 51764 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:42:31.060905 sshd-session[1868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:42:31.065070 systemd-logind[1532]: New session 9 of user core. Jan 13 20:42:31.066865 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 13 20:42:31.117965 sudo[1871]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 13 20:42:31.118192 sudo[1871]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:42:31.444881 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 13 20:42:31.444961 (dockerd)[1889]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 13 20:42:31.770820 dockerd[1889]: time="2025-01-13T20:42:31.770743576Z" level=info msg="Starting up" Jan 13 20:42:31.834087 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2858471500-merged.mount: Deactivated successfully. Jan 13 20:42:31.841427 systemd[1]: var-lib-docker-metacopy\x2dcheck2311369895-merged.mount: Deactivated successfully. Jan 13 20:42:31.852827 dockerd[1889]: time="2025-01-13T20:42:31.852801222Z" level=info msg="Loading containers: start." Jan 13 20:42:31.944776 kernel: Initializing XFRM netlink socket Jan 13 20:42:31.989841 systemd-networkd[1475]: docker0: Link UP Jan 13 20:42:32.009596 dockerd[1889]: time="2025-01-13T20:42:32.009559846Z" level=info msg="Loading containers: done." 
Jan 13 20:42:32.019333 dockerd[1889]: time="2025-01-13T20:42:32.019304885Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 13 20:42:32.019416 dockerd[1889]: time="2025-01-13T20:42:32.019374848Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1 Jan 13 20:42:32.019444 dockerd[1889]: time="2025-01-13T20:42:32.019433644Z" level=info msg="Daemon has completed initialization" Jan 13 20:42:32.033225 dockerd[1889]: time="2025-01-13T20:42:32.033085932Z" level=info msg="API listen on /run/docker.sock" Jan 13 20:42:32.033231 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 13 20:42:33.158589 containerd[1551]: time="2025-01-13T20:42:33.158335686Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.8\"" Jan 13 20:42:33.744188 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2983152479.mount: Deactivated successfully. 
Jan 13 20:42:34.754684 containerd[1551]: time="2025-01-13T20:42:34.754145221Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:34.755186 containerd[1551]: time="2025-01-13T20:42:34.755169631Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.8: active requests=0, bytes read=32675642" Jan 13 20:42:34.755484 containerd[1551]: time="2025-01-13T20:42:34.755472258Z" level=info msg="ImageCreate event name:\"sha256:772392d372035bf92e430e758ad0446146d82b7192358c8651252e4fb49c43dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:34.757396 containerd[1551]: time="2025-01-13T20:42:34.757383965Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:f0e1b3de0c2e98e6c6abd73edf9d3b8e4d44460656cde0ebb92e2d9206961fcb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:34.757800 containerd[1551]: time="2025-01-13T20:42:34.757789087Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.8\" with image id \"sha256:772392d372035bf92e430e758ad0446146d82b7192358c8651252e4fb49c43dd\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:f0e1b3de0c2e98e6c6abd73edf9d3b8e4d44460656cde0ebb92e2d9206961fcb\", size \"32672442\" in 1.59942758s" Jan 13 20:42:34.757849 containerd[1551]: time="2025-01-13T20:42:34.757841344Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.8\" returns image reference \"sha256:772392d372035bf92e430e758ad0446146d82b7192358c8651252e4fb49c43dd\"" Jan 13 20:42:34.770145 containerd[1551]: time="2025-01-13T20:42:34.770111903Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.8\"" Jan 13 20:42:35.122189 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Jan 13 20:42:35.128869 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:42:35.182592 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:42:35.185549 (kubelet)[2149]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:42:35.235964 kubelet[2149]: E0113 20:42:35.235889 2149 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:42:35.237261 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:42:35.237352 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:42:35.402194 update_engine[1535]: I20250113 20:42:35.401993 1535 update_attempter.cc:509] Updating boot flags... 
Jan 13 20:42:35.552505 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 44 scanned by (udev-worker) (2166) Jan 13 20:42:35.689811 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 44 scanned by (udev-worker) (2166) Jan 13 20:42:36.580941 containerd[1551]: time="2025-01-13T20:42:36.580897649Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:36.588259 containerd[1551]: time="2025-01-13T20:42:36.588211190Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.8: active requests=0, bytes read=29606409" Jan 13 20:42:36.593771 containerd[1551]: time="2025-01-13T20:42:36.593716740Z" level=info msg="ImageCreate event name:\"sha256:85333d41dd3ce32d8344280c6d533d4c8f66252e4c28e332a2322ba3837f7bd6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:36.599778 containerd[1551]: time="2025-01-13T20:42:36.599645604Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:124f66b7e877eb5a80a40503057299bb60e6a5f2130905f4e3293dabf194c397\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:36.600488 containerd[1551]: time="2025-01-13T20:42:36.600280371Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.8\" with image id \"sha256:85333d41dd3ce32d8344280c6d533d4c8f66252e4c28e332a2322ba3837f7bd6\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:124f66b7e877eb5a80a40503057299bb60e6a5f2130905f4e3293dabf194c397\", size \"31051521\" in 1.829994793s" Jan 13 20:42:36.600488 containerd[1551]: time="2025-01-13T20:42:36.600304641Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.8\" returns image reference \"sha256:85333d41dd3ce32d8344280c6d533d4c8f66252e4c28e332a2322ba3837f7bd6\"" Jan 13 20:42:36.619186 
containerd[1551]: time="2025-01-13T20:42:36.619165178Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.8\"" Jan 13 20:42:37.581289 containerd[1551]: time="2025-01-13T20:42:37.581255851Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:37.581933 containerd[1551]: time="2025-01-13T20:42:37.581904078Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.8: active requests=0, bytes read=17783035" Jan 13 20:42:37.582362 containerd[1551]: time="2025-01-13T20:42:37.582218658Z" level=info msg="ImageCreate event name:\"sha256:eb53b988d5e03f329b5fdba21cbbbae48e1619b199689e7448095b31843b2c43\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:37.583842 containerd[1551]: time="2025-01-13T20:42:37.583805389Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c8bdeac2590c99c1a77e33995423ddb6633ff90a82a2aa455442e0a8079ef8c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:37.584862 containerd[1551]: time="2025-01-13T20:42:37.584430057Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.8\" with image id \"sha256:eb53b988d5e03f329b5fdba21cbbbae48e1619b199689e7448095b31843b2c43\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c8bdeac2590c99c1a77e33995423ddb6633ff90a82a2aa455442e0a8079ef8c7\", size \"19228165\" in 965.242152ms" Jan 13 20:42:37.584862 containerd[1551]: time="2025-01-13T20:42:37.584449125Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.8\" returns image reference \"sha256:eb53b988d5e03f329b5fdba21cbbbae48e1619b199689e7448095b31843b2c43\"" Jan 13 20:42:37.599797 containerd[1551]: time="2025-01-13T20:42:37.599771318Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.8\"" Jan 13 20:42:39.395052 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount4230640689.mount: Deactivated successfully. Jan 13 20:42:39.678721 containerd[1551]: time="2025-01-13T20:42:39.678643776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:39.684239 containerd[1551]: time="2025-01-13T20:42:39.684206316Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.8: active requests=0, bytes read=29057470" Jan 13 20:42:39.691774 containerd[1551]: time="2025-01-13T20:42:39.691472398Z" level=info msg="ImageCreate event name:\"sha256:ce61fda67eb41cf09d2b984e7979e289b5042e3983ddfc67be678425632cc0d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:39.693951 containerd[1551]: time="2025-01-13T20:42:39.693924042Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:f6d6be9417e22af78905000ac4fd134896bacd2188ea63c7cac8edd7a5d7e9b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:39.694433 containerd[1551]: time="2025-01-13T20:42:39.694240537Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.8\" with image id \"sha256:ce61fda67eb41cf09d2b984e7979e289b5042e3983ddfc67be678425632cc0d2\", repo tag \"registry.k8s.io/kube-proxy:v1.30.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:f6d6be9417e22af78905000ac4fd134896bacd2188ea63c7cac8edd7a5d7e9b5\", size \"29056489\" in 2.094445269s" Jan 13 20:42:39.694433 containerd[1551]: time="2025-01-13T20:42:39.694258313Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.8\" returns image reference \"sha256:ce61fda67eb41cf09d2b984e7979e289b5042e3983ddfc67be678425632cc0d2\"" Jan 13 20:42:39.708959 containerd[1551]: time="2025-01-13T20:42:39.708890468Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Jan 13 20:42:40.248020 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount819057741.mount: Deactivated successfully. 
Jan 13 20:42:41.008869 containerd[1551]: time="2025-01-13T20:42:41.008540182Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:41.008869 containerd[1551]: time="2025-01-13T20:42:41.008829996Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" Jan 13 20:42:41.011927 containerd[1551]: time="2025-01-13T20:42:41.011907836Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:41.013778 containerd[1551]: time="2025-01-13T20:42:41.013759644Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:41.014374 containerd[1551]: time="2025-01-13T20:42:41.014352667Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.305420823s" Jan 13 20:42:41.014414 containerd[1551]: time="2025-01-13T20:42:41.014376991Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Jan 13 20:42:41.029677 containerd[1551]: time="2025-01-13T20:42:41.029653241Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Jan 13 20:42:41.591258 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4057817830.mount: Deactivated successfully. 
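containerd reports each pull's wall time as a Go-style duration ("in 1.59942758s", "in 965.242152ms"). To compare the pulls above on a common scale, a small parser (illustrative; the entries are abridged from the log) can normalize the mixed ms/s suffixes to seconds and sort:

```python
import re

# Abridged "Pulled image" messages from the containerd log above.
entries = [
    'Pulled image "registry.k8s.io/kube-apiserver:v1.30.8" in 1.59942758s',
    'Pulled image "registry.k8s.io/kube-scheduler:v1.30.8" in 965.242152ms',
    'Pulled image "registry.k8s.io/etcd:3.5.12-0" in 4.981702802s',
]

def to_seconds(d: str) -> float:
    """Normalize a Go duration with an 'ms' or 's' suffix to seconds."""
    return float(d[:-2]) / 1000.0 if d.endswith("ms") else float(d[:-1])

pulls = {}
for e in entries:
    m = re.search(r'Pulled image "([^"]+)" in ([\d.]+m?s)', e)
    if m:
        pulls[m.group(1)] = to_seconds(m.group(2))

# Print fastest to slowest pull.
for image, secs in sorted(pulls.items(), key=lambda kv: kv[1]):
    print(f"{secs:8.3f}s  {image}")
```

As the log shows, the large etcd image dominates (nearly 5 s), while the small pause image transfers in well under a second.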
Jan 13 20:42:41.593042 containerd[1551]: time="2025-01-13T20:42:41.593011699Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:41.594404 containerd[1551]: time="2025-01-13T20:42:41.594382319Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:41.595311 containerd[1551]: time="2025-01-13T20:42:41.594070833Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290" Jan 13 20:42:41.596073 containerd[1551]: time="2025-01-13T20:42:41.596049174Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:41.597232 containerd[1551]: time="2025-01-13T20:42:41.597210514Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 567.534694ms" Jan 13 20:42:41.597281 containerd[1551]: time="2025-01-13T20:42:41.597232216Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Jan 13 20:42:41.612436 containerd[1551]: time="2025-01-13T20:42:41.612410743Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Jan 13 20:42:42.045722 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2715102363.mount: Deactivated successfully. Jan 13 20:42:45.372280 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. 
Jan 13 20:42:45.379005 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:42:45.705424 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:42:45.708433 (kubelet)[2312]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:42:45.865863 kubelet[2312]: E0113 20:42:45.865825 2312 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:42:45.866971 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:42:45.867053 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:42:46.585492 containerd[1551]: time="2025-01-13T20:42:46.585460536Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:46.586187 containerd[1551]: time="2025-01-13T20:42:46.586118536Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238571" Jan 13 20:42:46.588646 containerd[1551]: time="2025-01-13T20:42:46.588605171Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:46.593466 containerd[1551]: time="2025-01-13T20:42:46.593427581Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:42:46.594239 containerd[1551]: time="2025-01-13T20:42:46.594135512Z" level=info msg="Pulled image 
\"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 4.981702802s" Jan 13 20:42:46.594239 containerd[1551]: time="2025-01-13T20:42:46.594159714Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Jan 13 20:42:48.517041 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:42:48.533135 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:42:48.549182 systemd[1]: Reloading requested from client PID 2388 ('systemctl') (unit session-9.scope)... Jan 13 20:42:48.549193 systemd[1]: Reloading... Jan 13 20:42:48.610749 zram_generator::config[2425]: No configuration found. Jan 13 20:42:48.674518 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:42:48.690863 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:42:48.737218 systemd[1]: Reloading finished in 187 ms. Jan 13 20:42:48.774985 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 13 20:42:48.775049 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 13 20:42:48.775202 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:42:48.778952 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:42:49.001326 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:42:49.005827 (kubelet)[2492]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 20:42:49.045091 kubelet[2492]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:42:49.045091 kubelet[2492]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 20:42:49.045091 kubelet[2492]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:42:49.058667 kubelet[2492]: I0113 20:42:49.058627 2492 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 20:42:49.206901 kubelet[2492]: I0113 20:42:49.206878 2492 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Jan 13 20:42:49.206901 kubelet[2492]: I0113 20:42:49.206900 2492 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 20:42:49.207064 kubelet[2492]: I0113 20:42:49.207053 2492 server.go:927] "Client rotation is on, will bootstrap in background" Jan 13 20:42:49.222727 kubelet[2492]: I0113 20:42:49.222701 2492 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:42:49.227264 kubelet[2492]: E0113 20:42:49.227226 2492 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
"https://139.178.70.106:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.243421 kubelet[2492]: I0113 20:42:49.243394 2492 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 13 20:42:49.245318 kubelet[2492]: I0113 20:42:49.245290 2492 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 20:42:49.246701 kubelet[2492]: I0113 20:42:49.245320 2492 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodP
idsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 13 20:42:49.246795 kubelet[2492]: I0113 20:42:49.246711 2492 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 20:42:49.246795 kubelet[2492]: I0113 20:42:49.246719 2492 container_manager_linux.go:301] "Creating device plugin manager" Jan 13 20:42:49.246868 kubelet[2492]: I0113 20:42:49.246822 2492 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:42:49.247570 kubelet[2492]: I0113 20:42:49.247558 2492 kubelet.go:400] "Attempting to sync node with API server" Jan 13 20:42:49.247570 kubelet[2492]: I0113 20:42:49.247571 2492 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 20:42:49.247629 kubelet[2492]: I0113 20:42:49.247585 2492 kubelet.go:312] "Adding apiserver pod source" Jan 13 20:42:49.247629 kubelet[2492]: I0113 20:42:49.247598 2492 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 20:42:49.251470 kubelet[2492]: W0113 20:42:49.251114 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.251470 kubelet[2492]: E0113 20:42:49.251154 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.251470 kubelet[2492]: W0113 20:42:49.251291 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.251470 
kubelet[2492]: E0113 20:42:49.251311 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.251624 kubelet[2492]: I0113 20:42:49.251615 2492 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 20:42:49.252987 kubelet[2492]: I0113 20:42:49.252977 2492 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 20:42:49.253064 kubelet[2492]: W0113 20:42:49.253057 2492 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 13 20:42:49.253598 kubelet[2492]: I0113 20:42:49.253589 2492 server.go:1264] "Started kubelet" Jan 13 20:42:49.261301 kubelet[2492]: I0113 20:42:49.261286 2492 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 20:42:49.264835 kubelet[2492]: E0113 20:42:49.264645 2492 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.106:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.106:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181a5b4791e9ca24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 20:42:49.253571108 +0000 UTC m=+0.245197027,LastTimestamp:2025-01-13 20:42:49.253571108 +0000 UTC m=+0.245197027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 
20:42:49.264835 kubelet[2492]: I0113 20:42:49.261510 2492 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 20:42:49.265040 kubelet[2492]: I0113 20:42:49.265027 2492 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 20:42:49.265585 kubelet[2492]: I0113 20:42:49.265256 2492 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 13 20:42:49.265585 kubelet[2492]: I0113 20:42:49.261474 2492 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 20:42:49.266678 kubelet[2492]: I0113 20:42:49.266563 2492 server.go:455] "Adding debug handlers to kubelet server" Jan 13 20:42:49.267353 kubelet[2492]: I0113 20:42:49.267084 2492 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Jan 13 20:42:49.271441 kubelet[2492]: I0113 20:42:49.271420 2492 reconciler.go:26] "Reconciler: start to sync state" Jan 13 20:42:49.276768 kubelet[2492]: W0113 20:42:49.276260 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.276768 kubelet[2492]: E0113 20:42:49.276293 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.276768 kubelet[2492]: E0113 20:42:49.276477 2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="200ms" Jan 13 20:42:49.277051 kubelet[2492]: 
I0113 20:42:49.277042 2492 factory.go:221] Registration of the systemd container factory successfully Jan 13 20:42:49.277136 kubelet[2492]: I0113 20:42:49.277126 2492 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 20:42:49.278114 kubelet[2492]: E0113 20:42:49.278105 2492 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 20:42:49.278243 kubelet[2492]: I0113 20:42:49.278235 2492 factory.go:221] Registration of the containerd container factory successfully Jan 13 20:42:49.283906 kubelet[2492]: I0113 20:42:49.283878 2492 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 20:42:49.284476 kubelet[2492]: I0113 20:42:49.284461 2492 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 13 20:42:49.284509 kubelet[2492]: I0113 20:42:49.284479 2492 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 20:42:49.284509 kubelet[2492]: I0113 20:42:49.284490 2492 kubelet.go:2337] "Starting kubelet main sync loop" Jan 13 20:42:49.284547 kubelet[2492]: E0113 20:42:49.284516 2492 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 20:42:49.288442 kubelet[2492]: W0113 20:42:49.288356 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.288442 kubelet[2492]: E0113 20:42:49.288389 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
"https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:49.301039 kubelet[2492]: I0113 20:42:49.300902 2492 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 20:42:49.301039 kubelet[2492]: I0113 20:42:49.300912 2492 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 20:42:49.301039 kubelet[2492]: I0113 20:42:49.300921 2492 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:42:49.302728 kubelet[2492]: I0113 20:42:49.302680 2492 policy_none.go:49] "None policy: Start" Jan 13 20:42:49.302989 kubelet[2492]: I0113 20:42:49.302981 2492 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 20:42:49.303045 kubelet[2492]: I0113 20:42:49.303023 2492 state_mem.go:35] "Initializing new in-memory state store" Jan 13 20:42:49.306346 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 13 20:42:49.318202 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 13 20:42:49.321028 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 13 20:42:49.327375 kubelet[2492]: I0113 20:42:49.327354 2492 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 20:42:49.327500 kubelet[2492]: I0113 20:42:49.327476 2492 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 13 20:42:49.327560 kubelet[2492]: I0113 20:42:49.327551 2492 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 20:42:49.328747 kubelet[2492]: E0113 20:42:49.328698 2492 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jan 13 20:42:49.370619 kubelet[2492]: I0113 20:42:49.370586 2492 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:42:49.370821 kubelet[2492]: E0113 20:42:49.370803 2492 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Jan 13 20:42:49.385240 kubelet[2492]: I0113 20:42:49.385165 2492 topology_manager.go:215] "Topology Admit Handler" podUID="a50886d4f9275a12962d368e55f2cbf4" podNamespace="kube-system" podName="kube-apiserver-localhost" Jan 13 20:42:49.386641 kubelet[2492]: I0113 20:42:49.385946 2492 topology_manager.go:215] "Topology Admit Handler" podUID="8a50003978138b3ab9890682eff4eae8" podNamespace="kube-system" podName="kube-controller-manager-localhost" Jan 13 20:42:49.387146 kubelet[2492]: I0113 20:42:49.386835 2492 topology_manager.go:215] "Topology Admit Handler" podUID="b107a98bcf27297d642d248711a3fc70" podNamespace="kube-system" podName="kube-scheduler-localhost" Jan 13 20:42:49.391441 systemd[1]: Created slice kubepods-burstable-poda50886d4f9275a12962d368e55f2cbf4.slice - libcontainer container kubepods-burstable-poda50886d4f9275a12962d368e55f2cbf4.slice. 
Jan 13 20:42:49.397135 systemd[1]: Created slice kubepods-burstable-pod8a50003978138b3ab9890682eff4eae8.slice - libcontainer container kubepods-burstable-pod8a50003978138b3ab9890682eff4eae8.slice. Jan 13 20:42:49.401072 systemd[1]: Created slice kubepods-burstable-podb107a98bcf27297d642d248711a3fc70.slice - libcontainer container kubepods-burstable-podb107a98bcf27297d642d248711a3fc70.slice. Jan 13 20:42:49.472796 kubelet[2492]: I0113 20:42:49.472772 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a50886d4f9275a12962d368e55f2cbf4-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"a50886d4f9275a12962d368e55f2cbf4\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:42:49.473038 kubelet[2492]: I0113 20:42:49.472918 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:42:49.473038 kubelet[2492]: I0113 20:42:49.472934 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:42:49.473038 kubelet[2492]: I0113 20:42:49.472945 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " 
pod="kube-system/kube-controller-manager-localhost" Jan 13 20:42:49.473038 kubelet[2492]: I0113 20:42:49.472954 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a50886d4f9275a12962d368e55f2cbf4-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"a50886d4f9275a12962d368e55f2cbf4\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:42:49.473038 kubelet[2492]: I0113 20:42:49.472966 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:42:49.473143 kubelet[2492]: I0113 20:42:49.473001 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:42:49.473143 kubelet[2492]: I0113 20:42:49.473011 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b107a98bcf27297d642d248711a3fc70-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"b107a98bcf27297d642d248711a3fc70\") " pod="kube-system/kube-scheduler-localhost" Jan 13 20:42:49.473143 kubelet[2492]: I0113 20:42:49.473019 2492 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a50886d4f9275a12962d368e55f2cbf4-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"a50886d4f9275a12962d368e55f2cbf4\") " 
pod="kube-system/kube-apiserver-localhost" Jan 13 20:42:49.477252 kubelet[2492]: E0113 20:42:49.477220 2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="400ms" Jan 13 20:42:49.572419 kubelet[2492]: I0113 20:42:49.572348 2492 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:42:49.573032 kubelet[2492]: E0113 20:42:49.572557 2492 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Jan 13 20:42:49.696031 containerd[1551]: time="2025-01-13T20:42:49.695999624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:a50886d4f9275a12962d368e55f2cbf4,Namespace:kube-system,Attempt:0,}" Jan 13 20:42:49.700675 containerd[1551]: time="2025-01-13T20:42:49.700557988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8a50003978138b3ab9890682eff4eae8,Namespace:kube-system,Attempt:0,}" Jan 13 20:42:49.703098 containerd[1551]: time="2025-01-13T20:42:49.703037901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:b107a98bcf27297d642d248711a3fc70,Namespace:kube-system,Attempt:0,}" Jan 13 20:42:49.878508 kubelet[2492]: E0113 20:42:49.878480 2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="800ms" Jan 13 20:42:49.973627 kubelet[2492]: I0113 20:42:49.973602 2492 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:42:49.974277 kubelet[2492]: 
E0113 20:42:49.974098 2492 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Jan 13 20:42:50.217363 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3384434051.mount: Deactivated successfully. Jan 13 20:42:50.219634 containerd[1551]: time="2025-01-13T20:42:50.219188938Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:42:50.220000 containerd[1551]: time="2025-01-13T20:42:50.219966275Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Jan 13 20:42:50.220524 containerd[1551]: time="2025-01-13T20:42:50.220512564Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:42:50.221868 containerd[1551]: time="2025-01-13T20:42:50.221852193Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:42:50.222771 containerd[1551]: time="2025-01-13T20:42:50.222170710Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:42:50.223756 containerd[1551]: time="2025-01-13T20:42:50.223722363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:42:50.224154 containerd[1551]: time="2025-01-13T20:42:50.224138068Z" level=info msg="stop pulling 
image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:42:50.225244 containerd[1551]: time="2025-01-13T20:42:50.225222517Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 529.159273ms" Jan 13 20:42:50.225597 containerd[1551]: time="2025-01-13T20:42:50.225579946Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:42:50.227701 containerd[1551]: time="2025-01-13T20:42:50.226278344Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 525.477261ms" Jan 13 20:42:50.231292 containerd[1551]: time="2025-01-13T20:42:50.227213714Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 524.087199ms" Jan 13 20:42:50.350080 kubelet[2492]: W0113 20:42:50.349957 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:50.357783 kubelet[2492]: E0113 20:42:50.357688 2492 reflector.go:150] 
k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 13 20:42:50.361572 containerd[1551]: time="2025-01-13T20:42:50.361320226Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:42:50.361572 containerd[1551]: time="2025-01-13T20:42:50.361358717Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:42:50.361572 containerd[1551]: time="2025-01-13T20:42:50.361371322Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:42:50.361572 containerd[1551]: time="2025-01-13T20:42:50.361424867Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:42:50.365092 containerd[1551]: time="2025-01-13T20:42:50.363564835Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:42:50.365092 containerd[1551]: time="2025-01-13T20:42:50.363790851Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:42:50.365092 containerd[1551]: time="2025-01-13T20:42:50.363805067Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:42:50.365092 containerd[1551]: time="2025-01-13T20:42:50.363859249Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:42:50.367841 containerd[1551]: time="2025-01-13T20:42:50.367721093Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:42:50.367841 containerd[1551]: time="2025-01-13T20:42:50.367802560Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:42:50.367841 containerd[1551]: time="2025-01-13T20:42:50.367826901Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:42:50.368127 containerd[1551]: time="2025-01-13T20:42:50.367984136Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:42:50.393849 systemd[1]: Started cri-containerd-08cf80b2458f75217a8a023eb7536395450b29202ac6848aa69a7effd3445ea0.scope - libcontainer container 08cf80b2458f75217a8a023eb7536395450b29202ac6848aa69a7effd3445ea0.
Jan 13 20:42:50.394849 systemd[1]: Started cri-containerd-b4c00423b0a6ec97521babd251d77b0fad18efb01ac6e2a60e787e3f34943a19.scope - libcontainer container b4c00423b0a6ec97521babd251d77b0fad18efb01ac6e2a60e787e3f34943a19.
Jan 13 20:42:50.396016 systemd[1]: Started cri-containerd-ffa5d14475c3f115583de419401a7be54a31ebc5dc82447f75176deeea847eeb.scope - libcontainer container ffa5d14475c3f115583de419401a7be54a31ebc5dc82447f75176deeea847eeb.
Jan 13 20:42:50.408651 kubelet[2492]: W0113 20:42:50.408559 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:50.408651 kubelet[2492]: E0113 20:42:50.408602 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:50.434028 containerd[1551]: time="2025-01-13T20:42:50.434006768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:b107a98bcf27297d642d248711a3fc70,Namespace:kube-system,Attempt:0,} returns sandbox id \"ffa5d14475c3f115583de419401a7be54a31ebc5dc82447f75176deeea847eeb\""
Jan 13 20:42:50.437999 containerd[1551]: time="2025-01-13T20:42:50.437879140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:a50886d4f9275a12962d368e55f2cbf4,Namespace:kube-system,Attempt:0,} returns sandbox id \"b4c00423b0a6ec97521babd251d77b0fad18efb01ac6e2a60e787e3f34943a19\""
Jan 13 20:42:50.438702 containerd[1551]: time="2025-01-13T20:42:50.438538565Z" level=info msg="CreateContainer within sandbox \"ffa5d14475c3f115583de419401a7be54a31ebc5dc82447f75176deeea847eeb\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Jan 13 20:42:50.451925 containerd[1551]: time="2025-01-13T20:42:50.451867169Z" level=info msg="CreateContainer within sandbox \"b4c00423b0a6ec97521babd251d77b0fad18efb01ac6e2a60e787e3f34943a19\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Jan 13 20:42:50.472779 containerd[1551]: time="2025-01-13T20:42:50.472189202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8a50003978138b3ab9890682eff4eae8,Namespace:kube-system,Attempt:0,} returns sandbox id \"08cf80b2458f75217a8a023eb7536395450b29202ac6848aa69a7effd3445ea0\""
Jan 13 20:42:50.474213 containerd[1551]: time="2025-01-13T20:42:50.474194103Z" level=info msg="CreateContainer within sandbox \"08cf80b2458f75217a8a023eb7536395450b29202ac6848aa69a7effd3445ea0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Jan 13 20:42:50.526521 containerd[1551]: time="2025-01-13T20:42:50.526425368Z" level=info msg="CreateContainer within sandbox \"ffa5d14475c3f115583de419401a7be54a31ebc5dc82447f75176deeea847eeb\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8e149927235b0de65d02476c24538e1fbfc89e7330da2907632aadee722a04f2\""
Jan 13 20:42:50.527430 containerd[1551]: time="2025-01-13T20:42:50.527411842Z" level=info msg="StartContainer for \"8e149927235b0de65d02476c24538e1fbfc89e7330da2907632aadee722a04f2\""
Jan 13 20:42:50.527997 containerd[1551]: time="2025-01-13T20:42:50.527838185Z" level=info msg="CreateContainer within sandbox \"08cf80b2458f75217a8a023eb7536395450b29202ac6848aa69a7effd3445ea0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8a5ef9406507c6f8ff223d5bde80534e72aa601e5867823b0cdc6dffc6994daf\""
Jan 13 20:42:50.528614 containerd[1551]: time="2025-01-13T20:42:50.528090104Z" level=info msg="StartContainer for \"8a5ef9406507c6f8ff223d5bde80534e72aa601e5867823b0cdc6dffc6994daf\""
Jan 13 20:42:50.530963 containerd[1551]: time="2025-01-13T20:42:50.530944432Z" level=info msg="CreateContainer within sandbox \"b4c00423b0a6ec97521babd251d77b0fad18efb01ac6e2a60e787e3f34943a19\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"96aef3461a9b5f56ce0ab3509a267c42549f43b9eb41ddb869930bc937bdd4a7\""
Jan 13 20:42:50.531355 containerd[1551]: time="2025-01-13T20:42:50.531343644Z" level=info msg="StartContainer for \"96aef3461a9b5f56ce0ab3509a267c42549f43b9eb41ddb869930bc937bdd4a7\""
Jan 13 20:42:50.549415 systemd[1]: Started cri-containerd-8e149927235b0de65d02476c24538e1fbfc89e7330da2907632aadee722a04f2.scope - libcontainer container 8e149927235b0de65d02476c24538e1fbfc89e7330da2907632aadee722a04f2.
Jan 13 20:42:50.564926 systemd[1]: Started cri-containerd-96aef3461a9b5f56ce0ab3509a267c42549f43b9eb41ddb869930bc937bdd4a7.scope - libcontainer container 96aef3461a9b5f56ce0ab3509a267c42549f43b9eb41ddb869930bc937bdd4a7.
Jan 13 20:42:50.567326 systemd[1]: Started cri-containerd-8a5ef9406507c6f8ff223d5bde80534e72aa601e5867823b0cdc6dffc6994daf.scope - libcontainer container 8a5ef9406507c6f8ff223d5bde80534e72aa601e5867823b0cdc6dffc6994daf.
Jan 13 20:42:50.604126 containerd[1551]: time="2025-01-13T20:42:50.603838505Z" level=info msg="StartContainer for \"96aef3461a9b5f56ce0ab3509a267c42549f43b9eb41ddb869930bc937bdd4a7\" returns successfully"
Jan 13 20:42:50.617370 containerd[1551]: time="2025-01-13T20:42:50.617292346Z" level=info msg="StartContainer for \"8e149927235b0de65d02476c24538e1fbfc89e7330da2907632aadee722a04f2\" returns successfully"
Jan 13 20:42:50.620165 containerd[1551]: time="2025-01-13T20:42:50.620098785Z" level=info msg="StartContainer for \"8a5ef9406507c6f8ff223d5bde80534e72aa601e5867823b0cdc6dffc6994daf\" returns successfully"
Jan 13 20:42:50.679068 kubelet[2492]: E0113 20:42:50.679030 2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="1.6s"
Jan 13 20:42:50.732575 kubelet[2492]: W0113 20:42:50.732477 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:50.732575 kubelet[2492]: E0113 20:42:50.732517 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:50.777866 kubelet[2492]: I0113 20:42:50.775687 2492 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Jan 13 20:42:50.778015 kubelet[2492]: E0113 20:42:50.777938 2492 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost"
Jan 13 20:42:50.802501 kubelet[2492]: W0113 20:42:50.802434 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:50.802501 kubelet[2492]: E0113 20:42:50.802487 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:51.417726 kubelet[2492]: E0113 20:42:51.417701 2492 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.106:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:52.246184 kubelet[2492]: W0113 20:42:52.246158 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:52.246184 kubelet[2492]: E0113 20:42:52.246187 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:52.279558 kubelet[2492]: E0113 20:42:52.279527 2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="3.2s"
Jan 13 20:42:52.379384 kubelet[2492]: I0113 20:42:52.379358 2492 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Jan 13 20:42:52.379602 kubelet[2492]: E0113 20:42:52.379584 2492 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost"
Jan 13 20:42:52.560679 kubelet[2492]: W0113 20:42:52.560611 2492 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:52.560679 kubelet[2492]: E0113 20:42:52.560638 2492 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 13 20:42:54.189925 kubelet[2492]: E0113 20:42:54.189896 2492 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Jan 13 20:42:54.549313 kubelet[2492]: E0113 20:42:54.549095 2492 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Jan 13 20:42:55.025238 kubelet[2492]: E0113 20:42:55.025217 2492 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Jan 13 20:42:55.482997 kubelet[2492]: E0113 20:42:55.482978 2492 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Jan 13 20:42:55.581152 kubelet[2492]: I0113 20:42:55.580887 2492 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Jan 13 20:42:55.586891 kubelet[2492]: I0113 20:42:55.586354 2492 kubelet_node_status.go:76] "Successfully registered node" node="localhost"
Jan 13 20:42:55.591008 kubelet[2492]: E0113 20:42:55.590988 2492 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 13 20:42:55.691382 kubelet[2492]: E0113 20:42:55.691346 2492 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 13 20:42:55.792232 kubelet[2492]: E0113 20:42:55.792126 2492 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 13 20:42:55.892808 kubelet[2492]: E0113 20:42:55.892770 2492 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 13 20:42:55.993271 kubelet[2492]: E0113 20:42:55.993242 2492 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 13 20:42:56.015101 systemd[1]: Reloading requested from client PID 2772 ('systemctl') (unit session-9.scope)...
Jan 13 20:42:56.015317 systemd[1]: Reloading...
Jan 13 20:42:56.078782 zram_generator::config[2813]: No configuration found.
Jan 13 20:42:56.094092 kubelet[2492]: E0113 20:42:56.094059 2492 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 13 20:42:56.143271 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 13 20:42:56.159342 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 20:42:56.194636 kubelet[2492]: E0113 20:42:56.194610 2492 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 13 20:42:56.214637 systemd[1]: Reloading finished in 199 ms.
Jan 13 20:42:56.234686 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:42:56.249535 systemd[1]: kubelet.service: Deactivated successfully.
Jan 13 20:42:56.249679 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:42:56.254120 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:42:56.418519 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:42:56.423622 (kubelet)[2877]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 13 20:42:56.537293 kubelet[2877]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 13 20:42:56.537293 kubelet[2877]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 13 20:42:56.537293 kubelet[2877]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 13 20:42:56.538255 kubelet[2877]: I0113 20:42:56.538231 2877 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 13 20:42:56.541419 kubelet[2877]: I0113 20:42:56.540835 2877 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Jan 13 20:42:56.541419 kubelet[2877]: I0113 20:42:56.540847 2877 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 13 20:42:56.541419 kubelet[2877]: I0113 20:42:56.540973 2877 server.go:927] "Client rotation is on, will bootstrap in background"
Jan 13 20:42:56.542669 kubelet[2877]: I0113 20:42:56.542653 2877 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 13 20:42:56.543894 kubelet[2877]: I0113 20:42:56.543883 2877 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 13 20:42:56.547202 kubelet[2877]: I0113 20:42:56.547184 2877 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jan 13 20:42:56.547343 kubelet[2877]: I0113 20:42:56.547320 2877 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 13 20:42:56.547454 kubelet[2877]: I0113 20:42:56.547343 2877 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Jan 13 20:42:56.547508 kubelet[2877]: I0113 20:42:56.547461 2877 topology_manager.go:138] "Creating topology manager with none policy"
Jan 13 20:42:56.547508 kubelet[2877]: I0113 20:42:56.547471 2877 container_manager_linux.go:301] "Creating device plugin manager"
Jan 13 20:42:56.547546 kubelet[2877]: I0113 20:42:56.547509 2877 state_mem.go:36] "Initialized new in-memory state store"
Jan 13 20:42:56.547581 kubelet[2877]: I0113 20:42:56.547571 2877 kubelet.go:400] "Attempting to sync node with API server"
Jan 13 20:42:56.547603 kubelet[2877]: I0113 20:42:56.547581 2877 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 13 20:42:56.547603 kubelet[2877]: I0113 20:42:56.547594 2877 kubelet.go:312] "Adding apiserver pod source"
Jan 13 20:42:56.549727 kubelet[2877]: I0113 20:42:56.547604 2877 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 13 20:42:56.552915 kubelet[2877]: I0113 20:42:56.552658 2877 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Jan 13 20:42:56.553216 kubelet[2877]: I0113 20:42:56.553199 2877 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 13 20:42:56.553989 kubelet[2877]: I0113 20:42:56.553467 2877 server.go:1264] "Started kubelet"
Jan 13 20:42:56.555130 kubelet[2877]: I0113 20:42:56.555115 2877 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 13 20:42:56.559491 kubelet[2877]: I0113 20:42:56.556560 2877 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jan 13 20:42:56.559491 kubelet[2877]: I0113 20:42:56.557214 2877 server.go:455] "Adding debug handlers to kubelet server"
Jan 13 20:42:56.559491 kubelet[2877]: I0113 20:42:56.557672 2877 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 13 20:42:56.559491 kubelet[2877]: I0113 20:42:56.557799 2877 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 13 20:42:56.559640 kubelet[2877]: I0113 20:42:56.559605 2877 volume_manager.go:291] "Starting Kubelet Volume Manager"
Jan 13 20:42:56.560753 kubelet[2877]: I0113 20:42:56.560728 2877 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Jan 13 20:42:56.560826 kubelet[2877]: I0113 20:42:56.560816 2877 reconciler.go:26] "Reconciler: start to sync state"
Jan 13 20:42:56.564370 kubelet[2877]: I0113 20:42:56.563948 2877 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 13 20:42:56.565806 kubelet[2877]: I0113 20:42:56.565682 2877 factory.go:221] Registration of the systemd container factory successfully
Jan 13 20:42:56.566569 kubelet[2877]: I0113 20:42:56.566548 2877 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 13 20:42:56.567007 kubelet[2877]: I0113 20:42:56.566821 2877 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 13 20:42:56.567007 kubelet[2877]: I0113 20:42:56.566840 2877 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 13 20:42:56.567007 kubelet[2877]: I0113 20:42:56.566853 2877 kubelet.go:2337] "Starting kubelet main sync loop"
Jan 13 20:42:56.567007 kubelet[2877]: E0113 20:42:56.566884 2877 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 13 20:42:56.568675 kubelet[2877]: I0113 20:42:56.568664 2877 factory.go:221] Registration of the containerd container factory successfully
Jan 13 20:42:56.572772 kubelet[2877]: E0113 20:42:56.572758 2877 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jan 13 20:42:56.611475 kubelet[2877]: I0113 20:42:56.611457 2877 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jan 13 20:42:56.611475 kubelet[2877]: I0113 20:42:56.611467 2877 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jan 13 20:42:56.611475 kubelet[2877]: I0113 20:42:56.611479 2877 state_mem.go:36] "Initialized new in-memory state store"
Jan 13 20:42:56.611618 kubelet[2877]: I0113 20:42:56.611600 2877 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jan 13 20:42:56.611618 kubelet[2877]: I0113 20:42:56.611607 2877 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jan 13 20:42:56.611618 kubelet[2877]: I0113 20:42:56.611618 2877 policy_none.go:49] "None policy: Start"
Jan 13 20:42:56.612004 kubelet[2877]: I0113 20:42:56.611995 2877 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 13 20:42:56.612099 kubelet[2877]: I0113 20:42:56.612094 2877 state_mem.go:35] "Initializing new in-memory state store"
Jan 13 20:42:56.612223 kubelet[2877]: I0113 20:42:56.612215 2877 state_mem.go:75] "Updated machine memory state"
Jan 13 20:42:56.614839 kubelet[2877]: I0113 20:42:56.614818 2877 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 13 20:42:56.615108 kubelet[2877]: I0113 20:42:56.615086 2877 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 13 20:42:56.615156 kubelet[2877]: I0113 20:42:56.615146 2877 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 13 20:42:56.661159 kubelet[2877]: I0113 20:42:56.661087 2877 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Jan 13 20:42:56.666503 kubelet[2877]: I0113 20:42:56.666308 2877 kubelet_node_status.go:112] "Node was previously registered" node="localhost"
Jan 13 20:42:56.666503 kubelet[2877]: I0113 20:42:56.666392 2877 kubelet_node_status.go:76] "Successfully registered node" node="localhost"
Jan 13 20:42:56.666950 kubelet[2877]: I0113 20:42:56.666924 2877 topology_manager.go:215] "Topology Admit Handler" podUID="a50886d4f9275a12962d368e55f2cbf4" podNamespace="kube-system" podName="kube-apiserver-localhost"
Jan 13 20:42:56.667771 kubelet[2877]: I0113 20:42:56.667008 2877 topology_manager.go:215] "Topology Admit Handler" podUID="8a50003978138b3ab9890682eff4eae8" podNamespace="kube-system" podName="kube-controller-manager-localhost"
Jan 13 20:42:56.667771 kubelet[2877]: I0113 20:42:56.667063 2877 topology_manager.go:215] "Topology Admit Handler" podUID="b107a98bcf27297d642d248711a3fc70" podNamespace="kube-system" podName="kube-scheduler-localhost"
Jan 13 20:42:56.862010 kubelet[2877]: I0113 20:42:56.861905 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a50886d4f9275a12962d368e55f2cbf4-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"a50886d4f9275a12962d368e55f2cbf4\") " pod="kube-system/kube-apiserver-localhost"
Jan 13 20:42:56.862010 kubelet[2877]: I0113 20:42:56.861972 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost"
Jan 13 20:42:56.862010 kubelet[2877]: I0113 20:42:56.861989 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost"
Jan 13 20:42:56.862010 kubelet[2877]: I0113 20:42:56.861999 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost"
Jan 13 20:42:56.862010 kubelet[2877]: I0113 20:42:56.862009 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost"
Jan 13 20:42:56.862154 kubelet[2877]: I0113 20:42:56.862019 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8a50003978138b3ab9890682eff4eae8-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8a50003978138b3ab9890682eff4eae8\") " pod="kube-system/kube-controller-manager-localhost"
Jan 13 20:42:56.862154 kubelet[2877]: I0113 20:42:56.862027 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b107a98bcf27297d642d248711a3fc70-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"b107a98bcf27297d642d248711a3fc70\") " pod="kube-system/kube-scheduler-localhost"
Jan 13 20:42:56.862154 kubelet[2877]: I0113 20:42:56.862035 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a50886d4f9275a12962d368e55f2cbf4-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"a50886d4f9275a12962d368e55f2cbf4\") " pod="kube-system/kube-apiserver-localhost"
Jan 13 20:42:56.862154 kubelet[2877]: I0113 20:42:56.862045 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a50886d4f9275a12962d368e55f2cbf4-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"a50886d4f9275a12962d368e55f2cbf4\") " pod="kube-system/kube-apiserver-localhost"
Jan 13 20:42:57.548270 kubelet[2877]: I0113 20:42:57.548242 2877 apiserver.go:52] "Watching apiserver"
Jan 13 20:42:57.561611 kubelet[2877]: I0113 20:42:57.561593 2877 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Jan 13 20:42:57.603353 kubelet[2877]: E0113 20:42:57.602988 2877 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Jan 13 20:42:57.617749 kubelet[2877]: I0113 20:42:57.617532 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.617518707 podStartE2EDuration="1.617518707s" podCreationTimestamp="2025-01-13 20:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:42:57.613378597 +0000 UTC m=+1.100420535" watchObservedRunningTime="2025-01-13 20:42:57.617518707 +0000 UTC m=+1.104560640"
Jan 13 20:42:57.622194 kubelet[2877]: I0113 20:42:57.621987 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.62197449 podStartE2EDuration="1.62197449s" podCreationTimestamp="2025-01-13 20:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:42:57.617923563 +0000 UTC m=+1.104965505" watchObservedRunningTime="2025-01-13 20:42:57.62197449 +0000 UTC m=+1.109016426"
Jan 13 20:42:57.627235 kubelet[2877]: I0113 20:42:57.627019 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.627006324 podStartE2EDuration="1.627006324s" podCreationTimestamp="2025-01-13 20:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:42:57.622118486 +0000 UTC m=+1.109160428" watchObservedRunningTime="2025-01-13 20:42:57.627006324 +0000 UTC m=+1.114048265"
Jan 13 20:43:01.536953 sudo[1871]: pam_unix(sudo:session): session closed for user root
Jan 13 20:43:01.538452 sshd[1870]: Connection closed by 147.75.109.163 port 51764
Jan 13 20:43:01.538879 sshd-session[1868]: pam_unix(sshd:session): session closed for user core
Jan 13 20:43:01.540722 systemd[1]: sshd@7-139.178.70.106:22-147.75.109.163:51764.service: Deactivated successfully.
Jan 13 20:43:01.542029 systemd[1]: session-9.scope: Deactivated successfully.
Jan 13 20:43:01.542177 systemd[1]: session-9.scope: Consumed 2.794s CPU time, 184.0M memory peak, 0B memory swap peak.
Jan 13 20:43:01.543071 systemd-logind[1532]: Session 9 logged out. Waiting for processes to exit.
Jan 13 20:43:01.543702 systemd-logind[1532]: Removed session 9.
Jan 13 20:43:11.454153 kubelet[2877]: I0113 20:43:11.454034 2877 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jan 13 20:43:11.454402 containerd[1551]: time="2025-01-13T20:43:11.454287278Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jan 13 20:43:11.454527 kubelet[2877]: I0113 20:43:11.454427 2877 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Jan 13 20:43:11.964701 kubelet[2877]: I0113 20:43:11.964580 2877 topology_manager.go:215] "Topology Admit Handler" podUID="223e9b5b-8eed-4f85-a368-15cdf253df06" podNamespace="kube-system" podName="kube-proxy-cbd9t"
Jan 13 20:43:11.970951 systemd[1]: Created slice kubepods-besteffort-pod223e9b5b_8eed_4f85_a368_15cdf253df06.slice - libcontainer container kubepods-besteffort-pod223e9b5b_8eed_4f85_a368_15cdf253df06.slice.
Jan 13 20:43:12.060069 kubelet[2877]: I0113 20:43:12.060023 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/223e9b5b-8eed-4f85-a368-15cdf253df06-xtables-lock\") pod \"kube-proxy-cbd9t\" (UID: \"223e9b5b-8eed-4f85-a368-15cdf253df06\") " pod="kube-system/kube-proxy-cbd9t"
Jan 13 20:43:12.060069 kubelet[2877]: I0113 20:43:12.060052 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/223e9b5b-8eed-4f85-a368-15cdf253df06-lib-modules\") pod \"kube-proxy-cbd9t\" (UID: \"223e9b5b-8eed-4f85-a368-15cdf253df06\") " pod="kube-system/kube-proxy-cbd9t"
Jan 13 20:43:12.060069 kubelet[2877]: I0113 20:43:12.060066 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/223e9b5b-8eed-4f85-a368-15cdf253df06-kube-proxy\") pod \"kube-proxy-cbd9t\" (UID: \"223e9b5b-8eed-4f85-a368-15cdf253df06\") " pod="kube-system/kube-proxy-cbd9t"
Jan 13 20:43:12.060244 kubelet[2877]: I0113 20:43:12.060079 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhp2l\" (UniqueName: \"kubernetes.io/projected/223e9b5b-8eed-4f85-a368-15cdf253df06-kube-api-access-vhp2l\") pod \"kube-proxy-cbd9t\" (UID: \"223e9b5b-8eed-4f85-a368-15cdf253df06\") " pod="kube-system/kube-proxy-cbd9t"
Jan 13 20:43:12.123584 kubelet[2877]: I0113 20:43:12.123559 2877 topology_manager.go:215] "Topology Admit Handler" podUID="f7ffb744-5914-485f-acf0-acc03c648095" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-rr77c"
Jan 13 20:43:12.128598 systemd[1]: Created slice kubepods-besteffort-podf7ffb744_5914_485f_acf0_acc03c648095.slice - libcontainer container kubepods-besteffort-podf7ffb744_5914_485f_acf0_acc03c648095.slice.
Jan 13 20:43:12.129652 kubelet[2877]: W0113 20:43:12.129634 2877 reflector.go:547] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object
Jan 13 20:43:12.129727 kubelet[2877]: E0113 20:43:12.129659 2877 reflector.go:150] object-"tigera-operator"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object
Jan 13 20:43:12.129727 kubelet[2877]: W0113 20:43:12.129636 2877 reflector.go:547] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object
Jan 13 20:43:12.129727 kubelet[2877]: E0113 20:43:12.129672 2877 reflector.go:150] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object
Jan 13 20:43:12.160859 kubelet[2877]: I0113 20:43:12.160825 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f7ffb744-5914-485f-acf0-acc03c648095-var-lib-calico\") pod \"tigera-operator-7bc55997bb-rr77c\" (UID: \"f7ffb744-5914-485f-acf0-acc03c648095\") " pod="tigera-operator/tigera-operator-7bc55997bb-rr77c"
Jan 13 20:43:12.161253 kubelet[2877]: I0113 20:43:12.160858 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98nn6\" (UniqueName: \"kubernetes.io/projected/f7ffb744-5914-485f-acf0-acc03c648095-kube-api-access-98nn6\") pod \"tigera-operator-7bc55997bb-rr77c\" (UID: \"f7ffb744-5914-485f-acf0-acc03c648095\") " pod="tigera-operator/tigera-operator-7bc55997bb-rr77c"
Jan 13 20:43:12.277954 containerd[1551]: time="2025-01-13T20:43:12.277518929Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cbd9t,Uid:223e9b5b-8eed-4f85-a368-15cdf253df06,Namespace:kube-system,Attempt:0,}"
Jan 13 20:43:12.297126 containerd[1551]: time="2025-01-13T20:43:12.297000109Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:43:12.297126 containerd[1551]: time="2025-01-13T20:43:12.297056309Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:43:12.297126 containerd[1551]: time="2025-01-13T20:43:12.297072208Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:43:12.297576 containerd[1551]: time="2025-01-13T20:43:12.297508420Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:43:12.316847 systemd[1]: Started cri-containerd-532ac9fd268a58310039438a641f392df4cda8d4eb076aa8351a65de5efa4524.scope - libcontainer container 532ac9fd268a58310039438a641f392df4cda8d4eb076aa8351a65de5efa4524.
Jan 13 20:43:12.331492 containerd[1551]: time="2025-01-13T20:43:12.331461935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cbd9t,Uid:223e9b5b-8eed-4f85-a368-15cdf253df06,Namespace:kube-system,Attempt:0,} returns sandbox id \"532ac9fd268a58310039438a641f392df4cda8d4eb076aa8351a65de5efa4524\""
Jan 13 20:43:12.335163 containerd[1551]: time="2025-01-13T20:43:12.335134148Z" level=info msg="CreateContainer within sandbox \"532ac9fd268a58310039438a641f392df4cda8d4eb076aa8351a65de5efa4524\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jan 13 20:43:12.350004 containerd[1551]: time="2025-01-13T20:43:12.349979466Z" level=info msg="CreateContainer within sandbox \"532ac9fd268a58310039438a641f392df4cda8d4eb076aa8351a65de5efa4524\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"2d7ec5bfe5648830253254c1c1879aeb95d1b9d6b46f8e627fe42f25e932f5bd\""
Jan 13 20:43:12.351212 containerd[1551]: time="2025-01-13T20:43:12.350435562Z" level=info msg="StartContainer for \"2d7ec5bfe5648830253254c1c1879aeb95d1b9d6b46f8e627fe42f25e932f5bd\""
Jan 13 20:43:12.367848 systemd[1]: Started cri-containerd-2d7ec5bfe5648830253254c1c1879aeb95d1b9d6b46f8e627fe42f25e932f5bd.scope - libcontainer container 2d7ec5bfe5648830253254c1c1879aeb95d1b9d6b46f8e627fe42f25e932f5bd.
Jan 13 20:43:12.384801 containerd[1551]: time="2025-01-13T20:43:12.384773644Z" level=info msg="StartContainer for \"2d7ec5bfe5648830253254c1c1879aeb95d1b9d6b46f8e627fe42f25e932f5bd\" returns successfully" Jan 13 20:43:12.620354 kubelet[2877]: I0113 20:43:12.620196 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-cbd9t" podStartSLOduration=1.6201821939999999 podStartE2EDuration="1.620182194s" podCreationTimestamp="2025-01-13 20:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:43:12.620104675 +0000 UTC m=+16.107146612" watchObservedRunningTime="2025-01-13 20:43:12.620182194 +0000 UTC m=+16.107224128" Jan 13 20:43:13.167422 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3264926471.mount: Deactivated successfully. Jan 13 20:43:13.332633 containerd[1551]: time="2025-01-13T20:43:13.332606739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-rr77c,Uid:f7ffb744-5914-485f-acf0-acc03c648095,Namespace:tigera-operator,Attempt:0,}" Jan 13 20:43:13.355551 containerd[1551]: time="2025-01-13T20:43:13.355431840Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:43:13.355551 containerd[1551]: time="2025-01-13T20:43:13.355506523Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:43:13.355551 containerd[1551]: time="2025-01-13T20:43:13.355524947Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:13.355773 containerd[1551]: time="2025-01-13T20:43:13.355592598Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:13.373857 systemd[1]: Started cri-containerd-fd83e7675e5791f46fbf5c588e282c412bc0d22642e6df9947f661b42b1736a5.scope - libcontainer container fd83e7675e5791f46fbf5c588e282c412bc0d22642e6df9947f661b42b1736a5. Jan 13 20:43:13.406147 containerd[1551]: time="2025-01-13T20:43:13.406097557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-rr77c,Uid:f7ffb744-5914-485f-acf0-acc03c648095,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"fd83e7675e5791f46fbf5c588e282c412bc0d22642e6df9947f661b42b1736a5\"" Jan 13 20:43:13.411697 containerd[1551]: time="2025-01-13T20:43:13.411630091Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Jan 13 20:43:17.482874 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount498733275.mount: Deactivated successfully. Jan 13 20:43:17.880980 containerd[1551]: time="2025-01-13T20:43:17.880664717Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:17.881222 containerd[1551]: time="2025-01-13T20:43:17.881172778Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21764325" Jan 13 20:43:17.881604 containerd[1551]: time="2025-01-13T20:43:17.881494665Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:17.882667 containerd[1551]: time="2025-01-13T20:43:17.882640234Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:17.883374 containerd[1551]: time="2025-01-13T20:43:17.883103193Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id 
\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 4.47145043s" Jan 13 20:43:17.883374 containerd[1551]: time="2025-01-13T20:43:17.883120606Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Jan 13 20:43:17.888180 containerd[1551]: time="2025-01-13T20:43:17.888075654Z" level=info msg="CreateContainer within sandbox \"fd83e7675e5791f46fbf5c588e282c412bc0d22642e6df9947f661b42b1736a5\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 13 20:43:17.895175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3240480173.mount: Deactivated successfully. Jan 13 20:43:17.896253 containerd[1551]: time="2025-01-13T20:43:17.896201551Z" level=info msg="CreateContainer within sandbox \"fd83e7675e5791f46fbf5c588e282c412bc0d22642e6df9947f661b42b1736a5\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"37c81760070507c64553df2b459801da20d4fb5b3d026fe4b1766380ea947d4b\"" Jan 13 20:43:17.900244 containerd[1551]: time="2025-01-13T20:43:17.900220166Z" level=info msg="StartContainer for \"37c81760070507c64553df2b459801da20d4fb5b3d026fe4b1766380ea947d4b\"" Jan 13 20:43:17.920854 systemd[1]: Started cri-containerd-37c81760070507c64553df2b459801da20d4fb5b3d026fe4b1766380ea947d4b.scope - libcontainer container 37c81760070507c64553df2b459801da20d4fb5b3d026fe4b1766380ea947d4b. 
Jan 13 20:43:17.935549 containerd[1551]: time="2025-01-13T20:43:17.935527126Z" level=info msg="StartContainer for \"37c81760070507c64553df2b459801da20d4fb5b3d026fe4b1766380ea947d4b\" returns successfully" Jan 13 20:43:20.954194 kubelet[2877]: I0113 20:43:20.952393 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-rr77c" podStartSLOduration=4.470943482 podStartE2EDuration="8.951123784s" podCreationTimestamp="2025-01-13 20:43:12 +0000 UTC" firstStartedPulling="2025-01-13 20:43:13.406799858 +0000 UTC m=+16.893841791" lastFinishedPulling="2025-01-13 20:43:17.88698016 +0000 UTC m=+21.374022093" observedRunningTime="2025-01-13 20:43:18.694481861 +0000 UTC m=+22.181523809" watchObservedRunningTime="2025-01-13 20:43:20.951123784 +0000 UTC m=+24.438165726" Jan 13 20:43:20.954194 kubelet[2877]: I0113 20:43:20.953661 2877 topology_manager.go:215] "Topology Admit Handler" podUID="f2c01dc2-87d2-4540-aec9-1967b9f8eb4c" podNamespace="calico-system" podName="calico-typha-79b9fb9587-ccm8c" Jan 13 20:43:20.961546 systemd[1]: Created slice kubepods-besteffort-podf2c01dc2_87d2_4540_aec9_1967b9f8eb4c.slice - libcontainer container kubepods-besteffort-podf2c01dc2_87d2_4540_aec9_1967b9f8eb4c.slice. Jan 13 20:43:21.000403 kubelet[2877]: I0113 20:43:21.000372 2877 topology_manager.go:215] "Topology Admit Handler" podUID="4fcc65a9-735a-4ed2-b401-780aafec168a" podNamespace="calico-system" podName="calico-node-6zt9t" Jan 13 20:43:21.005861 systemd[1]: Created slice kubepods-besteffort-pod4fcc65a9_735a_4ed2_b401_780aafec168a.slice - libcontainer container kubepods-besteffort-pod4fcc65a9_735a_4ed2_b401_780aafec168a.slice. 
Jan 13 20:43:21.017395 kubelet[2877]: I0113 20:43:21.017327 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c01dc2-87d2-4540-aec9-1967b9f8eb4c-tigera-ca-bundle\") pod \"calico-typha-79b9fb9587-ccm8c\" (UID: \"f2c01dc2-87d2-4540-aec9-1967b9f8eb4c\") " pod="calico-system/calico-typha-79b9fb9587-ccm8c" Jan 13 20:43:21.017555 kubelet[2877]: I0113 20:43:21.017539 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-lib-modules\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.017613 kubelet[2877]: I0113 20:43:21.017606 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f2c01dc2-87d2-4540-aec9-1967b9f8eb4c-typha-certs\") pod \"calico-typha-79b9fb9587-ccm8c\" (UID: \"f2c01dc2-87d2-4540-aec9-1967b9f8eb4c\") " pod="calico-system/calico-typha-79b9fb9587-ccm8c" Jan 13 20:43:21.017677 kubelet[2877]: I0113 20:43:21.017660 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8h4\" (UniqueName: \"kubernetes.io/projected/f2c01dc2-87d2-4540-aec9-1967b9f8eb4c-kube-api-access-5v8h4\") pod \"calico-typha-79b9fb9587-ccm8c\" (UID: \"f2c01dc2-87d2-4540-aec9-1967b9f8eb4c\") " pod="calico-system/calico-typha-79b9fb9587-ccm8c" Jan 13 20:43:21.017718 kubelet[2877]: I0113 20:43:21.017712 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-xtables-lock\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " 
pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.114456 kubelet[2877]: I0113 20:43:21.114430 2877 topology_manager.go:215] "Topology Admit Handler" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" podNamespace="calico-system" podName="csi-node-driver-8prdm" Jan 13 20:43:21.114629 kubelet[2877]: E0113 20:43:21.114616 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" Jan 13 20:43:21.119003 kubelet[2877]: I0113 20:43:21.118778 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fcc65a9-735a-4ed2-b401-780aafec168a-tigera-ca-bundle\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119003 kubelet[2877]: I0113 20:43:21.118800 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-var-run-calico\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119003 kubelet[2877]: I0113 20:43:21.118818 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-cni-bin-dir\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119003 kubelet[2877]: I0113 20:43:21.118827 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: 
\"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-cni-net-dir\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119003 kubelet[2877]: I0113 20:43:21.118837 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m26mx\" (UniqueName: \"kubernetes.io/projected/4fcc65a9-735a-4ed2-b401-780aafec168a-kube-api-access-m26mx\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119166 kubelet[2877]: I0113 20:43:21.118852 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-cni-log-dir\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119166 kubelet[2877]: I0113 20:43:21.118861 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4fcc65a9-735a-4ed2-b401-780aafec168a-node-certs\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119166 kubelet[2877]: I0113 20:43:21.118870 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-flexvol-driver-host\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119166 kubelet[2877]: I0113 20:43:21.118889 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-var-lib-calico\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.119166 kubelet[2877]: I0113 20:43:21.118898 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4fcc65a9-735a-4ed2-b401-780aafec168a-policysync\") pod \"calico-node-6zt9t\" (UID: \"4fcc65a9-735a-4ed2-b401-780aafec168a\") " pod="calico-system/calico-node-6zt9t" Jan 13 20:43:21.219824 kubelet[2877]: I0113 20:43:21.219532 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c210180-a974-4da0-9ca1-18a6f94f39f4-socket-dir\") pod \"csi-node-driver-8prdm\" (UID: \"3c210180-a974-4da0-9ca1-18a6f94f39f4\") " pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:21.219824 kubelet[2877]: I0113 20:43:21.219565 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c210180-a974-4da0-9ca1-18a6f94f39f4-registration-dir\") pod \"csi-node-driver-8prdm\" (UID: \"3c210180-a974-4da0-9ca1-18a6f94f39f4\") " pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:21.219824 kubelet[2877]: I0113 20:43:21.219597 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhxf\" (UniqueName: \"kubernetes.io/projected/3c210180-a974-4da0-9ca1-18a6f94f39f4-kube-api-access-4lhxf\") pod \"csi-node-driver-8prdm\" (UID: \"3c210180-a974-4da0-9ca1-18a6f94f39f4\") " pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:21.219824 kubelet[2877]: I0113 20:43:21.219663 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: 
\"kubernetes.io/host-path/3c210180-a974-4da0-9ca1-18a6f94f39f4-varrun\") pod \"csi-node-driver-8prdm\" (UID: \"3c210180-a974-4da0-9ca1-18a6f94f39f4\") " pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:21.219824 kubelet[2877]: I0113 20:43:21.219694 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c210180-a974-4da0-9ca1-18a6f94f39f4-kubelet-dir\") pod \"csi-node-driver-8prdm\" (UID: \"3c210180-a974-4da0-9ca1-18a6f94f39f4\") " pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:21.225090 kubelet[2877]: E0113 20:43:21.225070 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.225090 kubelet[2877]: W0113 20:43:21.225084 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.225203 kubelet[2877]: E0113 20:43:21.225096 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:43:21.230193 kubelet[2877]: E0113 20:43:21.230139 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.230193 kubelet[2877]: W0113 20:43:21.230151 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.230193 kubelet[2877]: E0113 20:43:21.230164 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:43:21.265808 containerd[1551]: time="2025-01-13T20:43:21.265679326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-79b9fb9587-ccm8c,Uid:f2c01dc2-87d2-4540-aec9-1967b9f8eb4c,Namespace:calico-system,Attempt:0,}" Jan 13 20:43:21.292389 containerd[1551]: time="2025-01-13T20:43:21.291685396Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:43:21.292496 containerd[1551]: time="2025-01-13T20:43:21.292410454Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:43:21.292704 containerd[1551]: time="2025-01-13T20:43:21.292666003Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:21.293387 containerd[1551]: time="2025-01-13T20:43:21.293360564Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:21.305854 systemd[1]: Started cri-containerd-f5860bbf5652b4c03370454c99ad0c9117ebf3f76046819c9c0c6d49770d6ccd.scope - libcontainer container f5860bbf5652b4c03370454c99ad0c9117ebf3f76046819c9c0c6d49770d6ccd. 
Jan 13 20:43:21.308149 containerd[1551]: time="2025-01-13T20:43:21.308123309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6zt9t,Uid:4fcc65a9-735a-4ed2-b401-780aafec168a,Namespace:calico-system,Attempt:0,}" Jan 13 20:43:21.320203 kubelet[2877]: E0113 20:43:21.320113 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.320203 kubelet[2877]: W0113 20:43:21.320125 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.320203 kubelet[2877]: E0113 20:43:21.320143 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:43:21.320356 kubelet[2877]: E0113 20:43:21.320262 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.320468 kubelet[2877]: W0113 20:43:21.320386 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.320468 kubelet[2877]: E0113 20:43:21.320396 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:43:21.320591 kubelet[2877]: E0113 20:43:21.320496 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.320591 kubelet[2877]: W0113 20:43:21.320500 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.320591 kubelet[2877]: E0113 20:43:21.320506 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:43:21.320854 kubelet[2877]: E0113 20:43:21.320758 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.320854 kubelet[2877]: W0113 20:43:21.320766 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.320854 kubelet[2877]: E0113 20:43:21.320778 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:43:21.321063 kubelet[2877]: E0113 20:43:21.321024 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.321063 kubelet[2877]: W0113 20:43:21.321031 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.321063 kubelet[2877]: E0113 20:43:21.321043 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:43:21.321965 kubelet[2877]: E0113 20:43:21.321321 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.321965 kubelet[2877]: W0113 20:43:21.321326 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.321965 kubelet[2877]: E0113 20:43:21.321392 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:43:21.321965 kubelet[2877]: E0113 20:43:21.321880 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.321965 kubelet[2877]: W0113 20:43:21.321888 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.322244 kubelet[2877]: E0113 20:43:21.322176 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.322244 kubelet[2877]: W0113 20:43:21.322181 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.322244 kubelet[2877]: E0113 20:43:21.322188 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:43:21.322358 kubelet[2877]: E0113 20:43:21.322316 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:43:21.322453 kubelet[2877]: E0113 20:43:21.322401 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.322453 kubelet[2877]: W0113 20:43:21.322406 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.322453 kubelet[2877]: E0113 20:43:21.322411 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:43:21.322769 kubelet[2877]: E0113 20:43:21.322656 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.322769 kubelet[2877]: W0113 20:43:21.322663 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.322769 kubelet[2877]: E0113 20:43:21.322669 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:43:21.323009 kubelet[2877]: E0113 20:43:21.322932 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.323009 kubelet[2877]: W0113 20:43:21.322937 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.323117 kubelet[2877]: E0113 20:43:21.323083 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:43:21.323219 kubelet[2877]: E0113 20:43:21.323192 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:43:21.323219 kubelet[2877]: W0113 20:43:21.323198 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:43:21.323380 kubelet[2877]: E0113 20:43:21.323327 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 13 20:43:21.323456 kubelet[2877]: E0113 20:43:21.323418 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:43:21.323456 kubelet[2877]: W0113 20:43:21.323425 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:43:21.323456 kubelet[2877]: E0113 20:43:21.323432 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the same driver-call.go:262 / driver-call.go:149 / plugins.go:730 triplet repeats through Jan 13 20:43:21.334633]
Jan 13 20:43:21.336046 containerd[1551]: time="2025-01-13T20:43:21.335975188Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:43:21.336156 containerd[1551]: time="2025-01-13T20:43:21.336135420Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:43:21.336217 containerd[1551]: time="2025-01-13T20:43:21.336205321Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:43:21.336349 containerd[1551]: time="2025-01-13T20:43:21.336309152Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:43:21.349936 systemd[1]: Started cri-containerd-923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8.scope - libcontainer container 923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8.
Jan 13 20:43:21.354336 containerd[1551]: time="2025-01-13T20:43:21.354301824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-79b9fb9587-ccm8c,Uid:f2c01dc2-87d2-4540-aec9-1967b9f8eb4c,Namespace:calico-system,Attempt:0,} returns sandbox id \"f5860bbf5652b4c03370454c99ad0c9117ebf3f76046819c9c0c6d49770d6ccd\""
Jan 13 20:43:21.358134 containerd[1551]: time="2025-01-13T20:43:21.357844959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\""
Jan 13 20:43:21.368273 containerd[1551]: time="2025-01-13T20:43:21.368248041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6zt9t,Uid:4fcc65a9-735a-4ed2-b401-780aafec168a,Namespace:calico-system,Attempt:0,} returns sandbox id \"923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8\""
Jan 13 20:43:22.841812 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount857781768.mount: Deactivated successfully.
Jan 13 20:43:23.606448 kubelet[2877]: E0113 20:43:23.606383 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4"
Jan 13 20:43:23.625154 containerd[1551]: time="2025-01-13T20:43:23.624723472Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:43:23.625452 containerd[1551]: time="2025-01-13T20:43:23.625410743Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363"
Jan 13 20:43:23.625762 containerd[1551]: time="2025-01-13T20:43:23.625747810Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:43:23.626756 containerd[1551]: time="2025-01-13T20:43:23.626744316Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:43:23.627127 containerd[1551]: time="2025-01-13T20:43:23.627111412Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.268849761s"
Jan 13 20:43:23.627160 containerd[1551]: time="2025-01-13T20:43:23.627128229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\""
Jan 13 20:43:23.630919 containerd[1551]: time="2025-01-13T20:43:23.630908242Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Jan 13 20:43:23.641758 containerd[1551]: time="2025-01-13T20:43:23.641486486Z" level=info msg="CreateContainer within sandbox \"f5860bbf5652b4c03370454c99ad0c9117ebf3f76046819c9c0c6d49770d6ccd\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jan 13 20:43:23.647785 containerd[1551]: time="2025-01-13T20:43:23.647759977Z" level=info msg="CreateContainer within sandbox \"f5860bbf5652b4c03370454c99ad0c9117ebf3f76046819c9c0c6d49770d6ccd\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e9ef3f538f2e25b91492d26cf3cf15f33556a7509c58fc020fddbd9e861eac88\""
Jan 13 20:43:23.648336 containerd[1551]: time="2025-01-13T20:43:23.648236276Z" level=info msg="StartContainer for \"e9ef3f538f2e25b91492d26cf3cf15f33556a7509c58fc020fddbd9e861eac88\""
Jan 13 20:43:23.673886 systemd[1]: Started cri-containerd-e9ef3f538f2e25b91492d26cf3cf15f33556a7509c58fc020fddbd9e861eac88.scope - libcontainer container e9ef3f538f2e25b91492d26cf3cf15f33556a7509c58fc020fddbd9e861eac88.
Jan 13 20:43:23.708933 containerd[1551]: time="2025-01-13T20:43:23.708908835Z" level=info msg="StartContainer for \"e9ef3f538f2e25b91492d26cf3cf15f33556a7509c58fc020fddbd9e861eac88\" returns successfully"
Jan 13 20:43:24.697932 kubelet[2877]: I0113 20:43:24.697898 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-79b9fb9587-ccm8c" podStartSLOduration=2.42230621 podStartE2EDuration="4.697879766s" podCreationTimestamp="2025-01-13 20:43:20 +0000 UTC" firstStartedPulling="2025-01-13 20:43:21.355228835 +0000 UTC m=+24.842270766" lastFinishedPulling="2025-01-13 20:43:23.630802389 +0000 UTC m=+27.117844322" observedRunningTime="2025-01-13 20:43:24.697715793 +0000 UTC m=+28.184757729" watchObservedRunningTime="2025-01-13 20:43:24.697879766 +0000 UTC m=+28.184921701"
Jan 13 20:43:24.744363 kubelet[2877]: E0113 20:43:24.744258 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:43:24.744363 kubelet[2877]: W0113 20:43:24.744280 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:43:24.744363 kubelet[2877]: E0113 20:43:24.744306 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the same driver-call.go:262 / driver-call.go:149 / plugins.go:730 triplet repeats through Jan 13 20:43:24.750]
Jan 13 20:43:24.750531 kubelet[2877]: E0113 20:43:24.750489 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:43:25.118378 containerd[1551]: time="2025-01-13T20:43:25.118302668Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:25.130174 containerd[1551]: time="2025-01-13T20:43:25.130132055Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Jan 13 20:43:25.135412 containerd[1551]: time="2025-01-13T20:43:25.135398529Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:25.150445 containerd[1551]: time="2025-01-13T20:43:25.150381401Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:25.156899 containerd[1551]: time="2025-01-13T20:43:25.156801207Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.525808051s" Jan 13 20:43:25.156899 containerd[1551]: time="2025-01-13T20:43:25.156827826Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Jan 13 20:43:25.158324 containerd[1551]: time="2025-01-13T20:43:25.158218587Z" level=info msg="CreateContainer within sandbox \"923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 13 20:43:25.171551 containerd[1551]: time="2025-01-13T20:43:25.171525552Z" level=info msg="CreateContainer within sandbox \"923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05\"" Jan 13 20:43:25.172818 containerd[1551]: time="2025-01-13T20:43:25.172075447Z" level=info msg="StartContainer for \"3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05\"" Jan 13 20:43:25.212203 systemd[1]: Started cri-containerd-3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05.scope - libcontainer container 3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05. Jan 13 20:43:25.237811 containerd[1551]: time="2025-01-13T20:43:25.237786487Z" level=info msg="StartContainer for \"3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05\" returns successfully" Jan 13 20:43:25.251962 systemd[1]: cri-containerd-3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05.scope: Deactivated successfully. Jan 13 20:43:25.272410 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05-rootfs.mount: Deactivated successfully. 
Jan 13 20:43:25.499480 containerd[1551]: time="2025-01-13T20:43:25.488874655Z" level=info msg="shim disconnected" id=3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05 namespace=k8s.io
Jan 13 20:43:25.499480 containerd[1551]: time="2025-01-13T20:43:25.499477917Z" level=warning msg="cleaning up after shim disconnected" id=3ce333f2da35e5b5c946260831caf47d680818218be01c832e615cc51c5dad05 namespace=k8s.io
Jan 13 20:43:25.499694 containerd[1551]: time="2025-01-13T20:43:25.499487576Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:43:25.568215 kubelet[2877]: E0113 20:43:25.568014 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4"
Jan 13 20:43:25.694807 kubelet[2877]: I0113 20:43:25.694787 2877 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 20:43:25.696074 containerd[1551]: time="2025-01-13T20:43:25.696049802Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\""
Jan 13 20:43:27.567993 kubelet[2877]: E0113 20:43:27.567960 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4"
Jan 13 20:43:28.961789 containerd[1551]: time="2025-01-13T20:43:28.961755941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:43:28.962251 containerd[1551]: time="2025-01-13T20:43:28.962225167Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154"
Jan 13 20:43:28.962523 containerd[1551]: time="2025-01-13T20:43:28.962398377Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:43:28.965394 containerd[1551]: time="2025-01-13T20:43:28.965361722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:43:28.966371 containerd[1551]: time="2025-01-13T20:43:28.966021506Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 3.269948815s"
Jan 13 20:43:28.966371 containerd[1551]: time="2025-01-13T20:43:28.966043417Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\""
Jan 13 20:43:28.968370 containerd[1551]: time="2025-01-13T20:43:28.968353676Z" level=info msg="CreateContainer within sandbox \"923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Jan 13 20:43:29.000301 containerd[1551]: time="2025-01-13T20:43:29.000271517Z" level=info msg="CreateContainer within sandbox \"923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046\""
Jan 13 20:43:29.001268 containerd[1551]: time="2025-01-13T20:43:29.000822124Z" level=info msg="StartContainer for \"a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046\""
Jan 13 20:43:29.059841 systemd[1]: Started cri-containerd-a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046.scope - libcontainer container a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046.
Jan 13 20:43:29.081392 containerd[1551]: time="2025-01-13T20:43:29.081320203Z" level=info msg="StartContainer for \"a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046\" returns successfully"
Jan 13 20:43:29.567651 kubelet[2877]: E0113 20:43:29.567603 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4"
Jan 13 20:43:30.421007 systemd[1]: cri-containerd-a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046.scope: Deactivated successfully.
Jan 13 20:43:30.441685 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046-rootfs.mount: Deactivated successfully.
Jan 13 20:43:30.489013 kubelet[2877]: I0113 20:43:30.488991 2877 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Jan 13 20:43:30.570587 kubelet[2877]: I0113 20:43:30.569655 2877 topology_manager.go:215] "Topology Admit Handler" podUID="f4b04d0c-4a38-4068-8730-7ed8bc9346ef" podNamespace="kube-system" podName="coredns-7db6d8ff4d-f9n77"
Jan 13 20:43:30.570861 kubelet[2877]: I0113 20:43:30.570666 2877 topology_manager.go:215] "Topology Admit Handler" podUID="4bc90cb7-3014-45c2-9619-7765993cb1d0" podNamespace="kube-system" podName="coredns-7db6d8ff4d-zww27"
Jan 13 20:43:30.577757 systemd[1]: Created slice kubepods-burstable-podf4b04d0c_4a38_4068_8730_7ed8bc9346ef.slice - libcontainer container kubepods-burstable-podf4b04d0c_4a38_4068_8730_7ed8bc9346ef.slice.
Jan 13 20:43:30.586689 kubelet[2877]: I0113 20:43:30.583338 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4b04d0c-4a38-4068-8730-7ed8bc9346ef-config-volume\") pod \"coredns-7db6d8ff4d-f9n77\" (UID: \"f4b04d0c-4a38-4068-8730-7ed8bc9346ef\") " pod="kube-system/coredns-7db6d8ff4d-f9n77"
Jan 13 20:43:30.586689 kubelet[2877]: I0113 20:43:30.583356 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk65z\" (UniqueName: \"kubernetes.io/projected/4bc90cb7-3014-45c2-9619-7765993cb1d0-kube-api-access-sk65z\") pod \"coredns-7db6d8ff4d-zww27\" (UID: \"4bc90cb7-3014-45c2-9619-7765993cb1d0\") " pod="kube-system/coredns-7db6d8ff4d-zww27"
Jan 13 20:43:30.586689 kubelet[2877]: I0113 20:43:30.583371 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6qgx\" (UniqueName: \"kubernetes.io/projected/f4b04d0c-4a38-4068-8730-7ed8bc9346ef-kube-api-access-j6qgx\") pod \"coredns-7db6d8ff4d-f9n77\" (UID: \"f4b04d0c-4a38-4068-8730-7ed8bc9346ef\") " pod="kube-system/coredns-7db6d8ff4d-f9n77"
Jan 13 20:43:30.586689 kubelet[2877]: I0113 20:43:30.583381 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bc90cb7-3014-45c2-9619-7765993cb1d0-config-volume\") pod \"coredns-7db6d8ff4d-zww27\" (UID: \"4bc90cb7-3014-45c2-9619-7765993cb1d0\") " pod="kube-system/coredns-7db6d8ff4d-zww27"
Jan 13 20:43:30.603291 kubelet[2877]: I0113 20:43:30.598085 2877 topology_manager.go:215] "Topology Admit Handler" podUID="2fafcb48-274d-4a9e-adca-511adb0459f5" podNamespace="calico-apiserver" podName="calico-apiserver-595db757fc-h4ptk"
Jan 13 20:43:30.603291 kubelet[2877]: I0113 20:43:30.599571 2877 topology_manager.go:215] "Topology Admit Handler" podUID="4d21bbae-e62e-4365-ba57-17611b86a846" podNamespace="calico-system" podName="calico-kube-controllers-69bd7ff58c-rg9pf"
Jan 13 20:43:30.603291 kubelet[2877]: I0113 20:43:30.599663 2877 topology_manager.go:215] "Topology Admit Handler" podUID="aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8" podNamespace="calico-apiserver" podName="calico-apiserver-595db757fc-b5vf4"
Jan 13 20:43:30.613808 kubelet[2877]: W0113 20:43:30.613757 2877 reflector.go:547] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object
Jan 13 20:43:30.613808 kubelet[2877]: E0113 20:43:30.613783 2877 reflector.go:150] object-"calico-apiserver"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object
Jan 13 20:43:30.615700 systemd[1]: Created slice kubepods-burstable-pod4bc90cb7_3014_45c2_9619_7765993cb1d0.slice - libcontainer container kubepods-burstable-pod4bc90cb7_3014_45c2_9619_7765993cb1d0.slice.
Jan 13 20:43:30.632378 kubelet[2877]: W0113 20:43:30.632307 2877 reflector.go:547] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object
Jan 13 20:43:30.632378 kubelet[2877]: E0113 20:43:30.632327 2877 reflector.go:150] object-"calico-apiserver"/"calico-apiserver-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object
Jan 13 20:43:30.636581 systemd[1]: Created slice kubepods-besteffort-pod2fafcb48_274d_4a9e_adca_511adb0459f5.slice - libcontainer container kubepods-besteffort-pod2fafcb48_274d_4a9e_adca_511adb0459f5.slice.
Jan 13 20:43:30.640305 systemd[1]: Created slice kubepods-besteffort-podaac83cfd_4e11_46a1_b32b_b7b1ab43a7f8.slice - libcontainer container kubepods-besteffort-podaac83cfd_4e11_46a1_b32b_b7b1ab43a7f8.slice.
Jan 13 20:43:30.658793 systemd[1]: Created slice kubepods-besteffort-pod4d21bbae_e62e_4365_ba57_17611b86a846.slice - libcontainer container kubepods-besteffort-pod4d21bbae_e62e_4365_ba57_17611b86a846.slice.
Jan 13 20:43:30.684586 kubelet[2877]: I0113 20:43:30.684505 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2fafcb48-274d-4a9e-adca-511adb0459f5-calico-apiserver-certs\") pod \"calico-apiserver-595db757fc-h4ptk\" (UID: \"2fafcb48-274d-4a9e-adca-511adb0459f5\") " pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk"
Jan 13 20:43:30.684586 kubelet[2877]: I0113 20:43:30.684547 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d21bbae-e62e-4365-ba57-17611b86a846-tigera-ca-bundle\") pod \"calico-kube-controllers-69bd7ff58c-rg9pf\" (UID: \"4d21bbae-e62e-4365-ba57-17611b86a846\") " pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf"
Jan 13 20:43:30.684586 kubelet[2877]: I0113 20:43:30.684561 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8-calico-apiserver-certs\") pod \"calico-apiserver-595db757fc-b5vf4\" (UID: \"aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8\") " pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4"
Jan 13 20:43:30.684586 kubelet[2877]: I0113 20:43:30.684571 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mkms\" (UniqueName: \"kubernetes.io/projected/aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8-kube-api-access-9mkms\") pod \"calico-apiserver-595db757fc-b5vf4\" (UID: \"aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8\") " pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4"
Jan 13 20:43:30.684586 kubelet[2877]: I0113 20:43:30.684587 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4rnn\" (UniqueName: \"kubernetes.io/projected/2fafcb48-274d-4a9e-adca-511adb0459f5-kube-api-access-m4rnn\") pod \"calico-apiserver-595db757fc-h4ptk\" (UID: \"2fafcb48-274d-4a9e-adca-511adb0459f5\") " pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk"
Jan 13 20:43:30.688164 kubelet[2877]: I0113 20:43:30.684609 2877 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t7hk\" (UniqueName: \"kubernetes.io/projected/4d21bbae-e62e-4365-ba57-17611b86a846-kube-api-access-4t7hk\") pod \"calico-kube-controllers-69bd7ff58c-rg9pf\" (UID: \"4d21bbae-e62e-4365-ba57-17611b86a846\") " pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf"
Jan 13 20:43:30.764409 containerd[1551]: time="2025-01-13T20:43:30.764356146Z" level=info msg="shim disconnected" id=a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046 namespace=k8s.io
Jan 13 20:43:30.764409 containerd[1551]: time="2025-01-13T20:43:30.764397457Z" level=warning msg="cleaning up after shim disconnected" id=a8afb507a2c09f002039e8dad529641bd63bf4c036876374481e53351cd6e046 namespace=k8s.io
Jan 13 20:43:30.764409 containerd[1551]: time="2025-01-13T20:43:30.764404863Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:43:30.880624 containerd[1551]: time="2025-01-13T20:43:30.880596656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:0,}"
Jan 13 20:43:30.935642 containerd[1551]: time="2025-01-13T20:43:30.935514998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:0,}"
Jan 13 20:43:30.961349 containerd[1551]: time="2025-01-13T20:43:30.961130205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:0,}"
Jan 13 20:43:31.130494 containerd[1551]: time="2025-01-13T20:43:31.130461860Z" level=error msg="Failed to destroy network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.131229 containerd[1551]: time="2025-01-13T20:43:31.130838725Z" level=error msg="Failed to destroy network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.139564 containerd[1551]: time="2025-01-13T20:43:31.139539768Z" level=error msg="encountered an error cleaning up failed sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.139649 containerd[1551]: time="2025-01-13T20:43:31.139637044Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.146518 containerd[1551]: time="2025-01-13T20:43:31.146501297Z" level=error msg="encountered an error cleaning up failed sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.146597 containerd[1551]: time="2025-01-13T20:43:31.146585506Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.146757 containerd[1551]: time="2025-01-13T20:43:31.146746223Z" level=error msg="Failed to destroy network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.146959 containerd[1551]: time="2025-01-13T20:43:31.146946317Z" level=error msg="encountered an error cleaning up failed sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.147321 containerd[1551]: time="2025-01-13T20:43:31.147307897Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.154180 kubelet[2877]: E0113 20:43:31.147189 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.155049 kubelet[2877]: E0113 20:43:31.154592 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf"
Jan 13 20:43:31.155049 kubelet[2877]: E0113 20:43:31.154612 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf"
Jan 13 20:43:31.155049 kubelet[2877]: E0113 20:43:31.154639 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" podUID="4d21bbae-e62e-4365-ba57-17611b86a846"
Jan 13 20:43:31.155165 kubelet[2877]: E0113 20:43:31.154439 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.155370 kubelet[2877]: E0113 20:43:31.155206 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27"
Jan 13 20:43:31.155370 kubelet[2877]: E0113 20:43:31.155219 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27"
Jan 13 20:43:31.155370 kubelet[2877]: E0113 20:43:31.155235 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zww27" podUID="4bc90cb7-3014-45c2-9619-7765993cb1d0"
Jan 13 20:43:31.155453 kubelet[2877]: E0113 20:43:31.147456 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:43:31.155453 kubelet[2877]: E0113 20:43:31.155255 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77"
Jan 13 20:43:31.155453 kubelet[2877]: E0113 20:43:31.155262 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77"
Jan 13 20:43:31.155827 kubelet[2877]: E0113 20:43:31.155277 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-f9n77" podUID="f4b04d0c-4a38-4068-8730-7ed8bc9346ef"
Jan 13 20:43:31.440470 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc-shm.mount: Deactivated successfully.
Jan 13 20:43:31.571569 systemd[1]: Created slice kubepods-besteffort-pod3c210180_a974_4da0_9ca1_18a6f94f39f4.slice - libcontainer container kubepods-besteffort-pod3c210180_a974_4da0_9ca1_18a6f94f39f4.slice.
Jan 13 20:43:31.573433 containerd[1551]: time="2025-01-13T20:43:31.573402770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:0,}" Jan 13 20:43:31.615322 containerd[1551]: time="2025-01-13T20:43:31.615292581Z" level=error msg="Failed to destroy network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.615517 containerd[1551]: time="2025-01-13T20:43:31.615498078Z" level=error msg="encountered an error cleaning up failed sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.615544 containerd[1551]: time="2025-01-13T20:43:31.615533831Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.616542 kubelet[2877]: E0113 20:43:31.615651 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.616542 kubelet[2877]: E0113 20:43:31.615690 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:31.616542 kubelet[2877]: E0113 20:43:31.615704 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:31.616744 kubelet[2877]: E0113 20:43:31.615749 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" Jan 13 20:43:31.616770 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b-shm.mount: Deactivated successfully. Jan 13 20:43:31.719146 kubelet[2877]: I0113 20:43:31.718273 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25" Jan 13 20:43:31.721145 kubelet[2877]: I0113 20:43:31.720564 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5" Jan 13 20:43:31.735600 kubelet[2877]: I0113 20:43:31.735501 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc" Jan 13 20:43:31.744019 containerd[1551]: time="2025-01-13T20:43:31.743775234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 13 20:43:31.744648 containerd[1551]: time="2025-01-13T20:43:31.744571106Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:31.746681 containerd[1551]: time="2025-01-13T20:43:31.746569799Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:31.750010 containerd[1551]: time="2025-01-13T20:43:31.749971959Z" level=info msg="Ensure that sandbox 3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25 in task-service has been cleanup successfully" Jan 13 20:43:31.750396 containerd[1551]: time="2025-01-13T20:43:31.750295939Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:31.750473 containerd[1551]: time="2025-01-13T20:43:31.750433496Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:31.750667 containerd[1551]: 
time="2025-01-13T20:43:31.750599976Z" level=info msg="Ensure that sandbox 7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc in task-service has been cleanup successfully" Jan 13 20:43:31.751200 containerd[1551]: time="2025-01-13T20:43:31.751099632Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:31.751663 containerd[1551]: time="2025-01-13T20:43:31.751538992Z" level=info msg="Ensure that sandbox 5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5 in task-service has been cleanup successfully" Jan 13 20:43:31.751978 kubelet[2877]: I0113 20:43:31.751819 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b" Jan 13 20:43:31.753408 containerd[1551]: time="2025-01-13T20:43:31.753379596Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" successfully" Jan 13 20:43:31.753408 containerd[1551]: time="2025-01-13T20:43:31.753392523Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:31.753455 systemd[1]: run-netns-cni\x2dc9c28ef5\x2d97d2\x2d95e2\x2dc6db\x2d330c8f989b63.mount: Deactivated successfully. Jan 13 20:43:31.753514 systemd[1]: run-netns-cni\x2dcf47852e\x2da972\x2d211a\x2d64d0\x2d6ffbf1b8ab8f.mount: Deactivated successfully. 
Jan 13 20:43:31.753862 containerd[1551]: time="2025-01-13T20:43:31.753772624Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:31.753862 containerd[1551]: time="2025-01-13T20:43:31.753784779Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns successfully" Jan 13 20:43:31.754483 containerd[1551]: time="2025-01-13T20:43:31.754182311Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:31.754483 containerd[1551]: time="2025-01-13T20:43:31.754366639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:1,}" Jan 13 20:43:31.754971 containerd[1551]: time="2025-01-13T20:43:31.754899107Z" level=info msg="Ensure that sandbox e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b in task-service has been cleanup successfully" Jan 13 20:43:31.755041 containerd[1551]: time="2025-01-13T20:43:31.755032157Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:31.755209 containerd[1551]: time="2025-01-13T20:43:31.755071720Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:31.755209 containerd[1551]: time="2025-01-13T20:43:31.755139231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:1,}" Jan 13 20:43:31.755209 containerd[1551]: time="2025-01-13T20:43:31.755148928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:1,}" Jan 13 20:43:31.756307 
containerd[1551]: time="2025-01-13T20:43:31.756290776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:1,}" Jan 13 20:43:31.821192 kubelet[2877]: E0113 20:43:31.821173 2877 projected.go:294] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 13 20:43:31.821946 kubelet[2877]: E0113 20:43:31.821343 2877 projected.go:200] Error preparing data for projected volume kube-api-access-m4rnn for pod calico-apiserver/calico-apiserver-595db757fc-h4ptk: failed to sync configmap cache: timed out waiting for the condition Jan 13 20:43:31.821946 kubelet[2877]: E0113 20:43:31.821296 2877 secret.go:194] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Jan 13 20:43:31.821946 kubelet[2877]: E0113 20:43:31.821305 2877 secret.go:194] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Jan 13 20:43:31.821946 kubelet[2877]: E0113 20:43:31.821320 2877 projected.go:294] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 13 20:43:31.821946 kubelet[2877]: E0113 20:43:31.821487 2877 projected.go:200] Error preparing data for projected volume kube-api-access-9mkms for pod calico-apiserver/calico-apiserver-595db757fc-b5vf4: failed to sync configmap cache: timed out waiting for the condition Jan 13 20:43:31.821946 kubelet[2877]: E0113 20:43:31.821520 2877 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2fafcb48-274d-4a9e-adca-511adb0459f5-kube-api-access-m4rnn podName:2fafcb48-274d-4a9e-adca-511adb0459f5 nodeName:}" failed. No retries permitted until 2025-01-13 20:43:32.321501961 +0000 UTC m=+35.808543894 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m4rnn" (UniqueName: "kubernetes.io/projected/2fafcb48-274d-4a9e-adca-511adb0459f5-kube-api-access-m4rnn") pod "calico-apiserver-595db757fc-h4ptk" (UID: "2fafcb48-274d-4a9e-adca-511adb0459f5") : failed to sync configmap cache: timed out waiting for the condition Jan 13 20:43:31.822089 kubelet[2877]: E0113 20:43:31.821529 2877 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8-kube-api-access-9mkms podName:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8 nodeName:}" failed. No retries permitted until 2025-01-13 20:43:32.321524995 +0000 UTC m=+35.808566928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9mkms" (UniqueName: "kubernetes.io/projected/aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8-kube-api-access-9mkms") pod "calico-apiserver-595db757fc-b5vf4" (UID: "aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8") : failed to sync configmap cache: timed out waiting for the condition Jan 13 20:43:31.822089 kubelet[2877]: E0113 20:43:31.821594 2877 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fafcb48-274d-4a9e-adca-511adb0459f5-calico-apiserver-certs podName:2fafcb48-274d-4a9e-adca-511adb0459f5 nodeName:}" failed. No retries permitted until 2025-01-13 20:43:32.321589459 +0000 UTC m=+35.808631392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/2fafcb48-274d-4a9e-adca-511adb0459f5-calico-apiserver-certs") pod "calico-apiserver-595db757fc-h4ptk" (UID: "2fafcb48-274d-4a9e-adca-511adb0459f5") : failed to sync secret cache: timed out waiting for the condition Jan 13 20:43:31.822089 kubelet[2877]: E0113 20:43:31.821604 2877 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8-calico-apiserver-certs podName:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8 nodeName:}" failed. 
No retries permitted until 2025-01-13 20:43:32.321600693 +0000 UTC m=+35.808642623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8-calico-apiserver-certs") pod "calico-apiserver-595db757fc-b5vf4" (UID: "aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8") : failed to sync secret cache: timed out waiting for the condition Jan 13 20:43:31.842764 containerd[1551]: time="2025-01-13T20:43:31.842726482Z" level=error msg="Failed to destroy network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.843042 containerd[1551]: time="2025-01-13T20:43:31.842977759Z" level=error msg="encountered an error cleaning up failed sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.843042 containerd[1551]: time="2025-01-13T20:43:31.843014558Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.843411 kubelet[2877]: E0113 20:43:31.843298 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.843449 kubelet[2877]: E0113 20:43:31.843423 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:31.843449 kubelet[2877]: E0113 20:43:31.843439 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:31.843647 kubelet[2877]: E0113 20:43:31.843499 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-7db6d8ff4d-zww27" podUID="4bc90cb7-3014-45c2-9619-7765993cb1d0" Jan 13 20:43:31.847955 containerd[1551]: time="2025-01-13T20:43:31.847804239Z" level=error msg="Failed to destroy network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.848419 containerd[1551]: time="2025-01-13T20:43:31.848402413Z" level=error msg="encountered an error cleaning up failed sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.848460 containerd[1551]: time="2025-01-13T20:43:31.848437622Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.848820 kubelet[2877]: E0113 20:43:31.848784 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.849222 containerd[1551]: 
time="2025-01-13T20:43:31.849151853Z" level=error msg="Failed to destroy network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.849388 containerd[1551]: time="2025-01-13T20:43:31.849374573Z" level=error msg="encountered an error cleaning up failed sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.849968 containerd[1551]: time="2025-01-13T20:43:31.849440876Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.850004 kubelet[2877]: E0113 20:43:31.849516 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.850004 kubelet[2877]: E0113 20:43:31.849535 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:31.850004 kubelet[2877]: E0113 20:43:31.849549 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:31.850065 kubelet[2877]: E0113 20:43:31.849574 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-f9n77" podUID="f4b04d0c-4a38-4068-8730-7ed8bc9346ef" Jan 13 20:43:31.850462 kubelet[2877]: E0113 20:43:31.850267 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:31.850462 kubelet[2877]: E0113 20:43:31.850283 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:31.850462 kubelet[2877]: E0113 20:43:31.850302 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" podUID="4d21bbae-e62e-4365-ba57-17611b86a846" Jan 13 20:43:31.850842 containerd[1551]: time="2025-01-13T20:43:31.850787645Z" level=error msg="Failed to destroy network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.851287 containerd[1551]: time="2025-01-13T20:43:31.851266203Z" level=error msg="encountered an error cleaning up failed sandbox 
\"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.851321 containerd[1551]: time="2025-01-13T20:43:31.851308456Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.851771 kubelet[2877]: E0113 20:43:31.851400 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:31.851771 kubelet[2877]: E0113 20:43:31.851431 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:31.851771 kubelet[2877]: E0113 20:43:31.851441 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:31.851841 kubelet[2877]: E0113 20:43:31.851459 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" Jan 13 20:43:32.439158 containerd[1551]: time="2025-01-13T20:43:32.439126464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:0,}" Jan 13 20:43:32.447056 systemd[1]: run-netns-cni\x2d3a582f85\x2da855\x2dbe36\x2d0e9c\x2d204eb0c302ca.mount: Deactivated successfully. Jan 13 20:43:32.447109 systemd[1]: run-netns-cni\x2d56c1b11d\x2df606\x2d4368\x2d5ded\x2d1ee19e10b33f.mount: Deactivated successfully. 
Jan 13 20:43:32.456595 containerd[1551]: time="2025-01-13T20:43:32.456530447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:0,}" Jan 13 20:43:32.491244 containerd[1551]: time="2025-01-13T20:43:32.491149239Z" level=error msg="Failed to destroy network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.492671 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f-shm.mount: Deactivated successfully. Jan 13 20:43:32.493705 containerd[1551]: time="2025-01-13T20:43:32.493664046Z" level=error msg="encountered an error cleaning up failed sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.493857 containerd[1551]: time="2025-01-13T20:43:32.493725761Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.494008 kubelet[2877]: E0113 20:43:32.493931 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.494008 kubelet[2877]: E0113 20:43:32.493976 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:32.494008 kubelet[2877]: E0113 20:43:32.493999 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:32.494638 kubelet[2877]: E0113 20:43:32.494042 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" podUID="2fafcb48-274d-4a9e-adca-511adb0459f5" Jan 13 20:43:32.512046 containerd[1551]: time="2025-01-13T20:43:32.511967578Z" level=error msg="Failed to destroy network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.512971 containerd[1551]: time="2025-01-13T20:43:32.512261748Z" level=error msg="encountered an error cleaning up failed sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.512971 containerd[1551]: time="2025-01-13T20:43:32.512305408Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.513707 kubelet[2877]: E0113 20:43:32.513100 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 
20:43:32.513707 kubelet[2877]: E0113 20:43:32.513144 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:32.513707 kubelet[2877]: E0113 20:43:32.513158 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:32.513851 kubelet[2877]: E0113 20:43:32.513189 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" podUID="aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8" Jan 13 20:43:32.753775 kubelet[2877]: I0113 20:43:32.753684 2877 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94" Jan 13 20:43:32.754286 containerd[1551]: time="2025-01-13T20:43:32.754062482Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:32.754286 containerd[1551]: time="2025-01-13T20:43:32.754205135Z" level=info msg="Ensure that sandbox b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94 in task-service has been cleanup successfully" Jan 13 20:43:32.755610 containerd[1551]: time="2025-01-13T20:43:32.754385686Z" level=info msg="TearDown network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" successfully" Jan 13 20:43:32.755610 containerd[1551]: time="2025-01-13T20:43:32.754398107Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" returns successfully" Jan 13 20:43:32.755610 containerd[1551]: time="2025-01-13T20:43:32.754703883Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:32.755610 containerd[1551]: time="2025-01-13T20:43:32.754752870Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:32.755610 containerd[1551]: time="2025-01-13T20:43:32.754758968Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:32.755610 containerd[1551]: time="2025-01-13T20:43:32.755017334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:2,}" Jan 13 20:43:32.756372 kubelet[2877]: I0113 20:43:32.755558 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d" Jan 13 20:43:32.756410 containerd[1551]: 
time="2025-01-13T20:43:32.755859942Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:32.756410 containerd[1551]: time="2025-01-13T20:43:32.755963496Z" level=info msg="Ensure that sandbox d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d in task-service has been cleanup successfully" Jan 13 20:43:32.756410 containerd[1551]: time="2025-01-13T20:43:32.756075669Z" level=info msg="TearDown network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" successfully" Jan 13 20:43:32.756410 containerd[1551]: time="2025-01-13T20:43:32.756083929Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" returns successfully" Jan 13 20:43:32.757041 containerd[1551]: time="2025-01-13T20:43:32.756906871Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:32.757041 containerd[1551]: time="2025-01-13T20:43:32.756963627Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" successfully" Jan 13 20:43:32.757041 containerd[1551]: time="2025-01-13T20:43:32.756970736Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:32.757880 containerd[1551]: time="2025-01-13T20:43:32.757392349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:2,}" Jan 13 20:43:32.758380 kubelet[2877]: I0113 20:43:32.758167 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f" Jan 13 20:43:32.758556 containerd[1551]: time="2025-01-13T20:43:32.758477698Z" level=info msg="StopPodSandbox for 
\"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:32.758675 containerd[1551]: time="2025-01-13T20:43:32.758657674Z" level=info msg="Ensure that sandbox 003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f in task-service has been cleanup successfully" Jan 13 20:43:32.759749 containerd[1551]: time="2025-01-13T20:43:32.759692253Z" level=info msg="TearDown network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" successfully" Jan 13 20:43:32.759749 containerd[1551]: time="2025-01-13T20:43:32.759704571Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" returns successfully" Jan 13 20:43:32.760094 kubelet[2877]: I0113 20:43:32.760081 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4" Jan 13 20:43:32.760404 containerd[1551]: time="2025-01-13T20:43:32.760372496Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:32.760599 containerd[1551]: time="2025-01-13T20:43:32.760481455Z" level=info msg="Ensure that sandbox 6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4 in task-service has been cleanup successfully" Jan 13 20:43:32.760717 containerd[1551]: time="2025-01-13T20:43:32.760700026Z" level=info msg="TearDown network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" successfully" Jan 13 20:43:32.760717 containerd[1551]: time="2025-01-13T20:43:32.760710956Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" returns successfully" Jan 13 20:43:32.761078 containerd[1551]: time="2025-01-13T20:43:32.760870334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:1,}" 
Jan 13 20:43:32.761560 containerd[1551]: time="2025-01-13T20:43:32.761360644Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:32.761560 containerd[1551]: time="2025-01-13T20:43:32.761403439Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:32.761560 containerd[1551]: time="2025-01-13T20:43:32.761410103Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:32.762576 kubelet[2877]: I0113 20:43:32.762518 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec" Jan 13 20:43:32.763238 containerd[1551]: time="2025-01-13T20:43:32.763107667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:2,}" Jan 13 20:43:32.763300 containerd[1551]: time="2025-01-13T20:43:32.763286616Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:32.763407 containerd[1551]: time="2025-01-13T20:43:32.763392442Z" level=info msg="Ensure that sandbox 8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec in task-service has been cleanup successfully" Jan 13 20:43:32.764044 containerd[1551]: time="2025-01-13T20:43:32.764026125Z" level=info msg="TearDown network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" successfully" Jan 13 20:43:32.764044 containerd[1551]: time="2025-01-13T20:43:32.764037731Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" returns successfully" Jan 13 20:43:32.765549 containerd[1551]: time="2025-01-13T20:43:32.765471187Z" level=info 
msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:32.765549 containerd[1551]: time="2025-01-13T20:43:32.765516904Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:32.765549 containerd[1551]: time="2025-01-13T20:43:32.765523775Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns successfully" Jan 13 20:43:32.766060 kubelet[2877]: I0113 20:43:32.765831 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd" Jan 13 20:43:32.766510 containerd[1551]: time="2025-01-13T20:43:32.766291594Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:32.766510 containerd[1551]: time="2025-01-13T20:43:32.766408346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:2,}" Jan 13 20:43:32.766654 containerd[1551]: time="2025-01-13T20:43:32.766557296Z" level=info msg="Ensure that sandbox e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd in task-service has been cleanup successfully" Jan 13 20:43:32.766862 containerd[1551]: time="2025-01-13T20:43:32.766685618Z" level=info msg="TearDown network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" successfully" Jan 13 20:43:32.766862 containerd[1551]: time="2025-01-13T20:43:32.766695741Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" returns successfully" Jan 13 20:43:32.767380 containerd[1551]: time="2025-01-13T20:43:32.767364519Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:1,}" Jan 13 20:43:32.882535 containerd[1551]: time="2025-01-13T20:43:32.882506164Z" level=error msg="Failed to destroy network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.883363 containerd[1551]: time="2025-01-13T20:43:32.883348563Z" level=error msg="encountered an error cleaning up failed sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.883505 containerd[1551]: time="2025-01-13T20:43:32.883490924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.885069 kubelet[2877]: E0113 20:43:32.884467 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.885069 
kubelet[2877]: E0113 20:43:32.884509 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:32.885069 kubelet[2877]: E0113 20:43:32.884525 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:32.885147 kubelet[2877]: E0113 20:43:32.884550 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-f9n77" podUID="f4b04d0c-4a38-4068-8730-7ed8bc9346ef" Jan 13 20:43:32.888875 containerd[1551]: time="2025-01-13T20:43:32.888801018Z" level=error msg="Failed to destroy network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.889074 containerd[1551]: time="2025-01-13T20:43:32.889060811Z" level=error msg="encountered an error cleaning up failed sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.889147 containerd[1551]: time="2025-01-13T20:43:32.889135782Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.889385 kubelet[2877]: E0113 20:43:32.889288 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.889385 kubelet[2877]: E0113 20:43:32.889333 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:32.889385 kubelet[2877]: E0113 20:43:32.889351 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:32.889571 kubelet[2877]: E0113 20:43:32.889481 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" Jan 13 20:43:32.893438 containerd[1551]: time="2025-01-13T20:43:32.893362679Z" level=error msg="Failed to destroy network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.893742 containerd[1551]: time="2025-01-13T20:43:32.893645966Z" level=error msg="encountered an error cleaning up failed sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.893742 containerd[1551]: time="2025-01-13T20:43:32.893697220Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.894433 kubelet[2877]: E0113 20:43:32.893921 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.894433 kubelet[2877]: E0113 20:43:32.893957 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:32.894433 kubelet[2877]: E0113 20:43:32.893970 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:32.894512 kubelet[2877]: E0113 20:43:32.894000 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" podUID="2fafcb48-274d-4a9e-adca-511adb0459f5" Jan 13 20:43:32.899379 containerd[1551]: time="2025-01-13T20:43:32.899349176Z" level=error msg="Failed to destroy network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.899744 containerd[1551]: time="2025-01-13T20:43:32.899670106Z" level=error msg="encountered an error cleaning up failed sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.899744 containerd[1551]: time="2025-01-13T20:43:32.899707810Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.900631 kubelet[2877]: E0113 20:43:32.900003 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.900631 kubelet[2877]: E0113 20:43:32.900041 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:32.900631 kubelet[2877]: E0113 20:43:32.900053 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:32.900707 kubelet[2877]: E0113 
20:43:32.900079 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" podUID="4d21bbae-e62e-4365-ba57-17611b86a846" Jan 13 20:43:32.905352 containerd[1551]: time="2025-01-13T20:43:32.905327109Z" level=error msg="Failed to destroy network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.906034 containerd[1551]: time="2025-01-13T20:43:32.905960078Z" level=error msg="encountered an error cleaning up failed sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.906034 containerd[1551]: time="2025-01-13T20:43:32.905996271Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox 
\"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.906151 kubelet[2877]: E0113 20:43:32.906117 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.906189 kubelet[2877]: E0113 20:43:32.906154 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:32.906189 kubelet[2877]: E0113 20:43:32.906169 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:32.906249 kubelet[2877]: E0113 20:43:32.906205 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zww27" podUID="4bc90cb7-3014-45c2-9619-7765993cb1d0" Jan 13 20:43:32.907755 containerd[1551]: time="2025-01-13T20:43:32.907692135Z" level=error msg="Failed to destroy network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.908061 containerd[1551]: time="2025-01-13T20:43:32.908041957Z" level=error msg="encountered an error cleaning up failed sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.908093 containerd[1551]: time="2025-01-13T20:43:32.908076237Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.908261 kubelet[2877]: E0113 20:43:32.908183 2877 remote_runtime.go:193] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:32.908261 kubelet[2877]: E0113 20:43:32.908207 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:32.908261 kubelet[2877]: E0113 20:43:32.908219 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:32.908334 kubelet[2877]: E0113 20:43:32.908238 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" podUID="aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8" Jan 13 20:43:33.440876 systemd[1]: run-netns-cni\x2d62e83e9c\x2dfbe4\x2d6a60\x2d08f4\x2d7160143facd9.mount: Deactivated successfully. Jan 13 20:43:33.440932 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd-shm.mount: Deactivated successfully. Jan 13 20:43:33.440994 systemd[1]: run-netns-cni\x2d31206290\x2dea40\x2d12ac\x2d372b\x2d1dff918f58f5.mount: Deactivated successfully. Jan 13 20:43:33.441057 systemd[1]: run-netns-cni\x2d49cc87d7\x2de9b5\x2d9f3b\x2d34fd\x2d55f210e3badc.mount: Deactivated successfully. Jan 13 20:43:33.441104 systemd[1]: run-netns-cni\x2d2aae841b\x2d2a2c\x2d7ffe\x2d56c4\x2dab15c9833d90.mount: Deactivated successfully. Jan 13 20:43:33.441143 systemd[1]: run-netns-cni\x2dd372ef60\x2d9dfa\x2dc961\x2da23d\x2dcedd0700942f.mount: Deactivated successfully. Jan 13 20:43:33.441184 systemd[1]: run-netns-cni\x2d07bbb48b\x2dfb2f\x2de042\x2d1618\x2db9dba4cffca6.mount: Deactivated successfully. Jan 13 20:43:33.768091 kubelet[2877]: I0113 20:43:33.768030 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad" Jan 13 20:43:33.769626 containerd[1551]: time="2025-01-13T20:43:33.769599972Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" Jan 13 20:43:33.771450 containerd[1551]: time="2025-01-13T20:43:33.769831876Z" level=info msg="Ensure that sandbox a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad in task-service has been cleanup successfully" Jan 13 20:43:33.771476 systemd[1]: run-netns-cni\x2d50aabbc2\x2dfd13\x2de98c\x2d750f\x2dd21f413445e1.mount: Deactivated successfully. 
Jan 13 20:43:33.772529 containerd[1551]: time="2025-01-13T20:43:33.771883262Z" level=info msg="TearDown network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" successfully" Jan 13 20:43:33.772529 containerd[1551]: time="2025-01-13T20:43:33.771899088Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" returns successfully" Jan 13 20:43:33.772529 containerd[1551]: time="2025-01-13T20:43:33.772298315Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:33.772529 containerd[1551]: time="2025-01-13T20:43:33.772339755Z" level=info msg="TearDown network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" successfully" Jan 13 20:43:33.772529 containerd[1551]: time="2025-01-13T20:43:33.772345720Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" returns successfully" Jan 13 20:43:33.774719 kubelet[2877]: I0113 20:43:33.774604 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57" Jan 13 20:43:33.775656 containerd[1551]: time="2025-01-13T20:43:33.775061147Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" Jan 13 20:43:33.775656 containerd[1551]: time="2025-01-13T20:43:33.775201514Z" level=info msg="Ensure that sandbox c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57 in task-service has been cleanup successfully" Jan 13 20:43:33.775656 containerd[1551]: time="2025-01-13T20:43:33.775360633Z" level=info msg="TearDown network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" successfully" Jan 13 20:43:33.775656 containerd[1551]: time="2025-01-13T20:43:33.775369307Z" level=info msg="StopPodSandbox for 
\"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" returns successfully" Jan 13 20:43:33.776951 systemd[1]: run-netns-cni\x2d142b2345\x2dc4be\x2dd2c7\x2d7c1d\x2d1a29339711b1.mount: Deactivated successfully. Jan 13 20:43:33.777955 containerd[1551]: time="2025-01-13T20:43:33.777097253Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:33.777955 containerd[1551]: time="2025-01-13T20:43:33.777141279Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:33.777955 containerd[1551]: time="2025-01-13T20:43:33.777148095Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:33.777955 containerd[1551]: time="2025-01-13T20:43:33.777103545Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:33.777955 containerd[1551]: time="2025-01-13T20:43:33.777652095Z" level=info msg="TearDown network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" successfully" Jan 13 20:43:33.777955 containerd[1551]: time="2025-01-13T20:43:33.777659452Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" returns successfully" Jan 13 20:43:33.777955 containerd[1551]: time="2025-01-13T20:43:33.777766815Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" Jan 13 20:43:33.777955 containerd[1551]: time="2025-01-13T20:43:33.777860607Z" level=info msg="Ensure that sandbox 70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed in task-service has been cleanup successfully" Jan 13 20:43:33.778117 kubelet[2877]: I0113 20:43:33.777347 2877 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed" Jan 13 20:43:33.778364 containerd[1551]: time="2025-01-13T20:43:33.778353444Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:33.778436 containerd[1551]: time="2025-01-13T20:43:33.778427780Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:33.778472 containerd[1551]: time="2025-01-13T20:43:33.778465281Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns successfully" Jan 13 20:43:33.778583 containerd[1551]: time="2025-01-13T20:43:33.778572588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:3,}" Jan 13 20:43:33.779019 containerd[1551]: time="2025-01-13T20:43:33.778894444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:3,}" Jan 13 20:43:33.779251 kubelet[2877]: I0113 20:43:33.779105 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405" Jan 13 20:43:33.779562 containerd[1551]: time="2025-01-13T20:43:33.779322099Z" level=info msg="TearDown network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" successfully" Jan 13 20:43:33.779562 containerd[1551]: time="2025-01-13T20:43:33.779332818Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" returns successfully" Jan 13 20:43:33.779562 containerd[1551]: time="2025-01-13T20:43:33.779403647Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" Jan 13 
20:43:33.779562 containerd[1551]: time="2025-01-13T20:43:33.779498849Z" level=info msg="Ensure that sandbox dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405 in task-service has been cleanup successfully" Jan 13 20:43:33.779785 containerd[1551]: time="2025-01-13T20:43:33.779774943Z" level=info msg="TearDown network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" successfully" Jan 13 20:43:33.779826 containerd[1551]: time="2025-01-13T20:43:33.779819666Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" returns successfully" Jan 13 20:43:33.781534 containerd[1551]: time="2025-01-13T20:43:33.780806588Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:33.781534 containerd[1551]: time="2025-01-13T20:43:33.780852430Z" level=info msg="TearDown network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" successfully" Jan 13 20:43:33.781534 containerd[1551]: time="2025-01-13T20:43:33.780859362Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" returns successfully" Jan 13 20:43:33.781534 containerd[1551]: time="2025-01-13T20:43:33.780929991Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:33.781534 containerd[1551]: time="2025-01-13T20:43:33.780974003Z" level=info msg="TearDown network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" successfully" Jan 13 20:43:33.781534 containerd[1551]: time="2025-01-13T20:43:33.780982301Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" returns successfully" Jan 13 20:43:33.781413 systemd[1]: run-netns-cni\x2dfd5111f4\x2dbb1a\x2d7a44\x2d13d4\x2dbb52400c99c0.mount: Deactivated successfully. 
Jan 13 20:43:33.781472 systemd[1]: run-netns-cni\x2d7cf90824\x2d517b\x2d4410\x2da127\x2d116ebe1644fe.mount: Deactivated successfully. Jan 13 20:43:33.783202 containerd[1551]: time="2025-01-13T20:43:33.782942178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:43:33.783392 containerd[1551]: time="2025-01-13T20:43:33.783287676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:43:33.789979 kubelet[2877]: I0113 20:43:33.789765 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313" Jan 13 20:43:33.790100 containerd[1551]: time="2025-01-13T20:43:33.790079981Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" Jan 13 20:43:33.790747 containerd[1551]: time="2025-01-13T20:43:33.790252705Z" level=info msg="Ensure that sandbox 8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313 in task-service has been cleanup successfully" Jan 13 20:43:33.790747 containerd[1551]: time="2025-01-13T20:43:33.790377796Z" level=info msg="TearDown network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" successfully" Jan 13 20:43:33.790747 containerd[1551]: time="2025-01-13T20:43:33.790385973Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" returns successfully" Jan 13 20:43:33.790838 containerd[1551]: time="2025-01-13T20:43:33.790752973Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:33.790838 containerd[1551]: time="2025-01-13T20:43:33.790792092Z" level=info msg="TearDown network for sandbox 
\"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" successfully" Jan 13 20:43:33.790838 containerd[1551]: time="2025-01-13T20:43:33.790798460Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" returns successfully" Jan 13 20:43:33.793279 containerd[1551]: time="2025-01-13T20:43:33.793236299Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:33.794780 containerd[1551]: time="2025-01-13T20:43:33.794765691Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:33.795008 containerd[1551]: time="2025-01-13T20:43:33.794864061Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:33.796064 containerd[1551]: time="2025-01-13T20:43:33.795878000Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:3,}" Jan 13 20:43:33.797927 kubelet[2877]: I0113 20:43:33.797894 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0" Jan 13 20:43:33.799519 containerd[1551]: time="2025-01-13T20:43:33.799425330Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" Jan 13 20:43:33.799911 containerd[1551]: time="2025-01-13T20:43:33.799898559Z" level=info msg="Ensure that sandbox adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0 in task-service has been cleanup successfully" Jan 13 20:43:33.800186 containerd[1551]: time="2025-01-13T20:43:33.800129689Z" level=info msg="TearDown network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" successfully" Jan 13 20:43:33.800271 containerd[1551]: 
time="2025-01-13T20:43:33.800262065Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" returns successfully" Jan 13 20:43:33.800627 containerd[1551]: time="2025-01-13T20:43:33.800610844Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:33.800671 containerd[1551]: time="2025-01-13T20:43:33.800659330Z" level=info msg="TearDown network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" successfully" Jan 13 20:43:33.800671 containerd[1551]: time="2025-01-13T20:43:33.800668706Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" returns successfully" Jan 13 20:43:33.808081 containerd[1551]: time="2025-01-13T20:43:33.808057738Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:33.808318 containerd[1551]: time="2025-01-13T20:43:33.808308417Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" successfully" Jan 13 20:43:33.808377 containerd[1551]: time="2025-01-13T20:43:33.808367913Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:33.815959 containerd[1551]: time="2025-01-13T20:43:33.815899430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:3,}" Jan 13 20:43:33.902912 containerd[1551]: time="2025-01-13T20:43:33.902852363Z" level=error msg="Failed to destroy network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 
20:43:33.903533 containerd[1551]: time="2025-01-13T20:43:33.903386010Z" level=error msg="encountered an error cleaning up failed sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.903533 containerd[1551]: time="2025-01-13T20:43:33.903420326Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.903820 kubelet[2877]: E0113 20:43:33.903670 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.903820 kubelet[2877]: E0113 20:43:33.903719 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:33.903820 kubelet[2877]: E0113 20:43:33.903756 2877 
kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:33.903901 kubelet[2877]: E0113 20:43:33.903793 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zww27" podUID="4bc90cb7-3014-45c2-9619-7765993cb1d0" Jan 13 20:43:33.909568 containerd[1551]: time="2025-01-13T20:43:33.909531199Z" level=error msg="Failed to destroy network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.909836 containerd[1551]: time="2025-01-13T20:43:33.909809592Z" level=error msg="encountered an error cleaning up failed sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.909874 containerd[1551]: time="2025-01-13T20:43:33.909853692Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.910846 kubelet[2877]: E0113 20:43:33.909999 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.910846 kubelet[2877]: E0113 20:43:33.910035 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:33.910846 kubelet[2877]: E0113 20:43:33.910048 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:33.910937 kubelet[2877]: E0113 20:43:33.910077 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" podUID="2fafcb48-274d-4a9e-adca-511adb0459f5" Jan 13 20:43:33.939833 containerd[1551]: time="2025-01-13T20:43:33.939803742Z" level=error msg="Failed to destroy network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.940406 containerd[1551]: time="2025-01-13T20:43:33.940387972Z" level=error msg="encountered an error cleaning up failed sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.940445 containerd[1551]: time="2025-01-13T20:43:33.940427380Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.941336 kubelet[2877]: E0113 20:43:33.941316 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.941507 kubelet[2877]: E0113 20:43:33.941466 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:33.941507 kubelet[2877]: E0113 20:43:33.941484 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:33.941715 kubelet[2877]: E0113 20:43:33.941604 2877 
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" podUID="4d21bbae-e62e-4365-ba57-17611b86a846" Jan 13 20:43:33.959467 containerd[1551]: time="2025-01-13T20:43:33.959359350Z" level=error msg="Failed to destroy network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.959695 containerd[1551]: time="2025-01-13T20:43:33.959643740Z" level=error msg="encountered an error cleaning up failed sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.959739 containerd[1551]: time="2025-01-13T20:43:33.959691896Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox 
\"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.959786 containerd[1551]: time="2025-01-13T20:43:33.959760962Z" level=error msg="Failed to destroy network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.960746 kubelet[2877]: E0113 20:43:33.960214 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.960746 kubelet[2877]: E0113 20:43:33.960248 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:33.960746 kubelet[2877]: E0113 20:43:33.960260 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:33.960850 kubelet[2877]: E0113 20:43:33.960312 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" podUID="aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8" Jan 13 20:43:33.961049 containerd[1551]: time="2025-01-13T20:43:33.960871316Z" level=error msg="encountered an error cleaning up failed sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.961049 containerd[1551]: time="2025-01-13T20:43:33.960909727Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.961647 kubelet[2877]: E0113 20:43:33.961018 2877 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.961647 kubelet[2877]: E0113 20:43:33.961039 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:33.961647 kubelet[2877]: E0113 20:43:33.961559 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:33.961717 kubelet[2877]: E0113 20:43:33.961590 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" Jan 13 20:43:33.973509 containerd[1551]: time="2025-01-13T20:43:33.973482501Z" level=error msg="Failed to destroy network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.973876 containerd[1551]: time="2025-01-13T20:43:33.973782719Z" level=error msg="encountered an error cleaning up failed sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.973876 containerd[1551]: time="2025-01-13T20:43:33.973821441Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:33.973998 kubelet[2877]: E0113 20:43:33.973973 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
Jan 13 20:43:33.974042 kubelet[2877]: E0113 20:43:33.974014 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:33.974042 kubelet[2877]: E0113 20:43:33.974028 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:33.974109 kubelet[2877]: E0113 20:43:33.974059 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-f9n77" podUID="f4b04d0c-4a38-4068-8730-7ed8bc9346ef" Jan 13 20:43:34.442152 systemd[1]: run-netns-cni\x2dda05ced3\x2d3fbe\x2d7b5c\x2d018f\x2dfc16d0f117b4.mount: Deactivated successfully. 
Jan 13 20:43:34.442222 systemd[1]: run-netns-cni\x2d316113a2\x2dbdc4\x2d3098\x2db447\x2db1e87a8366e2.mount: Deactivated successfully. Jan 13 20:43:34.800966 kubelet[2877]: I0113 20:43:34.800615 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898" Jan 13 20:43:34.801461 containerd[1551]: time="2025-01-13T20:43:34.801071140Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\"" Jan 13 20:43:34.802394 containerd[1551]: time="2025-01-13T20:43:34.802137845Z" level=info msg="Ensure that sandbox ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898 in task-service has been cleanup successfully" Jan 13 20:43:34.802857 containerd[1551]: time="2025-01-13T20:43:34.802816206Z" level=info msg="TearDown network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" successfully" Jan 13 20:43:34.802857 containerd[1551]: time="2025-01-13T20:43:34.802828007Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" returns successfully" Jan 13 20:43:34.804087 systemd[1]: run-netns-cni\x2dbe288421\x2dba37\x2d8075\x2d8f96\x2dc1e71e961d66.mount: Deactivated successfully. 
Jan 13 20:43:34.804435 containerd[1551]: time="2025-01-13T20:43:34.804169594Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" Jan 13 20:43:34.804435 containerd[1551]: time="2025-01-13T20:43:34.804239052Z" level=info msg="TearDown network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" successfully" Jan 13 20:43:34.804435 containerd[1551]: time="2025-01-13T20:43:34.804248520Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" returns successfully" Jan 13 20:43:34.806224 containerd[1551]: time="2025-01-13T20:43:34.805935592Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:34.806224 containerd[1551]: time="2025-01-13T20:43:34.805987616Z" level=info msg="TearDown network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" successfully" Jan 13 20:43:34.806224 containerd[1551]: time="2025-01-13T20:43:34.805994582Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" returns successfully" Jan 13 20:43:34.806224 containerd[1551]: time="2025-01-13T20:43:34.806121427Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:34.806224 containerd[1551]: time="2025-01-13T20:43:34.806167599Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:34.806224 containerd[1551]: time="2025-01-13T20:43:34.806173828Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:34.807016 containerd[1551]: time="2025-01-13T20:43:34.806975990Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:4,}" Jan 13 20:43:34.807187 kubelet[2877]: I0113 20:43:34.807169 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4" Jan 13 20:43:34.807784 containerd[1551]: time="2025-01-13T20:43:34.807772330Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\"" Jan 13 20:43:34.808164 containerd[1551]: time="2025-01-13T20:43:34.808153824Z" level=info msg="Ensure that sandbox d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4 in task-service has been cleanup successfully" Jan 13 20:43:34.809002 containerd[1551]: time="2025-01-13T20:43:34.808983386Z" level=info msg="TearDown network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" successfully" Jan 13 20:43:34.809042 containerd[1551]: time="2025-01-13T20:43:34.809003122Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" returns successfully" Jan 13 20:43:34.811065 containerd[1551]: time="2025-01-13T20:43:34.810914729Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" Jan 13 20:43:34.811065 containerd[1551]: time="2025-01-13T20:43:34.810977525Z" level=info msg="TearDown network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" successfully" Jan 13 20:43:34.811065 containerd[1551]: time="2025-01-13T20:43:34.810985144Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" returns successfully" Jan 13 20:43:34.812109 systemd[1]: run-netns-cni\x2d3863683d\x2dc5fd\x2dcac4\x2dc2c4\x2d2454cbc6746d.mount: Deactivated successfully. 
Jan 13 20:43:34.813561 containerd[1551]: time="2025-01-13T20:43:34.812915010Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:34.813561 containerd[1551]: time="2025-01-13T20:43:34.812976187Z" level=info msg="TearDown network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" successfully" Jan 13 20:43:34.813561 containerd[1551]: time="2025-01-13T20:43:34.812983149Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" returns successfully" Jan 13 20:43:34.814896 kubelet[2877]: I0113 20:43:34.814471 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561" Jan 13 20:43:34.815377 containerd[1551]: time="2025-01-13T20:43:34.815102009Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:34.815377 containerd[1551]: time="2025-01-13T20:43:34.815157085Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" successfully" Jan 13 20:43:34.815377 containerd[1551]: time="2025-01-13T20:43:34.815163549Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:34.815377 containerd[1551]: time="2025-01-13T20:43:34.815278748Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\"" Jan 13 20:43:34.816359 containerd[1551]: time="2025-01-13T20:43:34.816343515Z" level=info msg="Ensure that sandbox 3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561 in task-service has been cleanup successfully" Jan 13 20:43:34.819396 containerd[1551]: time="2025-01-13T20:43:34.817536130Z" level=info msg="TearDown network for sandbox 
\"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" successfully" Jan 13 20:43:34.819396 containerd[1551]: time="2025-01-13T20:43:34.817561643Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" returns successfully" Jan 13 20:43:34.819172 systemd[1]: run-netns-cni\x2d99758a7c\x2ddd1b\x2de41a\x2d8a43\x2da5b1753b117c.mount: Deactivated successfully. Jan 13 20:43:34.820008 containerd[1551]: time="2025-01-13T20:43:34.819644033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:4,}" Jan 13 20:43:34.821455 containerd[1551]: time="2025-01-13T20:43:34.821441950Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" Jan 13 20:43:34.821579 containerd[1551]: time="2025-01-13T20:43:34.821569916Z" level=info msg="TearDown network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" successfully" Jan 13 20:43:34.821626 containerd[1551]: time="2025-01-13T20:43:34.821619284Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" returns successfully" Jan 13 20:43:34.823834 containerd[1551]: time="2025-01-13T20:43:34.823812891Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:34.823980 containerd[1551]: time="2025-01-13T20:43:34.823930373Z" level=info msg="TearDown network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" successfully" Jan 13 20:43:34.823980 containerd[1551]: time="2025-01-13T20:43:34.823939263Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" returns successfully" Jan 13 20:43:34.825483 containerd[1551]: time="2025-01-13T20:43:34.825323498Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:43:34.826223 kubelet[2877]: I0113 20:43:34.826206 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2" Jan 13 20:43:34.826506 containerd[1551]: time="2025-01-13T20:43:34.826444867Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\"" Jan 13 20:43:34.833316 kubelet[2877]: I0113 20:43:34.833301 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec" Jan 13 20:43:34.833544 containerd[1551]: time="2025-01-13T20:43:34.833529790Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\"" Jan 13 20:43:34.845273 containerd[1551]: time="2025-01-13T20:43:34.845159671Z" level=info msg="Ensure that sandbox 8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec in task-service has been cleanup successfully" Jan 13 20:43:34.845783 containerd[1551]: time="2025-01-13T20:43:34.845771227Z" level=info msg="Ensure that sandbox 6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2 in task-service has been cleanup successfully" Jan 13 20:43:34.847364 systemd[1]: run-netns-cni\x2d6d7709dc\x2de6de\x2d80c3\x2d7a79\x2d703231a3e46c.mount: Deactivated successfully. 
Jan 13 20:43:34.847566 containerd[1551]: time="2025-01-13T20:43:34.847514823Z" level=info msg="TearDown network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" successfully" Jan 13 20:43:34.847566 containerd[1551]: time="2025-01-13T20:43:34.847525617Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" returns successfully" Jan 13 20:43:34.847814 containerd[1551]: time="2025-01-13T20:43:34.847762786Z" level=info msg="TearDown network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" successfully" Jan 13 20:43:34.847814 containerd[1551]: time="2025-01-13T20:43:34.847785814Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" returns successfully" Jan 13 20:43:34.849952 containerd[1551]: time="2025-01-13T20:43:34.849654439Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" Jan 13 20:43:34.850258 containerd[1551]: time="2025-01-13T20:43:34.850129371Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" Jan 13 20:43:34.850289 containerd[1551]: time="2025-01-13T20:43:34.850280834Z" level=info msg="TearDown network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" successfully" Jan 13 20:43:34.850289 containerd[1551]: time="2025-01-13T20:43:34.850287399Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" returns successfully" Jan 13 20:43:34.850324 containerd[1551]: time="2025-01-13T20:43:34.850235323Z" level=info msg="TearDown network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" successfully" Jan 13 20:43:34.850324 containerd[1551]: time="2025-01-13T20:43:34.850313577Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" 
returns successfully" Jan 13 20:43:34.852169 containerd[1551]: time="2025-01-13T20:43:34.851826466Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:34.852169 containerd[1551]: time="2025-01-13T20:43:34.851868294Z" level=info msg="TearDown network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" successfully" Jan 13 20:43:34.852169 containerd[1551]: time="2025-01-13T20:43:34.851874391Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" returns successfully" Jan 13 20:43:34.852169 containerd[1551]: time="2025-01-13T20:43:34.851902652Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:34.856040 kubelet[2877]: I0113 20:43:34.855994 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3" Jan 13 20:43:34.898773 containerd[1551]: time="2025-01-13T20:43:34.851934188Z" level=info msg="TearDown network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" successfully" Jan 13 20:43:34.898925 containerd[1551]: time="2025-01-13T20:43:34.898862725Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" returns successfully" Jan 13 20:43:34.898925 containerd[1551]: time="2025-01-13T20:43:34.853163635Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:34.898973 containerd[1551]: time="2025-01-13T20:43:34.898965353Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:34.898973 containerd[1551]: time="2025-01-13T20:43:34.898973421Z" level=info msg="StopPodSandbox for 
\"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns successfully" Jan 13 20:43:34.899018 containerd[1551]: time="2025-01-13T20:43:34.856249819Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\"" Jan 13 20:43:34.899508 containerd[1551]: time="2025-01-13T20:43:34.899321228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:43:34.899508 containerd[1551]: time="2025-01-13T20:43:34.899461201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:4,}" Jan 13 20:43:34.914942 containerd[1551]: time="2025-01-13T20:43:34.914927085Z" level=info msg="Ensure that sandbox 552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3 in task-service has been cleanup successfully" Jan 13 20:43:34.915424 containerd[1551]: time="2025-01-13T20:43:34.915414134Z" level=info msg="TearDown network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" successfully" Jan 13 20:43:34.915467 containerd[1551]: time="2025-01-13T20:43:34.915460871Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" returns successfully" Jan 13 20:43:34.916118 containerd[1551]: time="2025-01-13T20:43:34.916099460Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" Jan 13 20:43:34.916159 containerd[1551]: time="2025-01-13T20:43:34.916147917Z" level=info msg="TearDown network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" successfully" Jan 13 20:43:34.916159 containerd[1551]: time="2025-01-13T20:43:34.916156155Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" returns 
successfully" Jan 13 20:43:34.916457 containerd[1551]: time="2025-01-13T20:43:34.916444370Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:34.916503 containerd[1551]: time="2025-01-13T20:43:34.916491691Z" level=info msg="TearDown network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" successfully" Jan 13 20:43:34.916526 containerd[1551]: time="2025-01-13T20:43:34.916501659Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" returns successfully" Jan 13 20:43:34.916636 containerd[1551]: time="2025-01-13T20:43:34.916626835Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:34.916748 containerd[1551]: time="2025-01-13T20:43:34.916697190Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:34.916789 containerd[1551]: time="2025-01-13T20:43:34.916781488Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:34.917417 containerd[1551]: time="2025-01-13T20:43:34.917405495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:4,}" Jan 13 20:43:34.973866 containerd[1551]: time="2025-01-13T20:43:34.973796365Z" level=error msg="Failed to destroy network for sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:34.974152 containerd[1551]: time="2025-01-13T20:43:34.974069782Z" level=error msg="encountered an error cleaning up failed sandbox 
\"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:34.974152 containerd[1551]: time="2025-01-13T20:43:34.974106038Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:34.974463 kubelet[2877]: E0113 20:43:34.974441 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:34.974512 kubelet[2877]: E0113 20:43:34.974479 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:34.974512 kubelet[2877]: E0113 20:43:34.974497 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:34.974581 kubelet[2877]: E0113 20:43:34.974525 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" Jan 13 20:43:35.442994 systemd[1]: run-netns-cni\x2d922bab4f\x2d0634\x2dd719\x2d111f\x2d6b5b21b3a8f1.mount: Deactivated successfully. Jan 13 20:43:35.443235 systemd[1]: run-netns-cni\x2d57c22c3c\x2d5814\x2df37d\x2deb63\x2dfbfd1087068f.mount: Deactivated successfully. 
Jan 13 20:43:35.566918 containerd[1551]: time="2025-01-13T20:43:35.566025917Z" level=error msg="Failed to destroy network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.567874 containerd[1551]: time="2025-01-13T20:43:35.567458829Z" level=error msg="encountered an error cleaning up failed sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.567874 containerd[1551]: time="2025-01-13T20:43:35.567516659Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.567934 kubelet[2877]: E0113 20:43:35.567797 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.567934 kubelet[2877]: E0113 20:43:35.567837 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:35.567934 kubelet[2877]: E0113 20:43:35.567851 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:35.569272 kubelet[2877]: E0113 20:43:35.567880 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-f9n77" podUID="f4b04d0c-4a38-4068-8730-7ed8bc9346ef" Jan 13 20:43:35.569098 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30-shm.mount: Deactivated successfully. 
Jan 13 20:43:35.653194 containerd[1551]: time="2025-01-13T20:43:35.653166654Z" level=error msg="Failed to destroy network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.658160 containerd[1551]: time="2025-01-13T20:43:35.653791923Z" level=error msg="encountered an error cleaning up failed sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.658160 containerd[1551]: time="2025-01-13T20:43:35.654489045Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.658260 kubelet[2877]: E0113 20:43:35.654721 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.658260 kubelet[2877]: E0113 20:43:35.654788 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:35.658260 kubelet[2877]: E0113 20:43:35.654801 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:35.667709 kubelet[2877]: E0113 20:43:35.654830 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" podUID="4d21bbae-e62e-4365-ba57-17611b86a846" Jan 13 20:43:35.684950 containerd[1551]: time="2025-01-13T20:43:35.684920228Z" level=error msg="Failed to destroy network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.691646 containerd[1551]: time="2025-01-13T20:43:35.685217164Z" level=error msg="encountered an error cleaning up failed sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.691646 containerd[1551]: time="2025-01-13T20:43:35.685251284Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.696411 kubelet[2877]: E0113 20:43:35.685402 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.696411 kubelet[2877]: E0113 20:43:35.685436 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:35.696411 kubelet[2877]: E0113 20:43:35.685452 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:35.696497 kubelet[2877]: E0113 20:43:35.685480 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" podUID="aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8" Jan 13 20:43:35.719217 containerd[1551]: time="2025-01-13T20:43:35.719077903Z" level=error msg="Failed to destroy network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.720701 containerd[1551]: time="2025-01-13T20:43:35.719957673Z" level=error msg="encountered an error cleaning up failed sandbox 
\"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.720701 containerd[1551]: time="2025-01-13T20:43:35.719992996Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.720800 kubelet[2877]: E0113 20:43:35.720134 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.720800 kubelet[2877]: E0113 20:43:35.720168 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:35.720800 kubelet[2877]: E0113 20:43:35.720184 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:35.720872 kubelet[2877]: E0113 20:43:35.720213 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" podUID="2fafcb48-274d-4a9e-adca-511adb0459f5" Jan 13 20:43:35.727963 containerd[1551]: time="2025-01-13T20:43:35.727906436Z" level=error msg="Failed to destroy network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.728366 containerd[1551]: time="2025-01-13T20:43:35.728314057Z" level=error msg="encountered an error cleaning up failed sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
Jan 13 20:43:35.728405 containerd[1551]: time="2025-01-13T20:43:35.728373869Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.728660 kubelet[2877]: E0113 20:43:35.728564 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.728660 kubelet[2877]: E0113 20:43:35.728597 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:35.728660 kubelet[2877]: E0113 20:43:35.728612 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 
20:43:35.728949 kubelet[2877]: E0113 20:43:35.728638 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zww27" podUID="4bc90cb7-3014-45c2-9619-7765993cb1d0" Jan 13 20:43:35.862048 containerd[1551]: time="2025-01-13T20:43:35.860884857Z" level=info msg="StopPodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\"" Jan 13 20:43:35.862338 containerd[1551]: time="2025-01-13T20:43:35.862324805Z" level=info msg="Ensure that sandbox e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824 in task-service has been cleanup successfully" Jan 13 20:43:35.862837 containerd[1551]: time="2025-01-13T20:43:35.862812753Z" level=info msg="TearDown network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" successfully" Jan 13 20:43:35.862837 containerd[1551]: time="2025-01-13T20:43:35.862829620Z" level=info msg="StopPodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" returns successfully" Jan 13 20:43:35.864257 containerd[1551]: time="2025-01-13T20:43:35.863297874Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\"" Jan 13 20:43:35.864257 containerd[1551]: time="2025-01-13T20:43:35.863435332Z" level=info msg="TearDown network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" 
successfully" Jan 13 20:43:35.864257 containerd[1551]: time="2025-01-13T20:43:35.863443228Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" returns successfully" Jan 13 20:43:35.864257 containerd[1551]: time="2025-01-13T20:43:35.863789025Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" Jan 13 20:43:35.864257 containerd[1551]: time="2025-01-13T20:43:35.864032065Z" level=info msg="TearDown network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" successfully" Jan 13 20:43:35.864257 containerd[1551]: time="2025-01-13T20:43:35.864039785Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" returns successfully" Jan 13 20:43:35.864910 containerd[1551]: time="2025-01-13T20:43:35.864399636Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:35.864910 containerd[1551]: time="2025-01-13T20:43:35.864526970Z" level=info msg="TearDown network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" successfully" Jan 13 20:43:35.864910 containerd[1551]: time="2025-01-13T20:43:35.864534396Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" returns successfully" Jan 13 20:43:35.865329 containerd[1551]: time="2025-01-13T20:43:35.865161454Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:35.865329 containerd[1551]: time="2025-01-13T20:43:35.865210233Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:35.865329 containerd[1551]: time="2025-01-13T20:43:35.865216654Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns 
successfully" Jan 13 20:43:35.865932 containerd[1551]: time="2025-01-13T20:43:35.865724672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:5,}" Jan 13 20:43:35.866091 kubelet[2877]: I0113 20:43:35.859883 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824" Jan 13 20:43:35.866858 kubelet[2877]: I0113 20:43:35.866392 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b" Jan 13 20:43:35.867915 containerd[1551]: time="2025-01-13T20:43:35.867628545Z" level=info msg="StopPodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\"" Jan 13 20:43:35.867915 containerd[1551]: time="2025-01-13T20:43:35.867753353Z" level=info msg="Ensure that sandbox 0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b in task-service has been cleanup successfully" Jan 13 20:43:35.868671 containerd[1551]: time="2025-01-13T20:43:35.868655843Z" level=info msg="TearDown network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" successfully" Jan 13 20:43:35.869549 containerd[1551]: time="2025-01-13T20:43:35.869240159Z" level=info msg="StopPodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" returns successfully" Jan 13 20:43:35.870486 containerd[1551]: time="2025-01-13T20:43:35.869780743Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\"" Jan 13 20:43:35.870486 containerd[1551]: time="2025-01-13T20:43:35.869830785Z" level=info msg="TearDown network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" successfully" Jan 13 20:43:35.870486 containerd[1551]: time="2025-01-13T20:43:35.869837281Z" level=info 
msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" returns successfully" Jan 13 20:43:35.870486 containerd[1551]: time="2025-01-13T20:43:35.870050282Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" Jan 13 20:43:35.870486 containerd[1551]: time="2025-01-13T20:43:35.870103010Z" level=info msg="TearDown network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" successfully" Jan 13 20:43:35.870486 containerd[1551]: time="2025-01-13T20:43:35.870110202Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" returns successfully" Jan 13 20:43:35.870486 containerd[1551]: time="2025-01-13T20:43:35.870376240Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:35.870824 containerd[1551]: time="2025-01-13T20:43:35.870814930Z" level=info msg="TearDown network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" successfully" Jan 13 20:43:35.870900 containerd[1551]: time="2025-01-13T20:43:35.870870561Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" returns successfully" Jan 13 20:43:35.871883 containerd[1551]: time="2025-01-13T20:43:35.871415739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:43:35.931559 kubelet[2877]: I0113 20:43:35.931418 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6" Jan 13 20:43:35.932165 containerd[1551]: time="2025-01-13T20:43:35.932008695Z" level=info msg="StopPodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\"" Jan 13 20:43:35.933827 containerd[1551]: 
time="2025-01-13T20:43:35.933634637Z" level=info msg="Ensure that sandbox c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6 in task-service has been cleanup successfully" Jan 13 20:43:35.936017 containerd[1551]: time="2025-01-13T20:43:35.935987599Z" level=info msg="TearDown network for sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" successfully" Jan 13 20:43:35.936017 containerd[1551]: time="2025-01-13T20:43:35.936009175Z" level=info msg="StopPodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" returns successfully" Jan 13 20:43:35.936934 containerd[1551]: time="2025-01-13T20:43:35.936913052Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\"" Jan 13 20:43:35.936995 containerd[1551]: time="2025-01-13T20:43:35.936969830Z" level=info msg="TearDown network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" successfully" Jan 13 20:43:35.936995 containerd[1551]: time="2025-01-13T20:43:35.936976410Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" returns successfully" Jan 13 20:43:35.937779 kubelet[2877]: I0113 20:43:35.937290 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30" Jan 13 20:43:35.942471 containerd[1551]: time="2025-01-13T20:43:35.942065710Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" Jan 13 20:43:35.942999 containerd[1551]: time="2025-01-13T20:43:35.942826899Z" level=info msg="TearDown network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" successfully" Jan 13 20:43:35.942999 containerd[1551]: time="2025-01-13T20:43:35.942842822Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" returns 
successfully" Jan 13 20:43:35.942999 containerd[1551]: time="2025-01-13T20:43:35.942905170Z" level=info msg="StopPodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\"" Jan 13 20:43:35.949524 containerd[1551]: time="2025-01-13T20:43:35.948722919Z" level=info msg="Ensure that sandbox 690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30 in task-service has been cleanup successfully" Jan 13 20:43:35.950307 containerd[1551]: time="2025-01-13T20:43:35.950294561Z" level=info msg="TearDown network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" successfully" Jan 13 20:43:35.950718 containerd[1551]: time="2025-01-13T20:43:35.950700486Z" level=info msg="StopPodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" returns successfully" Jan 13 20:43:35.950829 containerd[1551]: time="2025-01-13T20:43:35.950818773Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:35.951204 containerd[1551]: time="2025-01-13T20:43:35.951162873Z" level=info msg="TearDown network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" successfully" Jan 13 20:43:35.951672 containerd[1551]: time="2025-01-13T20:43:35.951177441Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" returns successfully" Jan 13 20:43:35.951672 containerd[1551]: time="2025-01-13T20:43:35.950939005Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\"" Jan 13 20:43:35.951672 containerd[1551]: time="2025-01-13T20:43:35.951641961Z" level=info msg="TearDown network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" successfully" Jan 13 20:43:35.951672 containerd[1551]: time="2025-01-13T20:43:35.951649119Z" level=info msg="StopPodSandbox for 
\"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" returns successfully" Jan 13 20:43:35.952118 containerd[1551]: time="2025-01-13T20:43:35.951908226Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:35.952118 containerd[1551]: time="2025-01-13T20:43:35.951951548Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:35.952118 containerd[1551]: time="2025-01-13T20:43:35.951957361Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:35.952118 containerd[1551]: time="2025-01-13T20:43:35.951981218Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" Jan 13 20:43:35.952118 containerd[1551]: time="2025-01-13T20:43:35.952009196Z" level=info msg="TearDown network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" successfully" Jan 13 20:43:35.952118 containerd[1551]: time="2025-01-13T20:43:35.952014032Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" returns successfully" Jan 13 20:43:35.952233 kubelet[2877]: I0113 20:43:35.951942 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537" Jan 13 20:43:35.954925 containerd[1551]: time="2025-01-13T20:43:35.954683431Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:35.955160 containerd[1551]: time="2025-01-13T20:43:35.955149201Z" level=info msg="TearDown network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" successfully" Jan 13 20:43:35.955293 containerd[1551]: time="2025-01-13T20:43:35.955276649Z" level=info msg="StopPodSandbox 
for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" returns successfully" Jan 13 20:43:35.955336 containerd[1551]: time="2025-01-13T20:43:35.955227824Z" level=info msg="StopPodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\"" Jan 13 20:43:35.955411 containerd[1551]: time="2025-01-13T20:43:35.955373694Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:5,}" Jan 13 20:43:35.955437 containerd[1551]: time="2025-01-13T20:43:35.955408664Z" level=info msg="Ensure that sandbox 668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537 in task-service has been cleanup successfully" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.956083836Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.956288784Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" successfully" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.956295788Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.956153925Z" level=info msg="TearDown network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" successfully" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.956985838Z" level=info msg="StopPodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" returns successfully" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.957239150Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\"" Jan 13 20:43:35.960517 containerd[1551]: 
time="2025-01-13T20:43:35.957276078Z" level=info msg="TearDown network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" successfully" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.957281666Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" returns successfully" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.957486635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:5,}" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.958160022Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.958196504Z" level=info msg="TearDown network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" successfully" Jan 13 20:43:35.960517 containerd[1551]: time="2025-01-13T20:43:35.958202711Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" returns successfully" Jan 13 20:43:35.960981 containerd[1551]: time="2025-01-13T20:43:35.960917944Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:35.960981 containerd[1551]: time="2025-01-13T20:43:35.960973435Z" level=info msg="TearDown network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" successfully" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.960981258Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" returns successfully" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.961716921Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.962786685Z" level=info msg="StopPodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\"" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.963551993Z" level=info msg="Ensure that sandbox 02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08 in task-service has been cleanup successfully" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.966248392Z" level=info msg="TearDown network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" successfully" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.966265596Z" level=info msg="StopPodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" returns successfully" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.966892642Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\"" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.966938242Z" level=info msg="TearDown network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" successfully" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.966944365Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" returns successfully" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.967104209Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" Jan 13 20:43:35.967456 containerd[1551]: time="2025-01-13T20:43:35.967159814Z" level=info msg="TearDown network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" successfully" Jan 13 20:43:35.967456 containerd[1551]: 
time="2025-01-13T20:43:35.967167115Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" returns successfully" Jan 13 20:43:35.967880 kubelet[2877]: I0113 20:43:35.962426 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08" Jan 13 20:43:35.973837 containerd[1551]: time="2025-01-13T20:43:35.973761392Z" level=error msg="Failed to destroy network for sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.975954 containerd[1551]: time="2025-01-13T20:43:35.975856700Z" level=error msg="encountered an error cleaning up failed sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.975954 containerd[1551]: time="2025-01-13T20:43:35.975893324Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.977373 kubelet[2877]: E0113 20:43:35.977351 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.977431 kubelet[2877]: E0113 20:43:35.977402 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:35.977431 kubelet[2877]: E0113 20:43:35.977420 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:35.977478 kubelet[2877]: E0113 20:43:35.977448 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zww27" 
podUID="4bc90cb7-3014-45c2-9619-7765993cb1d0" Jan 13 20:43:35.991960 containerd[1551]: time="2025-01-13T20:43:35.991923862Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:35.992067 containerd[1551]: time="2025-01-13T20:43:35.992017954Z" level=info msg="TearDown network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" successfully" Jan 13 20:43:35.992093 containerd[1551]: time="2025-01-13T20:43:35.992065400Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" returns successfully" Jan 13 20:43:35.992313 containerd[1551]: time="2025-01-13T20:43:35.992291961Z" level=error msg="Failed to destroy network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.992545 containerd[1551]: time="2025-01-13T20:43:35.992525772Z" level=error msg="encountered an error cleaning up failed sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.992589 containerd[1551]: time="2025-01-13T20:43:35.992570296Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Jan 13 20:43:35.992836 kubelet[2877]: E0113 20:43:35.992762 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:35.992836 kubelet[2877]: E0113 20:43:35.992800 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:35.992836 kubelet[2877]: E0113 20:43:35.992817 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:35.992911 kubelet[2877]: E0113 20:43:35.992853 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" podUID="aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8" Jan 13 20:43:35.993522 containerd[1551]: time="2025-01-13T20:43:35.993404744Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:35.993522 containerd[1551]: time="2025-01-13T20:43:35.993462337Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:35.993522 containerd[1551]: time="2025-01-13T20:43:35.993472327Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:35.997139 containerd[1551]: time="2025-01-13T20:43:35.993862077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:5,}" Jan 13 20:43:36.128994 containerd[1551]: time="2025-01-13T20:43:36.128907195Z" level=error msg="Failed to destroy network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.129607 containerd[1551]: time="2025-01-13T20:43:36.129256794Z" level=error msg="encountered an error cleaning up failed sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.129607 containerd[1551]: time="2025-01-13T20:43:36.129292686Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.129676 kubelet[2877]: E0113 20:43:36.129504 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.129676 kubelet[2877]: E0113 20:43:36.129553 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:36.129676 kubelet[2877]: E0113 20:43:36.129568 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:36.129787 kubelet[2877]: E0113 20:43:36.129596 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" Jan 13 20:43:36.143125 containerd[1551]: time="2025-01-13T20:43:36.143093769Z" level=error msg="Failed to destroy network for sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.144214 containerd[1551]: time="2025-01-13T20:43:36.144133707Z" level=error msg="encountered an error cleaning up failed sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.144214 containerd[1551]: time="2025-01-13T20:43:36.144175316Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox 
\"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.145162 containerd[1551]: time="2025-01-13T20:43:36.144337063Z" level=error msg="Failed to destroy network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.145162 containerd[1551]: time="2025-01-13T20:43:36.144537884Z" level=error msg="encountered an error cleaning up failed sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.145162 containerd[1551]: time="2025-01-13T20:43:36.144561292Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.153332 kubelet[2877]: E0113 20:43:36.144504 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.153332 kubelet[2877]: E0113 20:43:36.144541 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:36.153332 kubelet[2877]: E0113 20:43:36.144555 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:36.153431 kubelet[2877]: E0113 20:43:36.144582 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-f9n77" podUID="f4b04d0c-4a38-4068-8730-7ed8bc9346ef" Jan 13 20:43:36.153431 kubelet[2877]: E0113 20:43:36.144641 2877 remote_runtime.go:193] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.153431 kubelet[2877]: E0113 20:43:36.144689 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:36.153805 kubelet[2877]: E0113 20:43:36.144702 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:36.153805 kubelet[2877]: E0113 20:43:36.144914 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" podUID="4d21bbae-e62e-4365-ba57-17611b86a846" Jan 13 20:43:36.153805 kubelet[2877]: E0113 20:43:36.153776 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.153913 containerd[1551]: time="2025-01-13T20:43:36.153439440Z" level=error msg="Failed to destroy network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.153913 containerd[1551]: time="2025-01-13T20:43:36.153632142Z" level=error msg="encountered an error cleaning up failed sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.153913 containerd[1551]: time="2025-01-13T20:43:36.153664936Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jan 13 20:43:36.154619 kubelet[2877]: E0113 20:43:36.153806 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:36.154619 kubelet[2877]: E0113 20:43:36.153818 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:36.154619 kubelet[2877]: E0113 20:43:36.153842 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" podUID="2fafcb48-274d-4a9e-adca-511adb0459f5" Jan 13 20:43:36.444445 systemd[1]: 
run-netns-cni\x2d70c2fbe0\x2d898f\x2d6084\x2d535e\x2d850fc37ce3d6.mount: Deactivated successfully. Jan 13 20:43:36.444750 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08-shm.mount: Deactivated successfully. Jan 13 20:43:36.444869 systemd[1]: run-netns-cni\x2d85bcf931\x2db87a\x2d32b0\x2d311e\x2dc059439421c1.mount: Deactivated successfully. Jan 13 20:43:36.444963 systemd[1]: run-netns-cni\x2d446989a7\x2db8b2\x2d0109\x2dfcaf\x2df72123ab7839.mount: Deactivated successfully. Jan 13 20:43:36.445068 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1386264318.mount: Deactivated successfully. Jan 13 20:43:36.508746 containerd[1551]: time="2025-01-13T20:43:36.508546291Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:36.509041 containerd[1551]: time="2025-01-13T20:43:36.509014886Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 13 20:43:36.555227 containerd[1551]: time="2025-01-13T20:43:36.555181407Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:36.596004 containerd[1551]: time="2025-01-13T20:43:36.595923586Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:36.598512 containerd[1551]: time="2025-01-13T20:43:36.598384414Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 4.8526194s" Jan 13 20:43:36.598512 containerd[1551]: time="2025-01-13T20:43:36.598409236Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 13 20:43:36.837802 containerd[1551]: time="2025-01-13T20:43:36.837623343Z" level=info msg="CreateContainer within sandbox \"923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 20:43:36.928622 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1480042396.mount: Deactivated successfully. Jan 13 20:43:36.942927 containerd[1551]: time="2025-01-13T20:43:36.942887553Z" level=info msg="CreateContainer within sandbox \"923017f51f4a302cb6f0cb67a61eae3be224b125db930c755978551b526b01c8\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"cc7f14a9972e674efbce89d3466cd282b10c6a017ce3de24f64051180dd02a56\"" Jan 13 20:43:36.961858 containerd[1551]: time="2025-01-13T20:43:36.960372825Z" level=info msg="StartContainer for \"cc7f14a9972e674efbce89d3466cd282b10c6a017ce3de24f64051180dd02a56\"" Jan 13 20:43:36.967807 kubelet[2877]: I0113 20:43:36.967787 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e" Jan 13 20:43:36.969563 containerd[1551]: time="2025-01-13T20:43:36.968660175Z" level=info msg="StopPodSandbox for \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\"" Jan 13 20:43:36.969563 containerd[1551]: time="2025-01-13T20:43:36.968835044Z" level=info msg="Ensure that sandbox 7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e in task-service has been cleanup successfully" Jan 13 20:43:36.970919 containerd[1551]: time="2025-01-13T20:43:36.969746127Z" 
level=info msg="TearDown network for sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\" successfully" Jan 13 20:43:36.970919 containerd[1551]: time="2025-01-13T20:43:36.969762056Z" level=info msg="StopPodSandbox for \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\" returns successfully" Jan 13 20:43:36.972690 containerd[1551]: time="2025-01-13T20:43:36.971870897Z" level=info msg="StopPodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\"" Jan 13 20:43:36.972690 containerd[1551]: time="2025-01-13T20:43:36.971929874Z" level=info msg="TearDown network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" successfully" Jan 13 20:43:36.972690 containerd[1551]: time="2025-01-13T20:43:36.971936462Z" level=info msg="StopPodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" returns successfully" Jan 13 20:43:36.973214 containerd[1551]: time="2025-01-13T20:43:36.973170276Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\"" Jan 13 20:43:36.973292 containerd[1551]: time="2025-01-13T20:43:36.973271398Z" level=info msg="TearDown network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" successfully" Jan 13 20:43:36.973336 containerd[1551]: time="2025-01-13T20:43:36.973328878Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" returns successfully" Jan 13 20:43:36.974249 containerd[1551]: time="2025-01-13T20:43:36.974238582Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" Jan 13 20:43:36.974390 containerd[1551]: time="2025-01-13T20:43:36.974314793Z" level=info msg="TearDown network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" successfully" Jan 13 20:43:36.974390 containerd[1551]: time="2025-01-13T20:43:36.974333471Z" 
level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" returns successfully" Jan 13 20:43:36.974745 containerd[1551]: time="2025-01-13T20:43:36.974661700Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:36.974745 containerd[1551]: time="2025-01-13T20:43:36.974703786Z" level=info msg="TearDown network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" successfully" Jan 13 20:43:36.974745 containerd[1551]: time="2025-01-13T20:43:36.974709784Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" returns successfully" Jan 13 20:43:36.975039 containerd[1551]: time="2025-01-13T20:43:36.975029730Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:36.975156 containerd[1551]: time="2025-01-13T20:43:36.975119263Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:36.975156 containerd[1551]: time="2025-01-13T20:43:36.975125964Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns successfully" Jan 13 20:43:36.976328 containerd[1551]: time="2025-01-13T20:43:36.975827905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:6,}" Jan 13 20:43:36.976553 kubelet[2877]: I0113 20:43:36.976542 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596" Jan 13 20:43:36.978748 containerd[1551]: time="2025-01-13T20:43:36.978595353Z" level=info msg="StopPodSandbox for \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\"" Jan 13 20:43:36.978748 containerd[1551]: 
time="2025-01-13T20:43:36.978717228Z" level=info msg="Ensure that sandbox 8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596 in task-service has been cleanup successfully" Jan 13 20:43:36.979783 containerd[1551]: time="2025-01-13T20:43:36.978844166Z" level=info msg="TearDown network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\" successfully" Jan 13 20:43:36.979783 containerd[1551]: time="2025-01-13T20:43:36.978854413Z" level=info msg="StopPodSandbox for \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\" returns successfully" Jan 13 20:43:36.981355 containerd[1551]: time="2025-01-13T20:43:36.981196126Z" level=info msg="StopPodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\"" Jan 13 20:43:36.981355 containerd[1551]: time="2025-01-13T20:43:36.981264048Z" level=info msg="TearDown network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" successfully" Jan 13 20:43:36.981355 containerd[1551]: time="2025-01-13T20:43:36.981275077Z" level=info msg="StopPodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" returns successfully" Jan 13 20:43:36.982266 containerd[1551]: time="2025-01-13T20:43:36.982234938Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\"" Jan 13 20:43:36.982487 containerd[1551]: time="2025-01-13T20:43:36.982408700Z" level=info msg="TearDown network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" successfully" Jan 13 20:43:36.982487 containerd[1551]: time="2025-01-13T20:43:36.982419214Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" returns successfully" Jan 13 20:43:36.982571 containerd[1551]: time="2025-01-13T20:43:36.982550671Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" Jan 13 20:43:36.982626 
containerd[1551]: time="2025-01-13T20:43:36.982599825Z" level=info msg="TearDown network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" successfully" Jan 13 20:43:36.983257 containerd[1551]: time="2025-01-13T20:43:36.983197045Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" returns successfully" Jan 13 20:43:36.985558 containerd[1551]: time="2025-01-13T20:43:36.983932411Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:36.985558 containerd[1551]: time="2025-01-13T20:43:36.983975135Z" level=info msg="TearDown network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" successfully" Jan 13 20:43:36.985558 containerd[1551]: time="2025-01-13T20:43:36.983982183Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" returns successfully" Jan 13 20:43:36.985558 containerd[1551]: time="2025-01-13T20:43:36.984939777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:43:36.985558 containerd[1551]: time="2025-01-13T20:43:36.985043058Z" level=info msg="StopPodSandbox for \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\"" Jan 13 20:43:36.985558 containerd[1551]: time="2025-01-13T20:43:36.985337280Z" level=info msg="Ensure that sandbox 6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af in task-service has been cleanup successfully" Jan 13 20:43:36.985558 containerd[1551]: time="2025-01-13T20:43:36.985494426Z" level=info msg="TearDown network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\" successfully" Jan 13 20:43:36.985558 containerd[1551]: time="2025-01-13T20:43:36.985504101Z" level=info msg="StopPodSandbox for 
\"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\" returns successfully" Jan 13 20:43:36.985726 kubelet[2877]: I0113 20:43:36.984069 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af" Jan 13 20:43:36.986895 containerd[1551]: time="2025-01-13T20:43:36.986789829Z" level=info msg="StopPodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\"" Jan 13 20:43:36.986895 containerd[1551]: time="2025-01-13T20:43:36.986848499Z" level=info msg="TearDown network for sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" successfully" Jan 13 20:43:36.986895 containerd[1551]: time="2025-01-13T20:43:36.986855114Z" level=info msg="StopPodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" returns successfully" Jan 13 20:43:36.987264 containerd[1551]: time="2025-01-13T20:43:36.987253598Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.987401858Z" level=info msg="TearDown network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.987409266Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.987563582Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.987692215Z" level=info msg="TearDown network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.987699716Z" level=info msg="StopPodSandbox 
for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.987867937Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.987901258Z" level=info msg="TearDown network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.987907024Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.988116861Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.988237182Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.988247168Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.988639364Z" level=info msg="StopPodSandbox for \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.988825412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:6,}" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.989854226Z" level=info msg="StopPodSandbox for \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.989955397Z" level=info msg="Ensure that 
sandbox 35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943 in task-service has been cleanup successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990058033Z" level=info msg="TearDown network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990066543Z" level=info msg="StopPodSandbox for \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990268108Z" level=info msg="StopPodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990306285Z" level=info msg="TearDown network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990312536Z" level=info msg="StopPodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990462639Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990496819Z" level=info msg="TearDown network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990502125Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.990692275Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.995177298Z" level=info 
msg="Ensure that sandbox 3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad in task-service has been cleanup successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.995339310Z" level=info msg="TearDown network for sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.995348608Z" level=info msg="StopPodSandbox for \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.996164764Z" level=info msg="StopPodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.996217783Z" level=info msg="TearDown network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.996223903Z" level=info msg="StopPodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.996421906Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.996471747Z" level=info msg="TearDown network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.996479562Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.996708436Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" Jan 13 20:43:36.997766 containerd[1551]: 
time="2025-01-13T20:43:36.996820583Z" level=info msg="TearDown network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.996828461Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" returns successfully" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.997069402Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:36.997766 containerd[1551]: time="2025-01-13T20:43:36.997120272Z" level=info msg="TearDown network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" successfully" Jan 13 20:43:37.004207 kubelet[2877]: I0113 20:43:36.988288 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad" Jan 13 20:43:37.004207 kubelet[2877]: I0113 20:43:36.989660 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943" Jan 13 20:43:37.004207 kubelet[2877]: I0113 20:43:37.001485 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:36.997127454Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:36.997345567Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:36.997391941Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" 
successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:36.997400002Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:36.997691520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:6,}" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.001845493Z" level=info msg="StopPodSandbox for \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\"" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.001966555Z" level=info msg="Ensure that sandbox 61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b in task-service has been cleanup successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002088021Z" level=info msg="TearDown network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\" successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002096356Z" level=info msg="StopPodSandbox for \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002211266Z" level=info msg="StopPodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\"" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002260667Z" level=info msg="TearDown network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002267579Z" level=info msg="StopPodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002379103Z" level=info msg="StopPodSandbox for 
\"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\"" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002413611Z" level=info msg="TearDown network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002433885Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002483610Z" level=info msg="TearDown network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002493273Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002547328Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002583885Z" level=info msg="TearDown network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002606175Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002608675Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002653728Z" level=info msg="TearDown network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.002658715Z" level=info msg="StopPodSandbox for 
\"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.003511971Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.003548026Z" level=info msg="TearDown network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.003553702Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" returns successfully" Jan 13 20:43:37.004271 containerd[1551]: time="2025-01-13T20:43:37.003600208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:43:37.008802 containerd[1551]: time="2025-01-13T20:43:37.004361590Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:37.008802 containerd[1551]: time="2025-01-13T20:43:37.004404525Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:37.008802 containerd[1551]: time="2025-01-13T20:43:37.004411133Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:37.008802 containerd[1551]: time="2025-01-13T20:43:37.004584164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:6,}" Jan 13 20:43:37.085053 systemd[1]: Started cri-containerd-cc7f14a9972e674efbce89d3466cd282b10c6a017ce3de24f64051180dd02a56.scope - libcontainer container 
cc7f14a9972e674efbce89d3466cd282b10c6a017ce3de24f64051180dd02a56. Jan 13 20:43:37.180920 containerd[1551]: time="2025-01-13T20:43:37.180880455Z" level=info msg="StartContainer for \"cc7f14a9972e674efbce89d3466cd282b10c6a017ce3de24f64051180dd02a56\" returns successfully" Jan 13 20:43:37.188060 containerd[1551]: time="2025-01-13T20:43:37.188012778Z" level=error msg="Failed to destroy network for sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.189545 containerd[1551]: time="2025-01-13T20:43:37.189519100Z" level=error msg="encountered an error cleaning up failed sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.189660 containerd[1551]: time="2025-01-13T20:43:37.189647195Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.189987 kubelet[2877]: E0113 20:43:37.189964 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208886 kubelet[2877]: E0113 20:43:37.190082 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:37.208886 kubelet[2877]: E0113 20:43:37.190096 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" Jan 13 20:43:37.208886 kubelet[2877]: E0113 20:43:37.190268 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-b5vf4_calico-apiserver(aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" podUID="aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8" Jan 13 
20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.197447231Z" level=error msg="Failed to destroy network for sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.197535716Z" level=error msg="Failed to destroy network for sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.197690500Z" level=error msg="encountered an error cleaning up failed sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.197953244Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.199661813Z" level=error msg="encountered an error cleaning up failed sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\", marking sandbox state 
as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.199687061Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.205314130Z" level=error msg="Failed to destroy network for sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.205556067Z" level=error msg="encountered an error cleaning up failed sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.208996 containerd[1551]: time="2025-01-13T20:43:37.205979452Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.209161 kubelet[2877]: E0113 20:43:37.198141 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.209161 kubelet[2877]: E0113 20:43:37.198180 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:37.209161 kubelet[2877]: E0113 20:43:37.198206 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zww27" Jan 13 20:43:37.209216 kubelet[2877]: E0113 20:43:37.198239 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zww27_kube-system(4bc90cb7-3014-45c2-9619-7765993cb1d0)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zww27" podUID="4bc90cb7-3014-45c2-9619-7765993cb1d0" Jan 13 20:43:37.209216 kubelet[2877]: E0113 20:43:37.199791 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.209216 kubelet[2877]: E0113 20:43:37.199808 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:37.209296 kubelet[2877]: E0113 20:43:37.199818 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8prdm" Jan 13 20:43:37.209296 kubelet[2877]: E0113 20:43:37.199853 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8prdm_calico-system(3c210180-a974-4da0-9ca1-18a6f94f39f4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8prdm" podUID="3c210180-a974-4da0-9ca1-18a6f94f39f4" Jan 13 20:43:37.209296 kubelet[2877]: E0113 20:43:37.206111 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.209540 kubelet[2877]: E0113 20:43:37.206148 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:37.209540 kubelet[2877]: E0113 20:43:37.206165 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-f9n77" Jan 13 20:43:37.209540 kubelet[2877]: E0113 20:43:37.206192 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-f9n77_kube-system(f4b04d0c-4a38-4068-8730-7ed8bc9346ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-f9n77" podUID="f4b04d0c-4a38-4068-8730-7ed8bc9346ef" Jan 13 20:43:37.221453 containerd[1551]: time="2025-01-13T20:43:37.209476230Z" level=error msg="Failed to destroy network for sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.221453 containerd[1551]: time="2025-01-13T20:43:37.209753913Z" level=error msg="encountered an error cleaning up failed sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.221453 containerd[1551]: time="2025-01-13T20:43:37.209782418Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup 
network for sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.221453 containerd[1551]: time="2025-01-13T20:43:37.217571577Z" level=error msg="Failed to destroy network for sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.221453 containerd[1551]: time="2025-01-13T20:43:37.217785478Z" level=error msg="encountered an error cleaning up failed sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.221453 containerd[1551]: time="2025-01-13T20:43:37.217828183Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.221620 kubelet[2877]: E0113 20:43:37.209880 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.221620 kubelet[2877]: E0113 20:43:37.209921 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:37.221620 kubelet[2877]: E0113 20:43:37.209934 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" Jan 13 20:43:37.221828 kubelet[2877]: E0113 20:43:37.209957 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-595db757fc-h4ptk_calico-apiserver(2fafcb48-274d-4a9e-adca-511adb0459f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" podUID="2fafcb48-274d-4a9e-adca-511adb0459f5" Jan 13 
20:43:37.221828 kubelet[2877]: E0113 20:43:37.217952 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:43:37.221828 kubelet[2877]: E0113 20:43:37.217980 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:37.231798 kubelet[2877]: E0113 20:43:37.218010 2877 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" Jan 13 20:43:37.231798 kubelet[2877]: E0113 20:43:37.218043 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69bd7ff58c-rg9pf_calico-system(4d21bbae-e62e-4365-ba57-17611b86a846)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" podUID="4d21bbae-e62e-4365-ba57-17611b86a846" Jan 13 20:43:37.444991 systemd[1]: run-netns-cni\x2dad915bbe\x2d31f7\x2d0266\x2d7695\x2dc702e7463457.mount: Deactivated successfully. Jan 13 20:43:37.445268 systemd[1]: run-netns-cni\x2d382e77b3\x2d6053\x2d212b\x2d2c13\x2d510cb3bf733a.mount: Deactivated successfully. Jan 13 20:43:37.445307 systemd[1]: run-netns-cni\x2d98d9e2d3\x2dab39\x2d2b9a\x2d0ce7\x2d09c52f6d1164.mount: Deactivated successfully. Jan 13 20:43:37.445339 systemd[1]: run-netns-cni\x2d3bc79253\x2d0ee2\x2df220\x2d3c7c\x2dfb8e94e58313.mount: Deactivated successfully. Jan 13 20:43:37.445372 systemd[1]: run-netns-cni\x2d40d331c3\x2d38b9\x2df13f\x2d1350\x2d5a05bfebb31e.mount: Deactivated successfully. Jan 13 20:43:37.445403 systemd[1]: run-netns-cni\x2df495ceb3\x2d9401\x2d3d79\x2d0144\x2de554daf85ddf.mount: Deactivated successfully. Jan 13 20:43:37.763815 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 13 20:43:37.764193 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jan 13 20:43:38.023509 kubelet[2877]: I0113 20:43:38.023488 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58" Jan 13 20:43:38.026757 containerd[1551]: time="2025-01-13T20:43:38.024459130Z" level=info msg="StopPodSandbox for \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\"" Jan 13 20:43:38.026757 containerd[1551]: time="2025-01-13T20:43:38.024608229Z" level=info msg="Ensure that sandbox 2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58 in task-service has been cleanup successfully" Jan 13 20:43:38.026757 containerd[1551]: time="2025-01-13T20:43:38.024771547Z" level=info msg="TearDown network for sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\" successfully" Jan 13 20:43:38.026757 containerd[1551]: time="2025-01-13T20:43:38.024780194Z" level=info msg="StopPodSandbox for \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\" returns successfully" Jan 13 20:43:38.027142 containerd[1551]: time="2025-01-13T20:43:38.026947330Z" level=info msg="StopPodSandbox for \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\"" Jan 13 20:43:38.027142 containerd[1551]: time="2025-01-13T20:43:38.027010634Z" level=info msg="TearDown network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\" successfully" Jan 13 20:43:38.027142 containerd[1551]: time="2025-01-13T20:43:38.027017881Z" level=info msg="StopPodSandbox for \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\" returns successfully" Jan 13 20:43:38.027291 containerd[1551]: time="2025-01-13T20:43:38.027266922Z" level=info msg="StopPodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\"" Jan 13 20:43:38.027325 containerd[1551]: time="2025-01-13T20:43:38.027307529Z" level=info msg="TearDown network for sandbox 
\"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" successfully" Jan 13 20:43:38.027325 containerd[1551]: time="2025-01-13T20:43:38.027314172Z" level=info msg="StopPodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" returns successfully" Jan 13 20:43:38.027683 containerd[1551]: time="2025-01-13T20:43:38.027533099Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\"" Jan 13 20:43:38.027683 containerd[1551]: time="2025-01-13T20:43:38.027598143Z" level=info msg="TearDown network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" successfully" Jan 13 20:43:38.027683 containerd[1551]: time="2025-01-13T20:43:38.027608234Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" returns successfully" Jan 13 20:43:38.028203 systemd[1]: run-netns-cni\x2dd40ea5e8\x2d5194\x2d9057\x2d7a37\x2dffa7e1356b4f.mount: Deactivated successfully. 
Jan 13 20:43:38.029702 containerd[1551]: time="2025-01-13T20:43:38.028824417Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" Jan 13 20:43:38.030993 containerd[1551]: time="2025-01-13T20:43:38.030839637Z" level=info msg="TearDown network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" successfully" Jan 13 20:43:38.030993 containerd[1551]: time="2025-01-13T20:43:38.030918916Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" returns successfully" Jan 13 20:43:38.031116 containerd[1551]: time="2025-01-13T20:43:38.031098451Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:38.031246 containerd[1551]: time="2025-01-13T20:43:38.031155891Z" level=info msg="TearDown network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" successfully" Jan 13 20:43:38.031246 containerd[1551]: time="2025-01-13T20:43:38.031166362Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" returns successfully" Jan 13 20:43:38.031712 containerd[1551]: time="2025-01-13T20:43:38.031696327Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:38.031840 containerd[1551]: time="2025-01-13T20:43:38.031772791Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:38.031840 containerd[1551]: time="2025-01-13T20:43:38.031782813Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:38.032239 kubelet[2877]: I0113 20:43:38.032221 2877 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2" Jan 13 20:43:38.033212 containerd[1551]: time="2025-01-13T20:43:38.033091455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:7,}" Jan 13 20:43:38.033913 containerd[1551]: time="2025-01-13T20:43:38.033890748Z" level=info msg="StopPodSandbox for \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\"" Jan 13 20:43:38.036481 containerd[1551]: time="2025-01-13T20:43:38.035878139Z" level=info msg="Ensure that sandbox 4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2 in task-service has been cleanup successfully" Jan 13 20:43:38.039991 containerd[1551]: time="2025-01-13T20:43:38.036956231Z" level=info msg="TearDown network for sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\" successfully" Jan 13 20:43:38.039991 containerd[1551]: time="2025-01-13T20:43:38.036978140Z" level=info msg="StopPodSandbox for \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\" returns successfully" Jan 13 20:43:38.040531 containerd[1551]: time="2025-01-13T20:43:38.040502753Z" level=info msg="StopPodSandbox for \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\"" Jan 13 20:43:38.041027 systemd[1]: run-netns-cni\x2d3a6a9a58\x2decf4\x2dbaf4\x2d3a3f\x2d5d6c95170849.mount: Deactivated successfully. 
Jan 13 20:43:38.042313 containerd[1551]: time="2025-01-13T20:43:38.040663177Z" level=info msg="TearDown network for sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\" successfully" Jan 13 20:43:38.042432 containerd[1551]: time="2025-01-13T20:43:38.042416441Z" level=info msg="StopPodSandbox for \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\" returns successfully" Jan 13 20:43:38.043694 containerd[1551]: time="2025-01-13T20:43:38.043677424Z" level=info msg="StopPodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\"" Jan 13 20:43:38.043887 containerd[1551]: time="2025-01-13T20:43:38.043864350Z" level=info msg="TearDown network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" successfully" Jan 13 20:43:38.044996 containerd[1551]: time="2025-01-13T20:43:38.043910318Z" level=info msg="StopPodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" returns successfully" Jan 13 20:43:38.048066 containerd[1551]: time="2025-01-13T20:43:38.048014956Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\"" Jan 13 20:43:38.050372 containerd[1551]: time="2025-01-13T20:43:38.048366161Z" level=info msg="TearDown network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" successfully" Jan 13 20:43:38.050372 containerd[1551]: time="2025-01-13T20:43:38.048388292Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" returns successfully" Jan 13 20:43:38.050372 containerd[1551]: time="2025-01-13T20:43:38.049006613Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" Jan 13 20:43:38.050372 containerd[1551]: time="2025-01-13T20:43:38.049068904Z" level=info msg="TearDown network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" successfully" Jan 
13 20:43:38.050372 containerd[1551]: time="2025-01-13T20:43:38.049086085Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" returns successfully" Jan 13 20:43:38.050372 containerd[1551]: time="2025-01-13T20:43:38.049311391Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:38.050372 containerd[1551]: time="2025-01-13T20:43:38.050150800Z" level=info msg="TearDown network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" successfully" Jan 13 20:43:38.050372 containerd[1551]: time="2025-01-13T20:43:38.050208045Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" returns successfully" Jan 13 20:43:38.050561 kubelet[2877]: I0113 20:43:38.049396 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906" Jan 13 20:43:38.052264 containerd[1551]: time="2025-01-13T20:43:38.051839760Z" level=info msg="StopPodSandbox for \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\"" Jan 13 20:43:38.052264 containerd[1551]: time="2025-01-13T20:43:38.052053247Z" level=info msg="Ensure that sandbox de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906 in task-service has been cleanup successfully" Jan 13 20:43:38.054829 containerd[1551]: time="2025-01-13T20:43:38.054626044Z" level=info msg="TearDown network for sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\" successfully" Jan 13 20:43:38.054829 containerd[1551]: time="2025-01-13T20:43:38.054661049Z" level=info msg="StopPodSandbox for \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\" returns successfully" Jan 13 20:43:38.056633 systemd[1]: run-netns-cni\x2d103adfa6\x2d6825\x2dc10e\x2da02b\x2dc8de3b4861c7.mount: Deactivated successfully. 
Jan 13 20:43:38.061752 containerd[1551]: time="2025-01-13T20:43:38.061622269Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:38.061935 containerd[1551]: time="2025-01-13T20:43:38.061789554Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" successfully" Jan 13 20:43:38.061961 containerd[1551]: time="2025-01-13T20:43:38.061934657Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:38.062001 containerd[1551]: time="2025-01-13T20:43:38.061852265Z" level=info msg="StopPodSandbox for \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\"" Jan 13 20:43:38.062214 containerd[1551]: time="2025-01-13T20:43:38.062196212Z" level=info msg="TearDown network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\" successfully" Jan 13 20:43:38.062243 containerd[1551]: time="2025-01-13T20:43:38.062210862Z" level=info msg="StopPodSandbox for \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\" returns successfully" Jan 13 20:43:38.062762 containerd[1551]: time="2025-01-13T20:43:38.062672445Z" level=info msg="StopPodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\"" Jan 13 20:43:38.062762 containerd[1551]: time="2025-01-13T20:43:38.062716675Z" level=info msg="TearDown network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" successfully" Jan 13 20:43:38.062762 containerd[1551]: time="2025-01-13T20:43:38.062723257Z" level=info msg="StopPodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" returns successfully" Jan 13 20:43:38.062850 containerd[1551]: time="2025-01-13T20:43:38.062814721Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:7,}" Jan 13 20:43:38.068462 containerd[1551]: time="2025-01-13T20:43:38.068094495Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\"" Jan 13 20:43:38.068462 containerd[1551]: time="2025-01-13T20:43:38.068149114Z" level=info msg="TearDown network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" successfully" Jan 13 20:43:38.068462 containerd[1551]: time="2025-01-13T20:43:38.068155534Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" returns successfully" Jan 13 20:43:38.069354 containerd[1551]: time="2025-01-13T20:43:38.068584834Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" Jan 13 20:43:38.069354 containerd[1551]: time="2025-01-13T20:43:38.068635990Z" level=info msg="TearDown network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" successfully" Jan 13 20:43:38.069354 containerd[1551]: time="2025-01-13T20:43:38.068643270Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" returns successfully" Jan 13 20:43:38.069354 containerd[1551]: time="2025-01-13T20:43:38.069075406Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:38.069354 containerd[1551]: time="2025-01-13T20:43:38.069116202Z" level=info msg="TearDown network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" successfully" Jan 13 20:43:38.069354 containerd[1551]: time="2025-01-13T20:43:38.069143448Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" returns successfully" Jan 13 20:43:38.070919 containerd[1551]: time="2025-01-13T20:43:38.070329051Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:43:38.070986 kubelet[2877]: I0113 20:43:38.070569 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8" Jan 13 20:43:38.071063 containerd[1551]: time="2025-01-13T20:43:38.071046487Z" level=info msg="StopPodSandbox for \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\"" Jan 13 20:43:38.071191 containerd[1551]: time="2025-01-13T20:43:38.071175532Z" level=info msg="Ensure that sandbox 69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8 in task-service has been cleanup successfully" Jan 13 20:43:38.071485 containerd[1551]: time="2025-01-13T20:43:38.071470054Z" level=info msg="TearDown network for sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\" successfully" Jan 13 20:43:38.071516 containerd[1551]: time="2025-01-13T20:43:38.071480989Z" level=info msg="StopPodSandbox for \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\" returns successfully" Jan 13 20:43:38.071897 containerd[1551]: time="2025-01-13T20:43:38.071627251Z" level=info msg="StopPodSandbox for \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\"" Jan 13 20:43:38.071897 containerd[1551]: time="2025-01-13T20:43:38.071675680Z" level=info msg="TearDown network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\" successfully" Jan 13 20:43:38.071897 containerd[1551]: time="2025-01-13T20:43:38.071681499Z" level=info msg="StopPodSandbox for \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\" returns successfully" Jan 13 20:43:38.072454 containerd[1551]: time="2025-01-13T20:43:38.072136877Z" level=info msg="StopPodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\"" Jan 13 
20:43:38.072454 containerd[1551]: time="2025-01-13T20:43:38.072179154Z" level=info msg="TearDown network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" successfully" Jan 13 20:43:38.072454 containerd[1551]: time="2025-01-13T20:43:38.072185658Z" level=info msg="StopPodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" returns successfully" Jan 13 20:43:38.072951 containerd[1551]: time="2025-01-13T20:43:38.072745465Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\"" Jan 13 20:43:38.072951 containerd[1551]: time="2025-01-13T20:43:38.072790012Z" level=info msg="TearDown network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" successfully" Jan 13 20:43:38.072951 containerd[1551]: time="2025-01-13T20:43:38.072796663Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" returns successfully" Jan 13 20:43:38.073195 containerd[1551]: time="2025-01-13T20:43:38.073138549Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" Jan 13 20:43:38.073195 containerd[1551]: time="2025-01-13T20:43:38.073178194Z" level=info msg="TearDown network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" successfully" Jan 13 20:43:38.073300 containerd[1551]: time="2025-01-13T20:43:38.073260238Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" returns successfully" Jan 13 20:43:38.074234 containerd[1551]: time="2025-01-13T20:43:38.074144559Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:38.074234 containerd[1551]: time="2025-01-13T20:43:38.074196001Z" level=info msg="TearDown network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" successfully" Jan 13 
20:43:38.074234 containerd[1551]: time="2025-01-13T20:43:38.074205323Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" returns successfully" Jan 13 20:43:38.076702 containerd[1551]: time="2025-01-13T20:43:38.074874170Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:38.076702 containerd[1551]: time="2025-01-13T20:43:38.074929294Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:38.076702 containerd[1551]: time="2025-01-13T20:43:38.074938609Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:38.076702 containerd[1551]: time="2025-01-13T20:43:38.075676232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:7,}" Jan 13 20:43:38.076702 containerd[1551]: time="2025-01-13T20:43:38.075846800Z" level=info msg="StopPodSandbox for \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\"" Jan 13 20:43:38.076702 containerd[1551]: time="2025-01-13T20:43:38.076433040Z" level=info msg="Ensure that sandbox 76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3 in task-service has been cleanup successfully" Jan 13 20:43:38.077569 kubelet[2877]: I0113 20:43:38.074932 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3" Jan 13 20:43:38.077569 kubelet[2877]: I0113 20:43:38.077355 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f" Jan 13 20:43:38.077783 containerd[1551]: time="2025-01-13T20:43:38.076829259Z" level=info msg="TearDown 
network for sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\" successfully" Jan 13 20:43:38.077783 containerd[1551]: time="2025-01-13T20:43:38.076838344Z" level=info msg="StopPodSandbox for \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\" returns successfully" Jan 13 20:43:38.077783 containerd[1551]: time="2025-01-13T20:43:38.077113371Z" level=info msg="StopPodSandbox for \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\"" Jan 13 20:43:38.077783 containerd[1551]: time="2025-01-13T20:43:38.077260208Z" level=info msg="TearDown network for sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\" successfully" Jan 13 20:43:38.077783 containerd[1551]: time="2025-01-13T20:43:38.077271563Z" level=info msg="StopPodSandbox for \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\" returns successfully" Jan 13 20:43:38.077783 containerd[1551]: time="2025-01-13T20:43:38.077752116Z" level=info msg="StopPodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\"" Jan 13 20:43:38.077911 containerd[1551]: time="2025-01-13T20:43:38.077800879Z" level=info msg="TearDown network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" successfully" Jan 13 20:43:38.077911 containerd[1551]: time="2025-01-13T20:43:38.077807484Z" level=info msg="StopPodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" returns successfully" Jan 13 20:43:38.077911 containerd[1551]: time="2025-01-13T20:43:38.077836242Z" level=info msg="StopPodSandbox for \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\"" Jan 13 20:43:38.077971 containerd[1551]: time="2025-01-13T20:43:38.077931589Z" level=info msg="Ensure that sandbox 2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f in task-service has been cleanup successfully" Jan 13 20:43:38.078231 containerd[1551]: time="2025-01-13T20:43:38.078040803Z" 
level=info msg="TearDown network for sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\" successfully" Jan 13 20:43:38.078231 containerd[1551]: time="2025-01-13T20:43:38.078057066Z" level=info msg="StopPodSandbox for \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\" returns successfully" Jan 13 20:43:38.078231 containerd[1551]: time="2025-01-13T20:43:38.078195052Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\"" Jan 13 20:43:38.078384 containerd[1551]: time="2025-01-13T20:43:38.078239762Z" level=info msg="TearDown network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" successfully" Jan 13 20:43:38.078384 containerd[1551]: time="2025-01-13T20:43:38.078246738Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" returns successfully" Jan 13 20:43:38.078384 containerd[1551]: time="2025-01-13T20:43:38.078284109Z" level=info msg="StopPodSandbox for \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\"" Jan 13 20:43:38.078384 containerd[1551]: time="2025-01-13T20:43:38.078314012Z" level=info msg="TearDown network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\" successfully" Jan 13 20:43:38.078384 containerd[1551]: time="2025-01-13T20:43:38.078318833Z" level=info msg="StopPodSandbox for \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\" returns successfully" Jan 13 20:43:38.078650 containerd[1551]: time="2025-01-13T20:43:38.078444350Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" Jan 13 20:43:38.078650 containerd[1551]: time="2025-01-13T20:43:38.078461159Z" level=info msg="StopPodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\"" Jan 13 20:43:38.078650 containerd[1551]: time="2025-01-13T20:43:38.078479360Z" level=info msg="TearDown 
network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" successfully" Jan 13 20:43:38.078650 containerd[1551]: time="2025-01-13T20:43:38.078484527Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" returns successfully" Jan 13 20:43:38.078650 containerd[1551]: time="2025-01-13T20:43:38.078521091Z" level=info msg="TearDown network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" successfully" Jan 13 20:43:38.078650 containerd[1551]: time="2025-01-13T20:43:38.078528416Z" level=info msg="StopPodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" returns successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.078710387Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\"" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.078756732Z" level=info msg="TearDown network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.078764748Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" returns successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.078710950Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.078807036Z" level=info msg="TearDown network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.078830351Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" returns successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.078955080Z" 
level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.078966439Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.079002838Z" level=info msg="TearDown network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.079008928Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" returns successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.079016908Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.079024684Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.079247329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:7,}" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.079268703Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.079310833Z" level=info msg="TearDown network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" successfully" Jan 13 20:43:38.081925 containerd[1551]: time="2025-01-13T20:43:38.079329432Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" returns successfully" Jan 13 20:43:38.081925 containerd[1551]: 
time="2025-01-13T20:43:38.079528656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:43:38.150137 kubelet[2877]: I0113 20:43:38.138574 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6zt9t" podStartSLOduration=2.688551876 podStartE2EDuration="18.057962159s" podCreationTimestamp="2025-01-13 20:43:20 +0000 UTC" firstStartedPulling="2025-01-13 20:43:21.368979901 +0000 UTC m=+24.856021833" lastFinishedPulling="2025-01-13 20:43:36.738390183 +0000 UTC m=+40.225432116" observedRunningTime="2025-01-13 20:43:38.057502852 +0000 UTC m=+41.544544794" watchObservedRunningTime="2025-01-13 20:43:38.057962159 +0000 UTC m=+41.545004096" Jan 13 20:43:38.453852 systemd[1]: run-netns-cni\x2da4f7390b\x2dc6cc\x2d2d98\x2deacd\x2d017304242e56.mount: Deactivated successfully. Jan 13 20:43:38.454001 systemd[1]: run-netns-cni\x2dbd6deb4e\x2dad92\x2d1854\x2d20f1\x2d0cf0bebd511f.mount: Deactivated successfully. Jan 13 20:43:38.454040 systemd[1]: run-netns-cni\x2d859517e0\x2df9a2\x2de55c\x2d3081\x2d77467ec72edf.mount: Deactivated successfully. 
Jan 13 20:43:38.509190 kubelet[2877]: I0113 20:43:38.509148 2877 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:43:38.816164 systemd-networkd[1475]: cali0691d0ef5f7: Link UP Jan 13 20:43:38.816684 systemd-networkd[1475]: cali0691d0ef5f7: Gained carrier Jan 13 20:43:38.816860 systemd-networkd[1475]: calif9bdb80f7b4: Link UP Jan 13 20:43:38.817260 systemd-networkd[1475]: calif9bdb80f7b4: Gained carrier Jan 13 20:43:38.819403 systemd-networkd[1475]: cali231f6bf509d: Link UP Jan 13 20:43:38.820806 systemd-networkd[1475]: cali231f6bf509d: Gained carrier Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.213 [INFO][4888] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.227 [INFO][4888] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0 calico-kube-controllers-69bd7ff58c- calico-system 4d21bbae-e62e-4365-ba57-17611b86a846 714 0 2025-01-13 20:43:21 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:69bd7ff58c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-69bd7ff58c-rg9pf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0691d0ef5f7 [] []}} ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Namespace="calico-system" Pod="calico-kube-controllers-69bd7ff58c-rg9pf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.227 [INFO][4888] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" 
Namespace="calico-system" Pod="calico-kube-controllers-69bd7ff58c-rg9pf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.672 [INFO][4938] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" HandleID="k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Workload="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.697 [INFO][4938] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" HandleID="k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Workload="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c4c60), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-69bd7ff58c-rg9pf", "timestamp":"2025-01-13 20:43:38.672604271 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.697 [INFO][4938] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.697 [INFO][4938] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.697 [INFO][4938] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.700 [INFO][4938] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.724 [INFO][4938] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.727 [INFO][4938] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.731 [INFO][4938] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.732 [INFO][4938] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.732 [INFO][4938] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.733 [INFO][4938] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.737 [INFO][4938] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.741 [INFO][4938] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.741 [INFO][4938] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" host="localhost" Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.741 [INFO][4938] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:43:38.832103 containerd[1551]: 2025-01-13 20:43:38.741 [INFO][4938] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" HandleID="k8s-pod-network.318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Workload="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" Jan 13 20:43:38.832643 containerd[1551]: 2025-01-13 20:43:38.744 [INFO][4888] cni-plugin/k8s.go 386: Populated endpoint ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Namespace="calico-system" Pod="calico-kube-controllers-69bd7ff58c-rg9pf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0", GenerateName:"calico-kube-controllers-69bd7ff58c-", Namespace:"calico-system", SelfLink:"", UID:"4d21bbae-e62e-4365-ba57-17611b86a846", ResourceVersion:"714", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69bd7ff58c", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-69bd7ff58c-rg9pf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0691d0ef5f7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:38.832643 containerd[1551]: 2025-01-13 20:43:38.744 [INFO][4888] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Namespace="calico-system" Pod="calico-kube-controllers-69bd7ff58c-rg9pf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" Jan 13 20:43:38.832643 containerd[1551]: 2025-01-13 20:43:38.744 [INFO][4888] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0691d0ef5f7 ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Namespace="calico-system" Pod="calico-kube-controllers-69bd7ff58c-rg9pf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" Jan 13 20:43:38.832643 containerd[1551]: 2025-01-13 20:43:38.811 [INFO][4888] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Namespace="calico-system" Pod="calico-kube-controllers-69bd7ff58c-rg9pf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" Jan 13 20:43:38.832643 containerd[1551]: 2025-01-13 20:43:38.813 [INFO][4888] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Namespace="calico-system" Pod="calico-kube-controllers-69bd7ff58c-rg9pf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0", GenerateName:"calico-kube-controllers-69bd7ff58c-", Namespace:"calico-system", SelfLink:"", UID:"4d21bbae-e62e-4365-ba57-17611b86a846", ResourceVersion:"714", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69bd7ff58c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba", Pod:"calico-kube-controllers-69bd7ff58c-rg9pf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0691d0ef5f7", MAC:"6a:9b:8e:cc:3a:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:38.832643 containerd[1551]: 2025-01-13 20:43:38.829 [INFO][4888] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba" Namespace="calico-system" Pod="calico-kube-controllers-69bd7ff58c-rg9pf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--69bd7ff58c--rg9pf-eth0" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.234 [INFO][4883] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.256 [INFO][4883] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0 calico-apiserver-595db757fc- calico-apiserver 2fafcb48-274d-4a9e-adca-511adb0459f5 712 0 2025-01-13 20:43:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:595db757fc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-595db757fc-h4ptk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali231f6bf509d [] []}} ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-h4ptk" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--h4ptk-" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.256 [INFO][4883] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-h4ptk" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.672 [INFO][4943] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" 
HandleID="k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Workload="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.698 [INFO][4943] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" HandleID="k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Workload="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000515b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-595db757fc-h4ptk", "timestamp":"2025-01-13 20:43:38.672713996 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.698 [INFO][4943] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.775 [INFO][4943] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.775 [INFO][4943] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.777 [INFO][4943] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.780 [INFO][4943] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.784 [INFO][4943] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.793 [INFO][4943] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.795 [INFO][4943] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.795 [INFO][4943] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.796 [INFO][4943] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.799 [INFO][4943] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.803 [INFO][4943] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.803 [INFO][4943] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" host="localhost" Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.803 [INFO][4943] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:43:38.843594 containerd[1551]: 2025-01-13 20:43:38.803 [INFO][4943] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" HandleID="k8s-pod-network.f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Workload="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" Jan 13 20:43:38.845493 containerd[1551]: 2025-01-13 20:43:38.813 [INFO][4883] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-h4ptk" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0", GenerateName:"calico-apiserver-595db757fc-", Namespace:"calico-apiserver", SelfLink:"", UID:"2fafcb48-274d-4a9e-adca-511adb0459f5", ResourceVersion:"712", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"595db757fc", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-595db757fc-h4ptk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali231f6bf509d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:38.845493 containerd[1551]: 2025-01-13 20:43:38.814 [INFO][4883] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-h4ptk" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" Jan 13 20:43:38.845493 containerd[1551]: 2025-01-13 20:43:38.814 [INFO][4883] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali231f6bf509d ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-h4ptk" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" Jan 13 20:43:38.845493 containerd[1551]: 2025-01-13 20:43:38.822 [INFO][4883] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-h4ptk" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" Jan 13 20:43:38.845493 containerd[1551]: 2025-01-13 20:43:38.823 [INFO][4883] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-h4ptk" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0", GenerateName:"calico-apiserver-595db757fc-", Namespace:"calico-apiserver", SelfLink:"", UID:"2fafcb48-274d-4a9e-adca-511adb0459f5", ResourceVersion:"712", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"595db757fc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b", Pod:"calico-apiserver-595db757fc-h4ptk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali231f6bf509d", MAC:"06:1c:da:07:fe:6c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:38.845493 containerd[1551]: 2025-01-13 20:43:38.837 [INFO][4883] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-h4ptk" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--h4ptk-eth0" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.232 [INFO][4912] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.261 [INFO][4912] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0 calico-apiserver-595db757fc- calico-apiserver aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8 713 0 2025-01-13 20:43:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:595db757fc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-595db757fc-b5vf4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif9bdb80f7b4 [] []}} ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-b5vf4" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--b5vf4-" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.261 [INFO][4912] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-b5vf4" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.672 [INFO][4944] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" 
HandleID="k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Workload="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.697 [INFO][4944] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" HandleID="k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Workload="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000399490), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-595db757fc-b5vf4", "timestamp":"2025-01-13 20:43:38.672862302 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.697 [INFO][4944] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.741 [INFO][4944] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.741 [INFO][4944] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.744 [INFO][4944] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.747 [INFO][4944] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.750 [INFO][4944] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.751 [INFO][4944] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.752 [INFO][4944] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.752 [INFO][4944] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.753 [INFO][4944] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.761 [INFO][4944] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.774 [INFO][4944] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.774 [INFO][4944] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" host="localhost" Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.774 [INFO][4944] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:43:38.850898 containerd[1551]: 2025-01-13 20:43:38.775 [INFO][4944] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" HandleID="k8s-pod-network.1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Workload="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" Jan 13 20:43:38.852686 containerd[1551]: 2025-01-13 20:43:38.777 [INFO][4912] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-b5vf4" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0", GenerateName:"calico-apiserver-595db757fc-", Namespace:"calico-apiserver", SelfLink:"", UID:"aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"595db757fc", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-595db757fc-b5vf4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9bdb80f7b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:38.852686 containerd[1551]: 2025-01-13 20:43:38.777 [INFO][4912] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-b5vf4" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" Jan 13 20:43:38.852686 containerd[1551]: 2025-01-13 20:43:38.777 [INFO][4912] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9bdb80f7b4 ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-b5vf4" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" Jan 13 20:43:38.852686 containerd[1551]: 2025-01-13 20:43:38.812 [INFO][4912] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-b5vf4" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" Jan 13 20:43:38.852686 containerd[1551]: 2025-01-13 20:43:38.814 [INFO][4912] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-b5vf4" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0", GenerateName:"calico-apiserver-595db757fc-", Namespace:"calico-apiserver", SelfLink:"", UID:"aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"595db757fc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d", Pod:"calico-apiserver-595db757fc-b5vf4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9bdb80f7b4", MAC:"5e:35:2f:37:30:12", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:38.852686 containerd[1551]: 2025-01-13 20:43:38.841 [INFO][4912] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d" Namespace="calico-apiserver" Pod="calico-apiserver-595db757fc-b5vf4" WorkloadEndpoint="localhost-k8s-calico--apiserver--595db757fc--b5vf4-eth0" Jan 13 20:43:38.876271 systemd-networkd[1475]: calib8ed82489d8: Link UP Jan 13 20:43:38.877308 systemd-networkd[1475]: calib8ed82489d8: Gained carrier Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.203 [INFO][4873] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.229 [INFO][4873] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0 coredns-7db6d8ff4d- kube-system f4b04d0c-4a38-4068-8730-7ed8bc9346ef 707 0 2025-01-13 20:43:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-f9n77 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib8ed82489d8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-f9n77" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--f9n77-" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.229 [INFO][4873] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-f9n77" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.672 [INFO][4936] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" 
HandleID="k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Workload="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.698 [INFO][4936] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" HandleID="k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Workload="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000398b50), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-f9n77", "timestamp":"2025-01-13 20:43:38.672808464 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.698 [INFO][4936] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.803 [INFO][4936] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.803 [INFO][4936] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.805 [INFO][4936] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.820 [INFO][4936] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.843 [INFO][4936] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.845 [INFO][4936] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.848 [INFO][4936] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.848 [INFO][4936] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.849 [INFO][4936] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2 Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.858 [INFO][4936] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.869 [INFO][4936] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.869 [INFO][4936] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" host="localhost" Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.869 [INFO][4936] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:43:38.901492 containerd[1551]: 2025-01-13 20:43:38.869 [INFO][4936] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" HandleID="k8s-pod-network.0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Workload="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" Jan 13 20:43:38.904482 containerd[1551]: 2025-01-13 20:43:38.872 [INFO][4873] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-f9n77" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f4b04d0c-4a38-4068-8730-7ed8bc9346ef", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-f9n77", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8ed82489d8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:38.904482 containerd[1551]: 2025-01-13 20:43:38.872 [INFO][4873] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-f9n77" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" Jan 13 20:43:38.904482 containerd[1551]: 2025-01-13 20:43:38.872 [INFO][4873] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib8ed82489d8 ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-f9n77" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" Jan 13 20:43:38.904482 containerd[1551]: 2025-01-13 20:43:38.878 [INFO][4873] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-f9n77" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" Jan 13 
20:43:38.904482 containerd[1551]: 2025-01-13 20:43:38.878 [INFO][4873] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-f9n77" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f4b04d0c-4a38-4068-8730-7ed8bc9346ef", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2", Pod:"coredns-7db6d8ff4d-f9n77", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8ed82489d8", MAC:"66:aa:0d:0b:3e:95", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:38.904482 containerd[1551]: 2025-01-13 20:43:38.899 [INFO][4873] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-f9n77" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--f9n77-eth0" Jan 13 20:43:38.957114 containerd[1551]: time="2025-01-13T20:43:38.957007946Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:43:38.960787 containerd[1551]: time="2025-01-13T20:43:38.958077931Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:43:38.960787 containerd[1551]: time="2025-01-13T20:43:38.958316475Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:38.960787 containerd[1551]: time="2025-01-13T20:43:38.958490338Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:38.966198 containerd[1551]: time="2025-01-13T20:43:38.965345666Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:43:38.966198 containerd[1551]: time="2025-01-13T20:43:38.965395910Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:43:38.966198 containerd[1551]: time="2025-01-13T20:43:38.965406441Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:38.966198 containerd[1551]: time="2025-01-13T20:43:38.965471118Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:38.968759 containerd[1551]: time="2025-01-13T20:43:38.966586131Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:43:38.968759 containerd[1551]: time="2025-01-13T20:43:38.966616005Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:43:38.968759 containerd[1551]: time="2025-01-13T20:43:38.966622993Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:38.968759 containerd[1551]: time="2025-01-13T20:43:38.967292785Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:38.989210 systemd-networkd[1475]: cali60ed9622122: Link UP Jan 13 20:43:38.991065 systemd-networkd[1475]: cali60ed9622122: Gained carrier Jan 13 20:43:38.998915 systemd[1]: Started cri-containerd-1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d.scope - libcontainer container 1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d. Jan 13 20:43:39.006702 containerd[1551]: time="2025-01-13T20:43:39.005049230Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:43:39.006702 containerd[1551]: time="2025-01-13T20:43:39.005095938Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:43:39.006702 containerd[1551]: time="2025-01-13T20:43:39.005104640Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:39.006702 containerd[1551]: time="2025-01-13T20:43:39.005160512Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.248 [INFO][4899] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.262 [INFO][4899] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--zww27-eth0 coredns-7db6d8ff4d- kube-system 4bc90cb7-3014-45c2-9619-7765993cb1d0 710 0 2025-01-13 20:43:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-zww27 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali60ed9622122 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zww27" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zww27-" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.263 [INFO][4899] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zww27" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.672 [INFO][4946] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" HandleID="k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Workload="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.698 [INFO][4946] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" HandleID="k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Workload="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030d980), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-zww27", "timestamp":"2025-01-13 20:43:38.672610714 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.698 [INFO][4946] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.869 [INFO][4946] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.870 [INFO][4946] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.872 [INFO][4946] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.876 [INFO][4946] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.881 [INFO][4946] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.882 [INFO][4946] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.885 [INFO][4946] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.885 [INFO][4946] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.886 [INFO][4946] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5 Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.904 [INFO][4946] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.934 [INFO][4946] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.934 [INFO][4946] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" host="localhost" Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.934 [INFO][4946] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:43:39.016148 containerd[1551]: 2025-01-13 20:43:38.934 [INFO][4946] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" HandleID="k8s-pod-network.9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Workload="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" Jan 13 20:43:39.017529 containerd[1551]: 2025-01-13 20:43:38.947 [INFO][4899] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zww27" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--zww27-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"4bc90cb7-3014-45c2-9619-7765993cb1d0", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-zww27", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali60ed9622122", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:39.017529 containerd[1551]: 2025-01-13 20:43:38.951 [INFO][4899] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zww27" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" Jan 13 20:43:39.017529 containerd[1551]: 2025-01-13 20:43:38.972 [INFO][4899] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60ed9622122 ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zww27" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" Jan 13 20:43:39.017529 containerd[1551]: 2025-01-13 20:43:38.994 [INFO][4899] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zww27" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" Jan 13 
20:43:39.017529 containerd[1551]: 2025-01-13 20:43:38.995 [INFO][4899] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zww27" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--zww27-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"4bc90cb7-3014-45c2-9619-7765993cb1d0", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5", Pod:"coredns-7db6d8ff4d-zww27", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali60ed9622122", MAC:"06:56:28:c8:0d:20", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:39.017529 containerd[1551]: 2025-01-13 20:43:39.010 [INFO][4899] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zww27" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zww27-eth0" Jan 13 20:43:39.019882 systemd[1]: Started cri-containerd-0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2.scope - libcontainer container 0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2. Jan 13 20:43:39.022916 systemd[1]: Started cri-containerd-f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b.scope - libcontainer container f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b. Jan 13 20:43:39.038129 systemd[1]: Started cri-containerd-318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba.scope - libcontainer container 318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba. 
Jan 13 20:43:39.041283 systemd-networkd[1475]: cali6ab164a1921: Link UP Jan 13 20:43:39.042775 systemd-networkd[1475]: cali6ab164a1921: Gained carrier Jan 13 20:43:39.049078 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.095 [INFO][4862] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.228 [INFO][4862] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--8prdm-eth0 csi-node-driver- calico-system 3c210180-a974-4da0-9ca1-18a6f94f39f4 630 0 2025-01-13 20:43:21 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-8prdm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6ab164a1921 [] []}} ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Namespace="calico-system" Pod="csi-node-driver-8prdm" WorkloadEndpoint="localhost-k8s-csi--node--driver--8prdm-" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.229 [INFO][4862] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Namespace="calico-system" Pod="csi-node-driver-8prdm" WorkloadEndpoint="localhost-k8s-csi--node--driver--8prdm-eth0" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.672 [INFO][4937] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" 
HandleID="k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Workload="localhost-k8s-csi--node--driver--8prdm-eth0" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.698 [INFO][4937] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" HandleID="k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Workload="localhost-k8s-csi--node--driver--8prdm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001039a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-8prdm", "timestamp":"2025-01-13 20:43:38.672710613 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.698 [INFO][4937] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.936 [INFO][4937] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.936 [INFO][4937] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.945 [INFO][4937] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.961 [INFO][4937] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.990 [INFO][4937] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:38.993 [INFO][4937] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:39.000 [INFO][4937] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:39.000 [INFO][4937] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:39.003 [INFO][4937] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32 Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:39.009 [INFO][4937] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:39.019 [INFO][4937] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:39.019 [INFO][4937] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" host="localhost" Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:39.024 [INFO][4937] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:43:39.062362 containerd[1551]: 2025-01-13 20:43:39.024 [INFO][4937] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" HandleID="k8s-pod-network.4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Workload="localhost-k8s-csi--node--driver--8prdm-eth0" Jan 13 20:43:39.063186 containerd[1551]: 2025-01-13 20:43:39.030 [INFO][4862] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Namespace="calico-system" Pod="csi-node-driver-8prdm" WorkloadEndpoint="localhost-k8s-csi--node--driver--8prdm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--8prdm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3c210180-a974-4da0-9ca1-18a6f94f39f4", ResourceVersion:"630", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-8prdm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6ab164a1921", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:39.063186 containerd[1551]: 2025-01-13 20:43:39.032 [INFO][4862] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Namespace="calico-system" Pod="csi-node-driver-8prdm" WorkloadEndpoint="localhost-k8s-csi--node--driver--8prdm-eth0" Jan 13 20:43:39.063186 containerd[1551]: 2025-01-13 20:43:39.034 [INFO][4862] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6ab164a1921 ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Namespace="calico-system" Pod="csi-node-driver-8prdm" WorkloadEndpoint="localhost-k8s-csi--node--driver--8prdm-eth0" Jan 13 20:43:39.063186 containerd[1551]: 2025-01-13 20:43:39.041 [INFO][4862] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Namespace="calico-system" Pod="csi-node-driver-8prdm" WorkloadEndpoint="localhost-k8s-csi--node--driver--8prdm-eth0" Jan 13 20:43:39.063186 containerd[1551]: 2025-01-13 20:43:39.045 [INFO][4862] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Namespace="calico-system" 
Pod="csi-node-driver-8prdm" WorkloadEndpoint="localhost-k8s-csi--node--driver--8prdm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--8prdm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3c210180-a974-4da0-9ca1-18a6f94f39f4", ResourceVersion:"630", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 43, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32", Pod:"csi-node-driver-8prdm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6ab164a1921", MAC:"76:03:56:27:fb:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:43:39.063186 containerd[1551]: 2025-01-13 20:43:39.060 [INFO][4862] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32" Namespace="calico-system" Pod="csi-node-driver-8prdm" WorkloadEndpoint="localhost-k8s-csi--node--driver--8prdm-eth0" Jan 13 20:43:39.073120 
systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:43:39.087122 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:43:39.088973 containerd[1551]: time="2025-01-13T20:43:39.088519223Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:43:39.088973 containerd[1551]: time="2025-01-13T20:43:39.088813095Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:43:39.088973 containerd[1551]: time="2025-01-13T20:43:39.088845427Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:39.088973 containerd[1551]: time="2025-01-13T20:43:39.088919914Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:39.104549 containerd[1551]: time="2025-01-13T20:43:39.104517619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-f9n77,Uid:f4b04d0c-4a38-4068-8730-7ed8bc9346ef,Namespace:kube-system,Attempt:7,} returns sandbox id \"0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2\"" Jan 13 20:43:39.105093 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:43:39.109458 containerd[1551]: time="2025-01-13T20:43:39.109262662Z" level=info msg="CreateContainer within sandbox \"0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:43:39.137467 containerd[1551]: time="2025-01-13T20:43:39.136821289Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:43:39.137467 containerd[1551]: time="2025-01-13T20:43:39.136860863Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:43:39.137467 containerd[1551]: time="2025-01-13T20:43:39.137397071Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:39.137755 containerd[1551]: time="2025-01-13T20:43:39.137707748Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:43:39.137914 systemd[1]: Started cri-containerd-9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5.scope - libcontainer container 9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5. Jan 13 20:43:39.161577 containerd[1551]: time="2025-01-13T20:43:39.161508747Z" level=info msg="CreateContainer within sandbox \"0e430002ca4193c4b3b1bd798a07c17705333c11b9a968ba05481c759aaab9a2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ce3d97e3c3e0747de2288b515e6bf5b0764cbcedb5e1eff392af9b7b16e5da25\"" Jan 13 20:43:39.163603 containerd[1551]: time="2025-01-13T20:43:39.162858168Z" level=info msg="StartContainer for \"ce3d97e3c3e0747de2288b515e6bf5b0764cbcedb5e1eff392af9b7b16e5da25\"" Jan 13 20:43:39.182107 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:43:39.187201 systemd[1]: Started cri-containerd-4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32.scope - libcontainer container 4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32. 
Jan 13 20:43:39.191597 containerd[1551]: time="2025-01-13T20:43:39.191567593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-h4ptk,Uid:2fafcb48-274d-4a9e-adca-511adb0459f5,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b\"" Jan 13 20:43:39.193476 containerd[1551]: time="2025-01-13T20:43:39.193456938Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:43:39.196672 containerd[1551]: time="2025-01-13T20:43:39.196613921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69bd7ff58c-rg9pf,Uid:4d21bbae-e62e-4365-ba57-17611b86a846,Namespace:calico-system,Attempt:7,} returns sandbox id \"318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba\"" Jan 13 20:43:39.226889 systemd[1]: Started cri-containerd-ce3d97e3c3e0747de2288b515e6bf5b0764cbcedb5e1eff392af9b7b16e5da25.scope - libcontainer container ce3d97e3c3e0747de2288b515e6bf5b0764cbcedb5e1eff392af9b7b16e5da25. 
Jan 13 20:43:39.265061 containerd[1551]: time="2025-01-13T20:43:39.264996719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zww27,Uid:4bc90cb7-3014-45c2-9619-7765993cb1d0,Namespace:kube-system,Attempt:7,} returns sandbox id \"9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5\"" Jan 13 20:43:39.269891 containerd[1551]: time="2025-01-13T20:43:39.269792063Z" level=info msg="CreateContainer within sandbox \"9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:43:39.301055 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:43:39.321936 containerd[1551]: time="2025-01-13T20:43:39.321669544Z" level=info msg="CreateContainer within sandbox \"9c953771b7abdc52f299a1fd70d9ccc07acb37cbb9372d75398aa5b3e77b21c5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"854d93733d846e9a2683240a771f3d53a52210748dcb3244543f1425ae16fc84\"" Jan 13 20:43:39.325047 containerd[1551]: time="2025-01-13T20:43:39.324826302Z" level=info msg="StartContainer for \"854d93733d846e9a2683240a771f3d53a52210748dcb3244543f1425ae16fc84\"" Jan 13 20:43:39.371706 containerd[1551]: time="2025-01-13T20:43:39.370984320Z" level=info msg="StartContainer for \"ce3d97e3c3e0747de2288b515e6bf5b0764cbcedb5e1eff392af9b7b16e5da25\" returns successfully" Jan 13 20:43:39.375809 containerd[1551]: time="2025-01-13T20:43:39.375779863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-595db757fc-b5vf4,Uid:aac83cfd-4e11-46a1-b32b-b7b1ab43a7f8,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d\"" Jan 13 20:43:39.377786 containerd[1551]: time="2025-01-13T20:43:39.377759056Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-8prdm,Uid:3c210180-a974-4da0-9ca1-18a6f94f39f4,Namespace:calico-system,Attempt:7,} returns sandbox id \"4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32\"" Jan 13 20:43:39.401972 systemd[1]: Started cri-containerd-854d93733d846e9a2683240a771f3d53a52210748dcb3244543f1425ae16fc84.scope - libcontainer container 854d93733d846e9a2683240a771f3d53a52210748dcb3244543f1425ae16fc84. Jan 13 20:43:39.427697 containerd[1551]: time="2025-01-13T20:43:39.427663908Z" level=info msg="StartContainer for \"854d93733d846e9a2683240a771f3d53a52210748dcb3244543f1425ae16fc84\" returns successfully" Jan 13 20:43:39.689880 kernel: bpftool[5471]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 13 20:43:39.941571 systemd-networkd[1475]: vxlan.calico: Link UP Jan 13 20:43:39.941577 systemd-networkd[1475]: vxlan.calico: Gained carrier Jan 13 20:43:40.073372 systemd-networkd[1475]: calif9bdb80f7b4: Gained IPv6LL Jan 13 20:43:40.101402 kubelet[2877]: I0113 20:43:40.101168 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-f9n77" podStartSLOduration=28.101152843 podStartE2EDuration="28.101152843s" podCreationTimestamp="2025-01-13 20:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:43:40.099525673 +0000 UTC m=+43.586567615" watchObservedRunningTime="2025-01-13 20:43:40.101152843 +0000 UTC m=+43.588194773" Jan 13 20:43:40.135446 kubelet[2877]: I0113 20:43:40.135409 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-zww27" podStartSLOduration=28.135396329 podStartE2EDuration="28.135396329s" podCreationTimestamp="2025-01-13 20:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:43:40.121146154 +0000 UTC 
m=+43.608188095" watchObservedRunningTime="2025-01-13 20:43:40.135396329 +0000 UTC m=+43.622438265" Jan 13 20:43:40.328904 systemd-networkd[1475]: cali60ed9622122: Gained IPv6LL Jan 13 20:43:40.456883 systemd-networkd[1475]: calib8ed82489d8: Gained IPv6LL Jan 13 20:43:40.458307 systemd-networkd[1475]: cali0691d0ef5f7: Gained IPv6LL Jan 13 20:43:40.520916 systemd-networkd[1475]: cali231f6bf509d: Gained IPv6LL Jan 13 20:43:40.904845 systemd-networkd[1475]: cali6ab164a1921: Gained IPv6LL Jan 13 20:43:41.224957 systemd-networkd[1475]: vxlan.calico: Gained IPv6LL Jan 13 20:43:41.579521 containerd[1551]: time="2025-01-13T20:43:41.579318158Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:41.580168 containerd[1551]: time="2025-01-13T20:43:41.580069411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 13 20:43:41.581051 containerd[1551]: time="2025-01-13T20:43:41.580958559Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:41.583201 containerd[1551]: time="2025-01-13T20:43:41.582617345Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.389060798s" Jan 13 20:43:41.583201 containerd[1551]: time="2025-01-13T20:43:41.582638088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 20:43:41.583201 containerd[1551]: 
time="2025-01-13T20:43:41.582876181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:41.587317 containerd[1551]: time="2025-01-13T20:43:41.587104821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 13 20:43:41.587948 containerd[1551]: time="2025-01-13T20:43:41.587933413Z" level=info msg="CreateContainer within sandbox \"f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 20:43:41.597181 containerd[1551]: time="2025-01-13T20:43:41.597124555Z" level=info msg="CreateContainer within sandbox \"f1eed581f85ba004af628f159b9b226fda008480a3c26d4ad6cf023b4cb1af8b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1e5b5d0b08c1d1019c6b672f9c6de0b24d06c343a83b31edff8f10eed7315bdb\"" Jan 13 20:43:41.598715 containerd[1551]: time="2025-01-13T20:43:41.598009468Z" level=info msg="StartContainer for \"1e5b5d0b08c1d1019c6b672f9c6de0b24d06c343a83b31edff8f10eed7315bdb\"" Jan 13 20:43:41.621883 systemd[1]: Started cri-containerd-1e5b5d0b08c1d1019c6b672f9c6de0b24d06c343a83b31edff8f10eed7315bdb.scope - libcontainer container 1e5b5d0b08c1d1019c6b672f9c6de0b24d06c343a83b31edff8f10eed7315bdb. 
Jan 13 20:43:41.654643 containerd[1551]: time="2025-01-13T20:43:41.654613360Z" level=info msg="StartContainer for \"1e5b5d0b08c1d1019c6b672f9c6de0b24d06c343a83b31edff8f10eed7315bdb\" returns successfully" Jan 13 20:43:42.149965 kubelet[2877]: I0113 20:43:42.149920 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-595db757fc-h4ptk" podStartSLOduration=19.756167224 podStartE2EDuration="22.149907084s" podCreationTimestamp="2025-01-13 20:43:20 +0000 UTC" firstStartedPulling="2025-01-13 20:43:39.193216652 +0000 UTC m=+42.680258586" lastFinishedPulling="2025-01-13 20:43:41.586956514 +0000 UTC m=+45.073998446" observedRunningTime="2025-01-13 20:43:42.149200764 +0000 UTC m=+45.636242706" watchObservedRunningTime="2025-01-13 20:43:42.149907084 +0000 UTC m=+45.636949021" Jan 13 20:43:43.744949 containerd[1551]: time="2025-01-13T20:43:43.744906537Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:43.745423 containerd[1551]: time="2025-01-13T20:43:43.745346807Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 13 20:43:43.746001 containerd[1551]: time="2025-01-13T20:43:43.745689073Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:43.746825 containerd[1551]: time="2025-01-13T20:43:43.746789331Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:43.747226 containerd[1551]: time="2025-01-13T20:43:43.747177097Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id 
\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.160046231s" Jan 13 20:43:43.747226 containerd[1551]: time="2025-01-13T20:43:43.747195551Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 13 20:43:43.748055 containerd[1551]: time="2025-01-13T20:43:43.748044175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 13 20:43:43.767506 containerd[1551]: time="2025-01-13T20:43:43.766203220Z" level=info msg="CreateContainer within sandbox \"318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 13 20:43:43.773981 containerd[1551]: time="2025-01-13T20:43:43.773912604Z" level=info msg="CreateContainer within sandbox \"318d37fee9de15e9956487bbab04b9542b8474260da07c5bab52cbfb71895cba\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"dda849b6f7e3aa6ad1e8234c98621bcb7a3568d62911e4dd5c0fb3fffb297256\"" Jan 13 20:43:43.775505 containerd[1551]: time="2025-01-13T20:43:43.775115613Z" level=info msg="StartContainer for \"dda849b6f7e3aa6ad1e8234c98621bcb7a3568d62911e4dd5c0fb3fffb297256\"" Jan 13 20:43:43.801862 systemd[1]: Started cri-containerd-dda849b6f7e3aa6ad1e8234c98621bcb7a3568d62911e4dd5c0fb3fffb297256.scope - libcontainer container dda849b6f7e3aa6ad1e8234c98621bcb7a3568d62911e4dd5c0fb3fffb297256. 
Jan 13 20:43:43.831213 containerd[1551]: time="2025-01-13T20:43:43.831144164Z" level=info msg="StartContainer for \"dda849b6f7e3aa6ad1e8234c98621bcb7a3568d62911e4dd5c0fb3fffb297256\" returns successfully" Jan 13 20:43:44.378344 kubelet[2877]: I0113 20:43:44.378297 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-69bd7ff58c-rg9pf" podStartSLOduration=18.830019231 podStartE2EDuration="23.378102343s" podCreationTimestamp="2025-01-13 20:43:21 +0000 UTC" firstStartedPulling="2025-01-13 20:43:39.199615705 +0000 UTC m=+42.686657638" lastFinishedPulling="2025-01-13 20:43:43.747698817 +0000 UTC m=+47.234740750" observedRunningTime="2025-01-13 20:43:44.144401176 +0000 UTC m=+47.631443117" watchObservedRunningTime="2025-01-13 20:43:44.378102343 +0000 UTC m=+47.865144279" Jan 13 20:43:45.032836 containerd[1551]: time="2025-01-13T20:43:45.032810973Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:45.033724 containerd[1551]: time="2025-01-13T20:43:45.033703039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 13 20:43:45.034160 containerd[1551]: time="2025-01-13T20:43:45.034019228Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:45.035362 containerd[1551]: time="2025-01-13T20:43:45.035349837Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:45.035821 containerd[1551]: time="2025-01-13T20:43:45.035628786Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id 
\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.287516411s" Jan 13 20:43:45.036016 containerd[1551]: time="2025-01-13T20:43:45.036006445Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 13 20:43:45.036630 containerd[1551]: time="2025-01-13T20:43:45.036609010Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:43:45.039815 containerd[1551]: time="2025-01-13T20:43:45.039498790Z" level=info msg="CreateContainer within sandbox \"4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 13 20:43:45.052896 containerd[1551]: time="2025-01-13T20:43:45.052872379Z" level=info msg="CreateContainer within sandbox \"4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"b676063bca006963506e73de7945ab47aed587b949f76a610d778f0a640c0f1b\"" Jan 13 20:43:45.054585 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2047338205.mount: Deactivated successfully. Jan 13 20:43:45.055929 containerd[1551]: time="2025-01-13T20:43:45.055854675Z" level=info msg="StartContainer for \"b676063bca006963506e73de7945ab47aed587b949f76a610d778f0a640c0f1b\"" Jan 13 20:43:45.078848 systemd[1]: Started cri-containerd-b676063bca006963506e73de7945ab47aed587b949f76a610d778f0a640c0f1b.scope - libcontainer container b676063bca006963506e73de7945ab47aed587b949f76a610d778f0a640c0f1b. 
Jan 13 20:43:45.101830 containerd[1551]: time="2025-01-13T20:43:45.101793801Z" level=info msg="StartContainer for \"b676063bca006963506e73de7945ab47aed587b949f76a610d778f0a640c0f1b\" returns successfully" Jan 13 20:43:45.402154 containerd[1551]: time="2025-01-13T20:43:45.402116365Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:45.402809 containerd[1551]: time="2025-01-13T20:43:45.402562473Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 13 20:43:45.404132 containerd[1551]: time="2025-01-13T20:43:45.404058410Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 367.433267ms" Jan 13 20:43:45.404132 containerd[1551]: time="2025-01-13T20:43:45.404076205Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 20:43:45.404707 containerd[1551]: time="2025-01-13T20:43:45.404679765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 13 20:43:45.405945 containerd[1551]: time="2025-01-13T20:43:45.405818014Z" level=info msg="CreateContainer within sandbox \"1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 20:43:45.421821 containerd[1551]: time="2025-01-13T20:43:45.421799450Z" level=info msg="CreateContainer within sandbox \"1ef92784f523266b6a465edbbe5976eb414e66ae052a85850ff56290cc61a44d\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"42b431793ab8afa0c99f1a95e7fd236574a06636b254409d3bd328e0672139dd\"" Jan 13 20:43:45.422882 containerd[1551]: time="2025-01-13T20:43:45.422050169Z" level=info msg="StartContainer for \"42b431793ab8afa0c99f1a95e7fd236574a06636b254409d3bd328e0672139dd\"" Jan 13 20:43:45.439827 systemd[1]: Started cri-containerd-42b431793ab8afa0c99f1a95e7fd236574a06636b254409d3bd328e0672139dd.scope - libcontainer container 42b431793ab8afa0c99f1a95e7fd236574a06636b254409d3bd328e0672139dd. Jan 13 20:43:45.471159 containerd[1551]: time="2025-01-13T20:43:45.471114610Z" level=info msg="StartContainer for \"42b431793ab8afa0c99f1a95e7fd236574a06636b254409d3bd328e0672139dd\" returns successfully" Jan 13 20:43:47.220342 kubelet[2877]: I0113 20:43:47.219919 2877 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:43:47.435028 containerd[1551]: time="2025-01-13T20:43:47.434991008Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:47.438921 containerd[1551]: time="2025-01-13T20:43:47.438887650Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 13 20:43:47.439588 containerd[1551]: time="2025-01-13T20:43:47.439282370Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:47.440203 containerd[1551]: time="2025-01-13T20:43:47.440179335Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:43:47.440748 containerd[1551]: time="2025-01-13T20:43:47.440569712Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.035871938s" Jan 13 20:43:47.440748 containerd[1551]: time="2025-01-13T20:43:47.440587514Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 13 20:43:47.474879 containerd[1551]: time="2025-01-13T20:43:47.474807399Z" level=info msg="CreateContainer within sandbox \"4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 13 20:43:47.495922 containerd[1551]: time="2025-01-13T20:43:47.495843855Z" level=info msg="CreateContainer within sandbox \"4aa46a48923cd8ad0a8ef13fe347b50dbf1fb322b5e06b0697dfd89481122f32\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4b6d8ac67039837657497d1f240abc4f6ec44d60e70f13d7c953606d3f450694\"" Jan 13 20:43:47.499070 containerd[1551]: time="2025-01-13T20:43:47.496777131Z" level=info msg="StartContainer for \"4b6d8ac67039837657497d1f240abc4f6ec44d60e70f13d7c953606d3f450694\"" Jan 13 20:43:47.523819 systemd[1]: Started cri-containerd-4b6d8ac67039837657497d1f240abc4f6ec44d60e70f13d7c953606d3f450694.scope - libcontainer container 4b6d8ac67039837657497d1f240abc4f6ec44d60e70f13d7c953606d3f450694. 
Jan 13 20:43:47.541220 containerd[1551]: time="2025-01-13T20:43:47.541143710Z" level=info msg="StartContainer for \"4b6d8ac67039837657497d1f240abc4f6ec44d60e70f13d7c953606d3f450694\" returns successfully" Jan 13 20:43:47.818635 kubelet[2877]: I0113 20:43:47.818539 2877 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 13 20:43:47.821549 kubelet[2877]: I0113 20:43:47.821517 2877 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 13 20:43:48.159413 kubelet[2877]: I0113 20:43:48.159377 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-8prdm" podStartSLOduration=19.067056915 podStartE2EDuration="27.159365357s" podCreationTimestamp="2025-01-13 20:43:21 +0000 UTC" firstStartedPulling="2025-01-13 20:43:39.380915443 +0000 UTC m=+42.867957376" lastFinishedPulling="2025-01-13 20:43:47.473223886 +0000 UTC m=+50.960265818" observedRunningTime="2025-01-13 20:43:48.158707883 +0000 UTC m=+51.645749824" watchObservedRunningTime="2025-01-13 20:43:48.159365357 +0000 UTC m=+51.646407294" Jan 13 20:43:48.159542 kubelet[2877]: I0113 20:43:48.159486 2877 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-595db757fc-b5vf4" podStartSLOduration=22.135960064 podStartE2EDuration="28.15948197s" podCreationTimestamp="2025-01-13 20:43:20 +0000 UTC" firstStartedPulling="2025-01-13 20:43:39.380972346 +0000 UTC m=+42.868014279" lastFinishedPulling="2025-01-13 20:43:45.404494253 +0000 UTC m=+48.891536185" observedRunningTime="2025-01-13 20:43:46.232805872 +0000 UTC m=+49.719847805" watchObservedRunningTime="2025-01-13 20:43:48.15948197 +0000 UTC m=+51.646523906" Jan 13 20:43:56.596225 containerd[1551]: time="2025-01-13T20:43:56.596164818Z" level=info 
msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:56.597153 containerd[1551]: time="2025-01-13T20:43:56.596229912Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:56.597153 containerd[1551]: time="2025-01-13T20:43:56.596237929Z" level=info msg="StopPodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:56.598228 containerd[1551]: time="2025-01-13T20:43:56.598153958Z" level=info msg="RemovePodSandbox for \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:56.603780 containerd[1551]: time="2025-01-13T20:43:56.603756010Z" level=info msg="Forcibly stopping sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\"" Jan 13 20:43:56.615069 containerd[1551]: time="2025-01-13T20:43:56.603800224Z" level=info msg="TearDown network for sandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" successfully" Jan 13 20:43:56.619800 containerd[1551]: time="2025-01-13T20:43:56.619778937Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.624620 containerd[1551]: time="2025-01-13T20:43:56.624550179Z" level=info msg="RemovePodSandbox \"e568f5a5c6174346fae789ad3bbf48f6cd52805a8a702022e2da824dbb4e083b\" returns successfully" Jan 13 20:43:56.625291 containerd[1551]: time="2025-01-13T20:43:56.625245455Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:56.630258 containerd[1551]: time="2025-01-13T20:43:56.625316428Z" level=info msg="TearDown network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" successfully" Jan 13 20:43:56.630307 containerd[1551]: time="2025-01-13T20:43:56.630257717Z" level=info msg="StopPodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" returns successfully" Jan 13 20:43:56.631268 containerd[1551]: time="2025-01-13T20:43:56.630408339Z" level=info msg="RemovePodSandbox for \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:56.631268 containerd[1551]: time="2025-01-13T20:43:56.630420557Z" level=info msg="Forcibly stopping sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\"" Jan 13 20:43:56.631268 containerd[1551]: time="2025-01-13T20:43:56.630453051Z" level=info msg="TearDown network for sandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" successfully" Jan 13 20:43:56.632429 containerd[1551]: time="2025-01-13T20:43:56.632414015Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.632466 containerd[1551]: time="2025-01-13T20:43:56.632439326Z" level=info msg="RemovePodSandbox \"b6f4450959040d546067c0f1198147ad2d7abfc234b361d3996040dd5aa85d94\" returns successfully" Jan 13 20:43:56.632669 containerd[1551]: time="2025-01-13T20:43:56.632591323Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" Jan 13 20:43:56.632669 containerd[1551]: time="2025-01-13T20:43:56.632636581Z" level=info msg="TearDown network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" successfully" Jan 13 20:43:56.632669 containerd[1551]: time="2025-01-13T20:43:56.632642784Z" level=info msg="StopPodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" returns successfully" Jan 13 20:43:56.632960 containerd[1551]: time="2025-01-13T20:43:56.632860654Z" level=info msg="RemovePodSandbox for \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" Jan 13 20:43:56.632960 containerd[1551]: time="2025-01-13T20:43:56.632870339Z" level=info msg="Forcibly stopping sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\"" Jan 13 20:43:56.632960 containerd[1551]: time="2025-01-13T20:43:56.632898897Z" level=info msg="TearDown network for sandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" successfully" Jan 13 20:43:56.634239 containerd[1551]: time="2025-01-13T20:43:56.634182156Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.634239 containerd[1551]: time="2025-01-13T20:43:56.634204703Z" level=info msg="RemovePodSandbox \"8421ebf2bf148f8e5e85944f083bd2b0659c31618a34c960c653c1ce1a86a313\" returns successfully" Jan 13 20:43:56.634755 containerd[1551]: time="2025-01-13T20:43:56.634334451Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\"" Jan 13 20:43:56.634755 containerd[1551]: time="2025-01-13T20:43:56.634372461Z" level=info msg="TearDown network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" successfully" Jan 13 20:43:56.634755 containerd[1551]: time="2025-01-13T20:43:56.634378519Z" level=info msg="StopPodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" returns successfully" Jan 13 20:43:56.634755 containerd[1551]: time="2025-01-13T20:43:56.634497992Z" level=info msg="RemovePodSandbox for \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\"" Jan 13 20:43:56.634755 containerd[1551]: time="2025-01-13T20:43:56.634507342Z" level=info msg="Forcibly stopping sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\"" Jan 13 20:43:56.634755 containerd[1551]: time="2025-01-13T20:43:56.634564913Z" level=info msg="TearDown network for sandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" successfully" Jan 13 20:43:56.635809 containerd[1551]: time="2025-01-13T20:43:56.635793072Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.635836 containerd[1551]: time="2025-01-13T20:43:56.635820458Z" level=info msg="RemovePodSandbox \"ff7ee1e2eaec9b064a0e66312a6e62949b42fc095889253c2b1eeeb6c7cb2898\" returns successfully" Jan 13 20:43:56.635988 containerd[1551]: time="2025-01-13T20:43:56.635976445Z" level=info msg="StopPodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\"" Jan 13 20:43:56.636139 containerd[1551]: time="2025-01-13T20:43:56.636097461Z" level=info msg="TearDown network for sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" successfully" Jan 13 20:43:56.636139 containerd[1551]: time="2025-01-13T20:43:56.636109277Z" level=info msg="StopPodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" returns successfully" Jan 13 20:43:56.636275 containerd[1551]: time="2025-01-13T20:43:56.636262267Z" level=info msg="RemovePodSandbox for \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\"" Jan 13 20:43:56.636301 containerd[1551]: time="2025-01-13T20:43:56.636274528Z" level=info msg="Forcibly stopping sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\"" Jan 13 20:43:56.636392 containerd[1551]: time="2025-01-13T20:43:56.636369493Z" level=info msg="TearDown network for sandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" successfully" Jan 13 20:43:56.637570 containerd[1551]: time="2025-01-13T20:43:56.637551834Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.637607 containerd[1551]: time="2025-01-13T20:43:56.637581994Z" level=info msg="RemovePodSandbox \"c1cd60a3aea6a358cb8dedce60b453140f697fca51611103baa725f9056eb9e6\" returns successfully" Jan 13 20:43:56.637814 containerd[1551]: time="2025-01-13T20:43:56.637800129Z" level=info msg="StopPodSandbox for \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\"" Jan 13 20:43:56.637860 containerd[1551]: time="2025-01-13T20:43:56.637848111Z" level=info msg="TearDown network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\" successfully" Jan 13 20:43:56.637884 containerd[1551]: time="2025-01-13T20:43:56.637860408Z" level=info msg="StopPodSandbox for \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\" returns successfully" Jan 13 20:43:56.638036 containerd[1551]: time="2025-01-13T20:43:56.638002952Z" level=info msg="RemovePodSandbox for \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\"" Jan 13 20:43:56.638065 containerd[1551]: time="2025-01-13T20:43:56.638034837Z" level=info msg="Forcibly stopping sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\"" Jan 13 20:43:56.638097 containerd[1551]: time="2025-01-13T20:43:56.638072818Z" level=info msg="TearDown network for sandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\" successfully" Jan 13 20:43:56.640756 containerd[1551]: time="2025-01-13T20:43:56.640146040Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.640756 containerd[1551]: time="2025-01-13T20:43:56.640189200Z" level=info msg="RemovePodSandbox \"6fbb0a16520dbfeb4382bb36749ea7656a8e2ebd7e660e45609e213462f0d5af\" returns successfully" Jan 13 20:43:56.641849 containerd[1551]: time="2025-01-13T20:43:56.641837398Z" level=info msg="StopPodSandbox for \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\"" Jan 13 20:43:56.641936 containerd[1551]: time="2025-01-13T20:43:56.641927667Z" level=info msg="TearDown network for sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\" successfully" Jan 13 20:43:56.641980 containerd[1551]: time="2025-01-13T20:43:56.641970230Z" level=info msg="StopPodSandbox for \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\" returns successfully" Jan 13 20:43:56.642161 containerd[1551]: time="2025-01-13T20:43:56.642145533Z" level=info msg="RemovePodSandbox for \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\"" Jan 13 20:43:56.642273 containerd[1551]: time="2025-01-13T20:43:56.642240601Z" level=info msg="Forcibly stopping sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\"" Jan 13 20:43:56.642325 containerd[1551]: time="2025-01-13T20:43:56.642310286Z" level=info msg="TearDown network for sandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\" successfully" Jan 13 20:43:56.643715 containerd[1551]: time="2025-01-13T20:43:56.643699258Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.643773 containerd[1551]: time="2025-01-13T20:43:56.643754280Z" level=info msg="RemovePodSandbox \"2115202165da9b36cdbbb104181e8ede7d73014f0f6db7e43abc21c68259aa58\" returns successfully" Jan 13 20:43:56.643952 containerd[1551]: time="2025-01-13T20:43:56.643941461Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:56.644197 containerd[1551]: time="2025-01-13T20:43:56.644058669Z" level=info msg="TearDown network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" successfully" Jan 13 20:43:56.644197 containerd[1551]: time="2025-01-13T20:43:56.644122540Z" level=info msg="StopPodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" returns successfully" Jan 13 20:43:56.644805 containerd[1551]: time="2025-01-13T20:43:56.644282252Z" level=info msg="RemovePodSandbox for \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:56.644805 containerd[1551]: time="2025-01-13T20:43:56.644292248Z" level=info msg="Forcibly stopping sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\"" Jan 13 20:43:56.644805 containerd[1551]: time="2025-01-13T20:43:56.644338241Z" level=info msg="TearDown network for sandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" successfully" Jan 13 20:43:56.645997 containerd[1551]: time="2025-01-13T20:43:56.645982809Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.646154 containerd[1551]: time="2025-01-13T20:43:56.646008601Z" level=info msg="RemovePodSandbox \"e828fa8756958fdc7c128920b573772b751251be9a4f972e08de1e682f5455cd\" returns successfully" Jan 13 20:43:56.646357 containerd[1551]: time="2025-01-13T20:43:56.646225324Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" Jan 13 20:43:56.646357 containerd[1551]: time="2025-01-13T20:43:56.646269948Z" level=info msg="TearDown network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" successfully" Jan 13 20:43:56.646357 containerd[1551]: time="2025-01-13T20:43:56.646283606Z" level=info msg="StopPodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" returns successfully" Jan 13 20:43:56.646590 containerd[1551]: time="2025-01-13T20:43:56.646467119Z" level=info msg="RemovePodSandbox for \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" Jan 13 20:43:56.646590 containerd[1551]: time="2025-01-13T20:43:56.646477660Z" level=info msg="Forcibly stopping sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\"" Jan 13 20:43:56.646590 containerd[1551]: time="2025-01-13T20:43:56.646557306Z" level=info msg="TearDown network for sandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" successfully" Jan 13 20:43:56.647758 containerd[1551]: time="2025-01-13T20:43:56.647705166Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.647758 containerd[1551]: time="2025-01-13T20:43:56.647724874Z" level=info msg="RemovePodSandbox \"70b1199a3adab50b12e882727520c45c2740c4d362a9eb3c16ce6a0215d8f2ed\" returns successfully" Jan 13 20:43:56.647890 containerd[1551]: time="2025-01-13T20:43:56.647877009Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\"" Jan 13 20:43:56.647944 containerd[1551]: time="2025-01-13T20:43:56.647932945Z" level=info msg="TearDown network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" successfully" Jan 13 20:43:56.647944 containerd[1551]: time="2025-01-13T20:43:56.647941369Z" level=info msg="StopPodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" returns successfully" Jan 13 20:43:56.648103 containerd[1551]: time="2025-01-13T20:43:56.648090715Z" level=info msg="RemovePodSandbox for \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\"" Jan 13 20:43:56.648133 containerd[1551]: time="2025-01-13T20:43:56.648103588Z" level=info msg="Forcibly stopping sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\"" Jan 13 20:43:56.648952 containerd[1551]: time="2025-01-13T20:43:56.648131502Z" level=info msg="TearDown network for sandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" successfully" Jan 13 20:43:56.649306 containerd[1551]: time="2025-01-13T20:43:56.649291547Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.662376 containerd[1551]: time="2025-01-13T20:43:56.662359413Z" level=info msg="RemovePodSandbox \"8054a36668849df5b64349607cccd48858d154e8bf479bda31f016837d25abec\" returns successfully" Jan 13 20:43:56.662584 containerd[1551]: time="2025-01-13T20:43:56.662564086Z" level=info msg="StopPodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\"" Jan 13 20:43:56.662633 containerd[1551]: time="2025-01-13T20:43:56.662617252Z" level=info msg="TearDown network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" successfully" Jan 13 20:43:56.662633 containerd[1551]: time="2025-01-13T20:43:56.662625140Z" level=info msg="StopPodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" returns successfully" Jan 13 20:43:56.665691 containerd[1551]: time="2025-01-13T20:43:56.662824204Z" level=info msg="RemovePodSandbox for \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\"" Jan 13 20:43:56.665691 containerd[1551]: time="2025-01-13T20:43:56.662835760Z" level=info msg="Forcibly stopping sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\"" Jan 13 20:43:56.665691 containerd[1551]: time="2025-01-13T20:43:56.662867260Z" level=info msg="TearDown network for sandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" successfully" Jan 13 20:43:56.667918 containerd[1551]: time="2025-01-13T20:43:56.666097983Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.667918 containerd[1551]: time="2025-01-13T20:43:56.666115649Z" level=info msg="RemovePodSandbox \"0729e87aa890ed159f1966fdedd4fbe9f50f09c0bfa1f1cc0a298b036184185b\" returns successfully" Jan 13 20:43:56.667918 containerd[1551]: time="2025-01-13T20:43:56.666399614Z" level=info msg="StopPodSandbox for \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\"" Jan 13 20:43:56.667918 containerd[1551]: time="2025-01-13T20:43:56.666454975Z" level=info msg="TearDown network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\" successfully" Jan 13 20:43:56.667918 containerd[1551]: time="2025-01-13T20:43:56.666461259Z" level=info msg="StopPodSandbox for \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\" returns successfully" Jan 13 20:43:56.667918 containerd[1551]: time="2025-01-13T20:43:56.666597256Z" level=info msg="RemovePodSandbox for \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\"" Jan 13 20:43:56.667918 containerd[1551]: time="2025-01-13T20:43:56.666607264Z" level=info msg="Forcibly stopping sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\"" Jan 13 20:43:56.667918 containerd[1551]: time="2025-01-13T20:43:56.666763933Z" level=info msg="TearDown network for sandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\" successfully" Jan 13 20:43:56.668792 containerd[1551]: time="2025-01-13T20:43:56.668468829Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.668792 containerd[1551]: time="2025-01-13T20:43:56.668493178Z" level=info msg="RemovePodSandbox \"8a827f24da4557199a93f3fbbc1a5cf77beed61eb81a6a458835aad40ca94596\" returns successfully" Jan 13 20:43:56.668792 containerd[1551]: time="2025-01-13T20:43:56.668634562Z" level=info msg="StopPodSandbox for \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\"" Jan 13 20:43:56.668792 containerd[1551]: time="2025-01-13T20:43:56.668674025Z" level=info msg="TearDown network for sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\" successfully" Jan 13 20:43:56.668792 containerd[1551]: time="2025-01-13T20:43:56.668679676Z" level=info msg="StopPodSandbox for \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\" returns successfully" Jan 13 20:43:56.669802 containerd[1551]: time="2025-01-13T20:43:56.669036237Z" level=info msg="RemovePodSandbox for \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\"" Jan 13 20:43:56.669802 containerd[1551]: time="2025-01-13T20:43:56.669047845Z" level=info msg="Forcibly stopping sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\"" Jan 13 20:43:56.669802 containerd[1551]: time="2025-01-13T20:43:56.669096498Z" level=info msg="TearDown network for sandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\" successfully" Jan 13 20:43:56.671085 containerd[1551]: time="2025-01-13T20:43:56.671070904Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.671119 containerd[1551]: time="2025-01-13T20:43:56.671093364Z" level=info msg="RemovePodSandbox \"2a86f1bc738cf9e3fab17cac8af6c27130eb7c458b151173af5ad7e52730215f\" returns successfully" Jan 13 20:43:56.671304 containerd[1551]: time="2025-01-13T20:43:56.671287317Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:56.671333 containerd[1551]: time="2025-01-13T20:43:56.671325805Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:56.671333 containerd[1551]: time="2025-01-13T20:43:56.671331423Z" level=info msg="StopPodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns successfully" Jan 13 20:43:56.671477 containerd[1551]: time="2025-01-13T20:43:56.671465256Z" level=info msg="RemovePodSandbox for \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:56.671477 containerd[1551]: time="2025-01-13T20:43:56.671476012Z" level=info msg="Forcibly stopping sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\"" Jan 13 20:43:56.671523 containerd[1551]: time="2025-01-13T20:43:56.671504256Z" level=info msg="TearDown network for sandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" successfully" Jan 13 20:43:56.672723 containerd[1551]: time="2025-01-13T20:43:56.672707205Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.672766 containerd[1551]: time="2025-01-13T20:43:56.672728536Z" level=info msg="RemovePodSandbox \"5d3545dd8ece52ab37ba2d0c7d46dddb7a0eac9ece1ea191386c8a9ace00a4c5\" returns successfully" Jan 13 20:43:56.672937 containerd[1551]: time="2025-01-13T20:43:56.672892362Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:56.672965 containerd[1551]: time="2025-01-13T20:43:56.672941014Z" level=info msg="TearDown network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" successfully" Jan 13 20:43:56.672965 containerd[1551]: time="2025-01-13T20:43:56.672947896Z" level=info msg="StopPodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" returns successfully" Jan 13 20:43:56.673288 containerd[1551]: time="2025-01-13T20:43:56.673277045Z" level=info msg="RemovePodSandbox for \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:56.673316 containerd[1551]: time="2025-01-13T20:43:56.673287860Z" level=info msg="Forcibly stopping sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\"" Jan 13 20:43:56.673337 containerd[1551]: time="2025-01-13T20:43:56.673317538Z" level=info msg="TearDown network for sandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" successfully" Jan 13 20:43:56.674642 containerd[1551]: time="2025-01-13T20:43:56.674626927Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.674701 containerd[1551]: time="2025-01-13T20:43:56.674647608Z" level=info msg="RemovePodSandbox \"8b91310530916fe5538f11fda363e275abc9fc8daa17bb0c25fc765a094077ec\" returns successfully" Jan 13 20:43:56.674841 containerd[1551]: time="2025-01-13T20:43:56.674829066Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" Jan 13 20:43:56.674882 containerd[1551]: time="2025-01-13T20:43:56.674875034Z" level=info msg="TearDown network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" successfully" Jan 13 20:43:56.674904 containerd[1551]: time="2025-01-13T20:43:56.674882564Z" level=info msg="StopPodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" returns successfully" Jan 13 20:43:56.675017 containerd[1551]: time="2025-01-13T20:43:56.675004533Z" level=info msg="RemovePodSandbox for \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" Jan 13 20:43:56.675043 containerd[1551]: time="2025-01-13T20:43:56.675016542Z" level=info msg="Forcibly stopping sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\"" Jan 13 20:43:56.675102 containerd[1551]: time="2025-01-13T20:43:56.675056122Z" level=info msg="TearDown network for sandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" successfully" Jan 13 20:43:56.676381 containerd[1551]: time="2025-01-13T20:43:56.676366597Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.676413 containerd[1551]: time="2025-01-13T20:43:56.676390907Z" level=info msg="RemovePodSandbox \"c3344b44830d4cf88769ba736664bca0435090e7a1916e662f5d66e431f42c57\" returns successfully" Jan 13 20:43:56.676554 containerd[1551]: time="2025-01-13T20:43:56.676544072Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\"" Jan 13 20:43:56.676702 containerd[1551]: time="2025-01-13T20:43:56.676628568Z" level=info msg="TearDown network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" successfully" Jan 13 20:43:56.676702 containerd[1551]: time="2025-01-13T20:43:56.676636593Z" level=info msg="StopPodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" returns successfully" Jan 13 20:43:56.676864 containerd[1551]: time="2025-01-13T20:43:56.676850303Z" level=info msg="RemovePodSandbox for \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\"" Jan 13 20:43:56.676894 containerd[1551]: time="2025-01-13T20:43:56.676864763Z" level=info msg="Forcibly stopping sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\"" Jan 13 20:43:56.676965 containerd[1551]: time="2025-01-13T20:43:56.676943113Z" level=info msg="TearDown network for sandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" successfully" Jan 13 20:43:56.677996 containerd[1551]: time="2025-01-13T20:43:56.677982442Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.678020 containerd[1551]: time="2025-01-13T20:43:56.678007741Z" level=info msg="RemovePodSandbox \"6d33687bc04cb96f5fcf83bb3447a5ec733ec8ff80378573b285e47e72dee0a2\" returns successfully" Jan 13 20:43:56.678151 containerd[1551]: time="2025-01-13T20:43:56.678138038Z" level=info msg="StopPodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\"" Jan 13 20:43:56.678202 containerd[1551]: time="2025-01-13T20:43:56.678182014Z" level=info msg="TearDown network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" successfully" Jan 13 20:43:56.678202 containerd[1551]: time="2025-01-13T20:43:56.678193174Z" level=info msg="StopPodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" returns successfully" Jan 13 20:43:56.678315 containerd[1551]: time="2025-01-13T20:43:56.678304019Z" level=info msg="RemovePodSandbox for \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\"" Jan 13 20:43:56.678338 containerd[1551]: time="2025-01-13T20:43:56.678315344Z" level=info msg="Forcibly stopping sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\"" Jan 13 20:43:56.678356 containerd[1551]: time="2025-01-13T20:43:56.678344140Z" level=info msg="TearDown network for sandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" successfully" Jan 13 20:43:56.679370 containerd[1551]: time="2025-01-13T20:43:56.679356622Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.679400 containerd[1551]: time="2025-01-13T20:43:56.679377383Z" level=info msg="RemovePodSandbox \"e9e6d221b1d42cea7acd82fee1f081c02e81b7e3b282ddc3301d403abe6c9824\" returns successfully" Jan 13 20:43:56.679506 containerd[1551]: time="2025-01-13T20:43:56.679493994Z" level=info msg="StopPodSandbox for \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\"" Jan 13 20:43:56.679557 containerd[1551]: time="2025-01-13T20:43:56.679534547Z" level=info msg="TearDown network for sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\" successfully" Jan 13 20:43:56.679557 containerd[1551]: time="2025-01-13T20:43:56.679542112Z" level=info msg="StopPodSandbox for \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\" returns successfully" Jan 13 20:43:56.679746 containerd[1551]: time="2025-01-13T20:43:56.679662044Z" level=info msg="RemovePodSandbox for \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\"" Jan 13 20:43:56.679746 containerd[1551]: time="2025-01-13T20:43:56.679673900Z" level=info msg="Forcibly stopping sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\"" Jan 13 20:43:56.680136 containerd[1551]: time="2025-01-13T20:43:56.679812286Z" level=info msg="TearDown network for sandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\" successfully" Jan 13 20:43:56.680917 containerd[1551]: time="2025-01-13T20:43:56.680905731Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.680975 containerd[1551]: time="2025-01-13T20:43:56.680966602Z" level=info msg="RemovePodSandbox \"7dfc89856eccbf0224a27f08733c053339cca397a1b0a840763909c7b34b4d7e\" returns successfully" Jan 13 20:43:56.681136 containerd[1551]: time="2025-01-13T20:43:56.681126454Z" level=info msg="StopPodSandbox for \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\"" Jan 13 20:43:56.681250 containerd[1551]: time="2025-01-13T20:43:56.681241648Z" level=info msg="TearDown network for sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\" successfully" Jan 13 20:43:56.681297 containerd[1551]: time="2025-01-13T20:43:56.681291089Z" level=info msg="StopPodSandbox for \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\" returns successfully" Jan 13 20:43:56.681504 containerd[1551]: time="2025-01-13T20:43:56.681483078Z" level=info msg="RemovePodSandbox for \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\"" Jan 13 20:43:56.681504 containerd[1551]: time="2025-01-13T20:43:56.681497910Z" level=info msg="Forcibly stopping sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\"" Jan 13 20:43:56.681548 containerd[1551]: time="2025-01-13T20:43:56.681528517Z" level=info msg="TearDown network for sandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\" successfully" Jan 13 20:43:56.682594 containerd[1551]: time="2025-01-13T20:43:56.682580158Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.682631 containerd[1551]: time="2025-01-13T20:43:56.682599446Z" level=info msg="RemovePodSandbox \"76980d011e21f649a7ab01adbe652930c9430bd4eb887e8f27c67db1e8673fa3\" returns successfully" Jan 13 20:43:56.682868 containerd[1551]: time="2025-01-13T20:43:56.682806856Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:56.682979 containerd[1551]: time="2025-01-13T20:43:56.682943291Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" successfully" Jan 13 20:43:56.682979 containerd[1551]: time="2025-01-13T20:43:56.682950153Z" level=info msg="StopPodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:56.683188 containerd[1551]: time="2025-01-13T20:43:56.683161749Z" level=info msg="RemovePodSandbox for \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:56.683188 containerd[1551]: time="2025-01-13T20:43:56.683173733Z" level=info msg="Forcibly stopping sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\"" Jan 13 20:43:56.683243 containerd[1551]: time="2025-01-13T20:43:56.683205017Z" level=info msg="TearDown network for sandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" successfully" Jan 13 20:43:56.684304 containerd[1551]: time="2025-01-13T20:43:56.684288781Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.684330 containerd[1551]: time="2025-01-13T20:43:56.684309761Z" level=info msg="RemovePodSandbox \"7d4c8f79453c247ced1bccb840fa9122aae9efb8083ef86fbe42f9e797ab0ddc\" returns successfully" Jan 13 20:43:56.684517 containerd[1551]: time="2025-01-13T20:43:56.684452982Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:56.684517 containerd[1551]: time="2025-01-13T20:43:56.684490675Z" level=info msg="TearDown network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" successfully" Jan 13 20:43:56.684517 containerd[1551]: time="2025-01-13T20:43:56.684496343Z" level=info msg="StopPodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" returns successfully" Jan 13 20:43:56.684806 containerd[1551]: time="2025-01-13T20:43:56.684716919Z" level=info msg="RemovePodSandbox for \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:56.684806 containerd[1551]: time="2025-01-13T20:43:56.684737054Z" level=info msg="Forcibly stopping sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\"" Jan 13 20:43:56.684806 containerd[1551]: time="2025-01-13T20:43:56.684778080Z" level=info msg="TearDown network for sandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" successfully" Jan 13 20:43:56.686464 containerd[1551]: time="2025-01-13T20:43:56.686400876Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.686464 containerd[1551]: time="2025-01-13T20:43:56.686419263Z" level=info msg="RemovePodSandbox \"d9e9cd68e421664b85d103e985cd91007227ba642467e6f1a700bc6fd4883e3d\" returns successfully" Jan 13 20:43:56.686562 containerd[1551]: time="2025-01-13T20:43:56.686548163Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" Jan 13 20:43:56.686593 containerd[1551]: time="2025-01-13T20:43:56.686587876Z" level=info msg="TearDown network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" successfully" Jan 13 20:43:56.686615 containerd[1551]: time="2025-01-13T20:43:56.686593345Z" level=info msg="StopPodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" returns successfully" Jan 13 20:43:56.686788 containerd[1551]: time="2025-01-13T20:43:56.686775386Z" level=info msg="RemovePodSandbox for \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" Jan 13 20:43:56.686821 containerd[1551]: time="2025-01-13T20:43:56.686787595Z" level=info msg="Forcibly stopping sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\"" Jan 13 20:43:56.686840 containerd[1551]: time="2025-01-13T20:43:56.686815608Z" level=info msg="TearDown network for sandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" successfully" Jan 13 20:43:56.688344 containerd[1551]: time="2025-01-13T20:43:56.688329533Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.688371 containerd[1551]: time="2025-01-13T20:43:56.688349611Z" level=info msg="RemovePodSandbox \"adfa3decef023210aeb4bc97c5740fc045886b7ecbeb9af1131b64476d0857c0\" returns successfully" Jan 13 20:43:56.688479 containerd[1551]: time="2025-01-13T20:43:56.688467173Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\"" Jan 13 20:43:56.692999 containerd[1551]: time="2025-01-13T20:43:56.692985287Z" level=info msg="TearDown network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" successfully" Jan 13 20:43:56.692999 containerd[1551]: time="2025-01-13T20:43:56.692997388Z" level=info msg="StopPodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" returns successfully" Jan 13 20:43:56.693163 containerd[1551]: time="2025-01-13T20:43:56.693149258Z" level=info msg="RemovePodSandbox for \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\"" Jan 13 20:43:56.693163 containerd[1551]: time="2025-01-13T20:43:56.693162673Z" level=info msg="Forcibly stopping sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\"" Jan 13 20:43:56.693214 containerd[1551]: time="2025-01-13T20:43:56.693190463Z" level=info msg="TearDown network for sandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" successfully" Jan 13 20:43:56.694197 containerd[1551]: time="2025-01-13T20:43:56.694182553Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.694228 containerd[1551]: time="2025-01-13T20:43:56.694202573Z" level=info msg="RemovePodSandbox \"d53d51af6d4372eb34de9da29d4b00942eac7c2f786fcc4fedbad843c288dde4\" returns successfully" Jan 13 20:43:56.694435 containerd[1551]: time="2025-01-13T20:43:56.694381490Z" level=info msg="StopPodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\"" Jan 13 20:43:56.694534 containerd[1551]: time="2025-01-13T20:43:56.694493774Z" level=info msg="TearDown network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" successfully" Jan 13 20:43:56.694649 containerd[1551]: time="2025-01-13T20:43:56.694575939Z" level=info msg="StopPodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" returns successfully" Jan 13 20:43:56.695003 containerd[1551]: time="2025-01-13T20:43:56.694854710Z" level=info msg="RemovePodSandbox for \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\"" Jan 13 20:43:56.695003 containerd[1551]: time="2025-01-13T20:43:56.694866488Z" level=info msg="Forcibly stopping sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\"" Jan 13 20:43:56.695003 containerd[1551]: time="2025-01-13T20:43:56.694898219Z" level=info msg="TearDown network for sandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" successfully" Jan 13 20:43:56.696137 containerd[1551]: time="2025-01-13T20:43:56.696085912Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.696137 containerd[1551]: time="2025-01-13T20:43:56.696105998Z" level=info msg="RemovePodSandbox \"690cce4094269fce981073528e1a73a8960440765721a1d182660dd211a7ef30\" returns successfully" Jan 13 20:43:56.696387 containerd[1551]: time="2025-01-13T20:43:56.696292887Z" level=info msg="StopPodSandbox for \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\"" Jan 13 20:43:56.696387 containerd[1551]: time="2025-01-13T20:43:56.696336153Z" level=info msg="TearDown network for sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\" successfully" Jan 13 20:43:56.696387 containerd[1551]: time="2025-01-13T20:43:56.696341855Z" level=info msg="StopPodSandbox for \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\" returns successfully" Jan 13 20:43:56.696459 containerd[1551]: time="2025-01-13T20:43:56.696438745Z" level=info msg="RemovePodSandbox for \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\"" Jan 13 20:43:56.696459 containerd[1551]: time="2025-01-13T20:43:56.696451002Z" level=info msg="Forcibly stopping sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\"" Jan 13 20:43:56.696514 containerd[1551]: time="2025-01-13T20:43:56.696487275Z" level=info msg="TearDown network for sandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\" successfully" Jan 13 20:43:56.697775 containerd[1551]: time="2025-01-13T20:43:56.697760285Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.697815 containerd[1551]: time="2025-01-13T20:43:56.697789236Z" level=info msg="RemovePodSandbox \"3ca236bd34496861f64c867ada5ea8f6fa5c3ab7fb4f0729bb3934672e49f2ad\" returns successfully" Jan 13 20:43:56.698006 containerd[1551]: time="2025-01-13T20:43:56.697936153Z" level=info msg="StopPodSandbox for \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\"" Jan 13 20:43:56.698006 containerd[1551]: time="2025-01-13T20:43:56.697975058Z" level=info msg="TearDown network for sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\" successfully" Jan 13 20:43:56.698006 containerd[1551]: time="2025-01-13T20:43:56.697980735Z" level=info msg="StopPodSandbox for \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\" returns successfully" Jan 13 20:43:56.698202 containerd[1551]: time="2025-01-13T20:43:56.698142846Z" level=info msg="RemovePodSandbox for \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\"" Jan 13 20:43:56.698202 containerd[1551]: time="2025-01-13T20:43:56.698154267Z" level=info msg="Forcibly stopping sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\"" Jan 13 20:43:56.698567 containerd[1551]: time="2025-01-13T20:43:56.698289249Z" level=info msg="TearDown network for sandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\" successfully" Jan 13 20:43:56.699436 containerd[1551]: time="2025-01-13T20:43:56.699424406Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.699508 containerd[1551]: time="2025-01-13T20:43:56.699500568Z" level=info msg="RemovePodSandbox \"4e978feecc3160b675fce685a97b54f5c6bbd24705f96bfc2073924378626fc2\" returns successfully" Jan 13 20:43:56.699697 containerd[1551]: time="2025-01-13T20:43:56.699677451Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:56.699780 containerd[1551]: time="2025-01-13T20:43:56.699768388Z" level=info msg="TearDown network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" successfully" Jan 13 20:43:56.699816 containerd[1551]: time="2025-01-13T20:43:56.699810322Z" level=info msg="StopPodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" returns successfully" Jan 13 20:43:56.700004 containerd[1551]: time="2025-01-13T20:43:56.699995109Z" level=info msg="RemovePodSandbox for \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:56.700062 containerd[1551]: time="2025-01-13T20:43:56.700054607Z" level=info msg="Forcibly stopping sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\"" Jan 13 20:43:56.700130 containerd[1551]: time="2025-01-13T20:43:56.700113792Z" level=info msg="TearDown network for sandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" successfully" Jan 13 20:43:56.701213 containerd[1551]: time="2025-01-13T20:43:56.701159822Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.701213 containerd[1551]: time="2025-01-13T20:43:56.701179904Z" level=info msg="RemovePodSandbox \"003baee13bf92bbd9b900040e93d406bfeb7b7da1d6d1d0a586c32bef4552a4f\" returns successfully" Jan 13 20:43:56.701323 containerd[1551]: time="2025-01-13T20:43:56.701302381Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" Jan 13 20:43:56.701349 containerd[1551]: time="2025-01-13T20:43:56.701341248Z" level=info msg="TearDown network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" successfully" Jan 13 20:43:56.701349 containerd[1551]: time="2025-01-13T20:43:56.701346893Z" level=info msg="StopPodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" returns successfully" Jan 13 20:43:56.701490 containerd[1551]: time="2025-01-13T20:43:56.701462451Z" level=info msg="RemovePodSandbox for \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" Jan 13 20:43:56.701516 containerd[1551]: time="2025-01-13T20:43:56.701493057Z" level=info msg="Forcibly stopping sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\"" Jan 13 20:43:56.701554 containerd[1551]: time="2025-01-13T20:43:56.701523910Z" level=info msg="TearDown network for sandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" successfully" Jan 13 20:43:56.703099 containerd[1551]: time="2025-01-13T20:43:56.703085522Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.703125 containerd[1551]: time="2025-01-13T20:43:56.703106069Z" level=info msg="RemovePodSandbox \"dc462240f5852c6782f72fac374d8c3b3a7b810e40972eaf0b75a4e4233f7405\" returns successfully" Jan 13 20:43:56.703360 containerd[1551]: time="2025-01-13T20:43:56.703264394Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\"" Jan 13 20:43:56.703360 containerd[1551]: time="2025-01-13T20:43:56.703310009Z" level=info msg="TearDown network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" successfully" Jan 13 20:43:56.703360 containerd[1551]: time="2025-01-13T20:43:56.703317108Z" level=info msg="StopPodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" returns successfully" Jan 13 20:43:56.703477 containerd[1551]: time="2025-01-13T20:43:56.703462403Z" level=info msg="RemovePodSandbox for \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\"" Jan 13 20:43:56.703499 containerd[1551]: time="2025-01-13T20:43:56.703476955Z" level=info msg="Forcibly stopping sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\"" Jan 13 20:43:56.703536 containerd[1551]: time="2025-01-13T20:43:56.703523976Z" level=info msg="TearDown network for sandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" successfully" Jan 13 20:43:56.704556 containerd[1551]: time="2025-01-13T20:43:56.704541524Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.704615 containerd[1551]: time="2025-01-13T20:43:56.704561245Z" level=info msg="RemovePodSandbox \"3c4ba50b7f7d6fea039229631b07761850b8bed4902555ea1c67cc457b76b561\" returns successfully" Jan 13 20:43:56.704812 containerd[1551]: time="2025-01-13T20:43:56.704799130Z" level=info msg="StopPodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\"" Jan 13 20:43:56.704870 containerd[1551]: time="2025-01-13T20:43:56.704848104Z" level=info msg="TearDown network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" successfully" Jan 13 20:43:56.704870 containerd[1551]: time="2025-01-13T20:43:56.704856887Z" level=info msg="StopPodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" returns successfully" Jan 13 20:43:56.705012 containerd[1551]: time="2025-01-13T20:43:56.704999317Z" level=info msg="RemovePodSandbox for \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\"" Jan 13 20:43:56.705012 containerd[1551]: time="2025-01-13T20:43:56.705011216Z" level=info msg="Forcibly stopping sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\"" Jan 13 20:43:56.705100 containerd[1551]: time="2025-01-13T20:43:56.705062102Z" level=info msg="TearDown network for sandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" successfully" Jan 13 20:43:56.706302 containerd[1551]: time="2025-01-13T20:43:56.706288094Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.706327 containerd[1551]: time="2025-01-13T20:43:56.706307786Z" level=info msg="RemovePodSandbox \"668bc186dee4ad06a08c90dcca1898b598e42fab2552ce871053b0c99dc8c537\" returns successfully" Jan 13 20:43:56.706473 containerd[1551]: time="2025-01-13T20:43:56.706460234Z" level=info msg="StopPodSandbox for \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\"" Jan 13 20:43:56.706503 containerd[1551]: time="2025-01-13T20:43:56.706498041Z" level=info msg="TearDown network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\" successfully" Jan 13 20:43:56.706528 containerd[1551]: time="2025-01-13T20:43:56.706503363Z" level=info msg="StopPodSandbox for \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\" returns successfully" Jan 13 20:43:56.706749 containerd[1551]: time="2025-01-13T20:43:56.706707272Z" level=info msg="RemovePodSandbox for \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\"" Jan 13 20:43:56.707492 containerd[1551]: time="2025-01-13T20:43:56.706788465Z" level=info msg="Forcibly stopping sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\"" Jan 13 20:43:56.707492 containerd[1551]: time="2025-01-13T20:43:56.706825016Z" level=info msg="TearDown network for sandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\" successfully" Jan 13 20:43:56.707877 containerd[1551]: time="2025-01-13T20:43:56.707865047Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.707927 containerd[1551]: time="2025-01-13T20:43:56.707919020Z" level=info msg="RemovePodSandbox \"35e32b4bfa2ee0089144d8ec032c900dc142bb79f3ec6a8e8712cab3eee53943\" returns successfully" Jan 13 20:43:56.708092 containerd[1551]: time="2025-01-13T20:43:56.708077876Z" level=info msg="StopPodSandbox for \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\"" Jan 13 20:43:56.708133 containerd[1551]: time="2025-01-13T20:43:56.708122541Z" level=info msg="TearDown network for sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\" successfully" Jan 13 20:43:56.708133 containerd[1551]: time="2025-01-13T20:43:56.708129907Z" level=info msg="StopPodSandbox for \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\" returns successfully" Jan 13 20:43:56.708306 containerd[1551]: time="2025-01-13T20:43:56.708293210Z" level=info msg="RemovePodSandbox for \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\"" Jan 13 20:43:56.708336 containerd[1551]: time="2025-01-13T20:43:56.708306570Z" level=info msg="Forcibly stopping sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\"" Jan 13 20:43:56.708354 containerd[1551]: time="2025-01-13T20:43:56.708334499Z" level=info msg="TearDown network for sandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\" successfully" Jan 13 20:43:56.709394 containerd[1551]: time="2025-01-13T20:43:56.709379143Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.709427 containerd[1551]: time="2025-01-13T20:43:56.709400334Z" level=info msg="RemovePodSandbox \"de695ade648953c5ef4049641bcc5ac2fa3331d3b15ff956cf0d6b7b3849f906\" returns successfully" Jan 13 20:43:56.709629 containerd[1551]: time="2025-01-13T20:43:56.709556001Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:56.709629 containerd[1551]: time="2025-01-13T20:43:56.709595600Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:56.709629 containerd[1551]: time="2025-01-13T20:43:56.709601597Z" level=info msg="StopPodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:56.709765 containerd[1551]: time="2025-01-13T20:43:56.709720592Z" level=info msg="RemovePodSandbox for \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:56.709793 containerd[1551]: time="2025-01-13T20:43:56.709765632Z" level=info msg="Forcibly stopping sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\"" Jan 13 20:43:56.709814 containerd[1551]: time="2025-01-13T20:43:56.709799338Z" level=info msg="TearDown network for sandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" successfully" Jan 13 20:43:56.710892 containerd[1551]: time="2025-01-13T20:43:56.710875946Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.710919 containerd[1551]: time="2025-01-13T20:43:56.710897125Z" level=info msg="RemovePodSandbox \"3ba3cee0d5edda98084e1fba8918bc1bc17275c05b31ca98bfc5148ad79b6f25\" returns successfully" Jan 13 20:43:56.711143 containerd[1551]: time="2025-01-13T20:43:56.711065354Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:56.711143 containerd[1551]: time="2025-01-13T20:43:56.711104227Z" level=info msg="TearDown network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" successfully" Jan 13 20:43:56.711143 containerd[1551]: time="2025-01-13T20:43:56.711109954Z" level=info msg="StopPodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" returns successfully" Jan 13 20:43:56.711758 containerd[1551]: time="2025-01-13T20:43:56.711259477Z" level=info msg="RemovePodSandbox for \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:56.711758 containerd[1551]: time="2025-01-13T20:43:56.711275135Z" level=info msg="Forcibly stopping sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\"" Jan 13 20:43:56.711758 containerd[1551]: time="2025-01-13T20:43:56.711332708Z" level=info msg="TearDown network for sandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" successfully" Jan 13 20:43:56.712411 containerd[1551]: time="2025-01-13T20:43:56.712395691Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.712435 containerd[1551]: time="2025-01-13T20:43:56.712417115Z" level=info msg="RemovePodSandbox \"6bdfa034d6907859e534bef86d936d50af3d611573374975fde925229b4c44d4\" returns successfully" Jan 13 20:43:56.712774 containerd[1551]: time="2025-01-13T20:43:56.712595669Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" Jan 13 20:43:56.712774 containerd[1551]: time="2025-01-13T20:43:56.712633868Z" level=info msg="TearDown network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" successfully" Jan 13 20:43:56.712774 containerd[1551]: time="2025-01-13T20:43:56.712639794Z" level=info msg="StopPodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" returns successfully" Jan 13 20:43:56.712956 containerd[1551]: time="2025-01-13T20:43:56.712942508Z" level=info msg="RemovePodSandbox for \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" Jan 13 20:43:56.713561 containerd[1551]: time="2025-01-13T20:43:56.712956410Z" level=info msg="Forcibly stopping sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\"" Jan 13 20:43:56.713561 containerd[1551]: time="2025-01-13T20:43:56.713000459Z" level=info msg="TearDown network for sandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" successfully" Jan 13 20:43:56.714483 containerd[1551]: time="2025-01-13T20:43:56.714469031Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.715570 containerd[1551]: time="2025-01-13T20:43:56.714490703Z" level=info msg="RemovePodSandbox \"a5bf1a3edebe8504ce4fcd5833dc9c13a307bd42f098d6827ce302f4fae5a3ad\" returns successfully" Jan 13 20:43:56.715570 containerd[1551]: time="2025-01-13T20:43:56.714709400Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\"" Jan 13 20:43:56.715570 containerd[1551]: time="2025-01-13T20:43:56.714778776Z" level=info msg="TearDown network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" successfully" Jan 13 20:43:56.715570 containerd[1551]: time="2025-01-13T20:43:56.714785216Z" level=info msg="StopPodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" returns successfully" Jan 13 20:43:56.715570 containerd[1551]: time="2025-01-13T20:43:56.714906487Z" level=info msg="RemovePodSandbox for \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\"" Jan 13 20:43:56.715570 containerd[1551]: time="2025-01-13T20:43:56.714916867Z" level=info msg="Forcibly stopping sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\"" Jan 13 20:43:56.715570 containerd[1551]: time="2025-01-13T20:43:56.714960617Z" level=info msg="TearDown network for sandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" successfully" Jan 13 20:43:56.716011 containerd[1551]: time="2025-01-13T20:43:56.715996357Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.716454 containerd[1551]: time="2025-01-13T20:43:56.716016562Z" level=info msg="RemovePodSandbox \"552509e00dfa3b4bb578cb0af364a92f25bc9efa558207feca41d46dba394cb3\" returns successfully" Jan 13 20:43:56.716454 containerd[1551]: time="2025-01-13T20:43:56.716127675Z" level=info msg="StopPodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\"" Jan 13 20:43:56.716454 containerd[1551]: time="2025-01-13T20:43:56.716172356Z" level=info msg="TearDown network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" successfully" Jan 13 20:43:56.716454 containerd[1551]: time="2025-01-13T20:43:56.716180220Z" level=info msg="StopPodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" returns successfully" Jan 13 20:43:56.716454 containerd[1551]: time="2025-01-13T20:43:56.716343538Z" level=info msg="RemovePodSandbox for \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\"" Jan 13 20:43:56.716454 containerd[1551]: time="2025-01-13T20:43:56.716354110Z" level=info msg="Forcibly stopping sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\"" Jan 13 20:43:56.716454 containerd[1551]: time="2025-01-13T20:43:56.716383914Z" level=info msg="TearDown network for sandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" successfully" Jan 13 20:43:56.717454 containerd[1551]: time="2025-01-13T20:43:56.717441243Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.717484 containerd[1551]: time="2025-01-13T20:43:56.717461037Z" level=info msg="RemovePodSandbox \"02cb2eb008117051adfdfb5be256f5bf15ea73bae8c81e2db2dadaa4d8d1aa08\" returns successfully" Jan 13 20:43:56.717672 containerd[1551]: time="2025-01-13T20:43:56.717591770Z" level=info msg="StopPodSandbox for \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\"" Jan 13 20:43:56.717672 containerd[1551]: time="2025-01-13T20:43:56.717635538Z" level=info msg="TearDown network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\" successfully" Jan 13 20:43:56.717672 containerd[1551]: time="2025-01-13T20:43:56.717641169Z" level=info msg="StopPodSandbox for \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\" returns successfully" Jan 13 20:43:56.718605 containerd[1551]: time="2025-01-13T20:43:56.717898964Z" level=info msg="RemovePodSandbox for \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\"" Jan 13 20:43:56.718605 containerd[1551]: time="2025-01-13T20:43:56.717915965Z" level=info msg="Forcibly stopping sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\"" Jan 13 20:43:56.718605 containerd[1551]: time="2025-01-13T20:43:56.717991984Z" level=info msg="TearDown network for sandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\" successfully" Jan 13 20:43:56.719050 containerd[1551]: time="2025-01-13T20:43:56.719035497Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.719090 containerd[1551]: time="2025-01-13T20:43:56.719056061Z" level=info msg="RemovePodSandbox \"61fb2dc06bb3408eee66f4d017a16936100b6688391889f61c85d3489c76797b\" returns successfully" Jan 13 20:43:56.719241 containerd[1551]: time="2025-01-13T20:43:56.719227007Z" level=info msg="StopPodSandbox for \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\"" Jan 13 20:43:56.719280 containerd[1551]: time="2025-01-13T20:43:56.719267154Z" level=info msg="TearDown network for sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\" successfully" Jan 13 20:43:56.719280 containerd[1551]: time="2025-01-13T20:43:56.719273283Z" level=info msg="StopPodSandbox for \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\" returns successfully" Jan 13 20:43:56.719879 containerd[1551]: time="2025-01-13T20:43:56.719396214Z" level=info msg="RemovePodSandbox for \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\"" Jan 13 20:43:56.719879 containerd[1551]: time="2025-01-13T20:43:56.719407921Z" level=info msg="Forcibly stopping sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\"" Jan 13 20:43:56.719879 containerd[1551]: time="2025-01-13T20:43:56.719438221Z" level=info msg="TearDown network for sandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\" successfully" Jan 13 20:43:56.720582 containerd[1551]: time="2025-01-13T20:43:56.720570525Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:43:56.720637 containerd[1551]: time="2025-01-13T20:43:56.720628688Z" level=info msg="RemovePodSandbox \"69e3c37b3ff4d1439d3ca82324a9fd5fe23fee164afe27d1918181615d29fdd8\" returns successfully" Jan 13 20:43:57.900146 systemd[1]: run-containerd-runc-k8s.io-cc7f14a9972e674efbce89d3466cd282b10c6a017ce3de24f64051180dd02a56-runc.H7rtDo.mount: Deactivated successfully. Jan 13 20:44:05.656882 systemd[1]: Started sshd@8-139.178.70.106:22-147.75.109.163:33486.service - OpenSSH per-connection server daemon (147.75.109.163:33486). Jan 13 20:44:05.749244 sshd[5914]: Accepted publickey for core from 147.75.109.163 port 33486 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:05.751165 sshd-session[5914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:05.754091 systemd-logind[1532]: New session 10 of user core. Jan 13 20:44:05.760901 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 13 20:44:06.177999 sshd[5916]: Connection closed by 147.75.109.163 port 33486 Jan 13 20:44:06.178573 sshd-session[5914]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:06.180527 systemd[1]: sshd@8-139.178.70.106:22-147.75.109.163:33486.service: Deactivated successfully. Jan 13 20:44:06.181894 systemd[1]: session-10.scope: Deactivated successfully. Jan 13 20:44:06.182363 systemd-logind[1532]: Session 10 logged out. Waiting for processes to exit. Jan 13 20:44:06.183077 systemd-logind[1532]: Removed session 10. Jan 13 20:44:11.188897 systemd[1]: Started sshd@9-139.178.70.106:22-147.75.109.163:51286.service - OpenSSH per-connection server daemon (147.75.109.163:51286). 
Jan 13 20:44:11.235641 sshd[5932]: Accepted publickey for core from 147.75.109.163 port 51286 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:11.236644 sshd-session[5932]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:11.240539 systemd-logind[1532]: New session 11 of user core. Jan 13 20:44:11.243882 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 13 20:44:11.361826 sshd[5934]: Connection closed by 147.75.109.163 port 51286 Jan 13 20:44:11.362148 sshd-session[5932]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:11.364316 systemd[1]: sshd@9-139.178.70.106:22-147.75.109.163:51286.service: Deactivated successfully. Jan 13 20:44:11.365343 systemd[1]: session-11.scope: Deactivated successfully. Jan 13 20:44:11.365757 systemd-logind[1532]: Session 11 logged out. Waiting for processes to exit. Jan 13 20:44:11.366408 systemd-logind[1532]: Removed session 11. Jan 13 20:44:16.370657 systemd[1]: Started sshd@10-139.178.70.106:22-147.75.109.163:51290.service - OpenSSH per-connection server daemon (147.75.109.163:51290). Jan 13 20:44:16.419053 sshd[5948]: Accepted publickey for core from 147.75.109.163 port 51290 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:16.420117 sshd-session[5948]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:16.423279 systemd-logind[1532]: New session 12 of user core. Jan 13 20:44:16.431832 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 13 20:44:16.525726 sshd[5950]: Connection closed by 147.75.109.163 port 51290 Jan 13 20:44:16.526886 sshd-session[5948]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:16.532477 systemd[1]: sshd@10-139.178.70.106:22-147.75.109.163:51290.service: Deactivated successfully. Jan 13 20:44:16.533677 systemd[1]: session-12.scope: Deactivated successfully. 
Jan 13 20:44:16.534501 systemd-logind[1532]: Session 12 logged out. Waiting for processes to exit. Jan 13 20:44:16.536979 systemd[1]: Started sshd@11-139.178.70.106:22-147.75.109.163:51302.service - OpenSSH per-connection server daemon (147.75.109.163:51302). Jan 13 20:44:16.538061 systemd-logind[1532]: Removed session 12. Jan 13 20:44:16.585396 sshd[5961]: Accepted publickey for core from 147.75.109.163 port 51302 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:16.585608 sshd-session[5961]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:16.588481 systemd-logind[1532]: New session 13 of user core. Jan 13 20:44:16.594036 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 13 20:44:16.771083 sshd[5963]: Connection closed by 147.75.109.163 port 51302 Jan 13 20:44:16.772184 sshd-session[5961]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:16.777939 systemd[1]: sshd@11-139.178.70.106:22-147.75.109.163:51302.service: Deactivated successfully. Jan 13 20:44:16.779661 systemd[1]: session-13.scope: Deactivated successfully. Jan 13 20:44:16.781288 systemd-logind[1532]: Session 13 logged out. Waiting for processes to exit. Jan 13 20:44:16.789073 systemd[1]: Started sshd@12-139.178.70.106:22-147.75.109.163:51306.service - OpenSSH per-connection server daemon (147.75.109.163:51306). Jan 13 20:44:16.791479 systemd-logind[1532]: Removed session 13. Jan 13 20:44:16.840192 sshd[5972]: Accepted publickey for core from 147.75.109.163 port 51306 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:16.841531 sshd-session[5972]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:16.844350 systemd-logind[1532]: New session 14 of user core. Jan 13 20:44:16.849854 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 13 20:44:16.946761 sshd[5974]: Connection closed by 147.75.109.163 port 51306 Jan 13 20:44:16.947071 sshd-session[5972]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:16.949152 systemd-logind[1532]: Session 14 logged out. Waiting for processes to exit. Jan 13 20:44:16.949259 systemd[1]: sshd@12-139.178.70.106:22-147.75.109.163:51306.service: Deactivated successfully. Jan 13 20:44:16.950300 systemd[1]: session-14.scope: Deactivated successfully. Jan 13 20:44:16.950928 systemd-logind[1532]: Removed session 14. Jan 13 20:44:21.956741 systemd[1]: Started sshd@13-139.178.70.106:22-147.75.109.163:49736.service - OpenSSH per-connection server daemon (147.75.109.163:49736). Jan 13 20:44:22.051288 sshd[6001]: Accepted publickey for core from 147.75.109.163 port 49736 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:22.052492 sshd-session[6001]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:22.055319 systemd-logind[1532]: New session 15 of user core. Jan 13 20:44:22.060818 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 13 20:44:22.143959 sshd[6003]: Connection closed by 147.75.109.163 port 49736 Jan 13 20:44:22.144491 sshd-session[6001]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:22.146098 systemd[1]: sshd@13-139.178.70.106:22-147.75.109.163:49736.service: Deactivated successfully. Jan 13 20:44:22.147619 systemd[1]: session-15.scope: Deactivated successfully. Jan 13 20:44:22.149463 systemd-logind[1532]: Session 15 logged out. Waiting for processes to exit. Jan 13 20:44:22.150187 systemd-logind[1532]: Removed session 15. Jan 13 20:44:26.898697 kubelet[2877]: I0113 20:44:26.898361 2877 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:44:27.154616 systemd[1]: Started sshd@14-139.178.70.106:22-147.75.109.163:49752.service - OpenSSH per-connection server daemon (147.75.109.163:49752). 
Jan 13 20:44:27.185424 sshd[6016]: Accepted publickey for core from 147.75.109.163 port 49752 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:27.186158 sshd-session[6016]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:27.188974 systemd-logind[1532]: New session 16 of user core. Jan 13 20:44:27.191812 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 13 20:44:27.285630 sshd[6018]: Connection closed by 147.75.109.163 port 49752 Jan 13 20:44:27.285554 sshd-session[6016]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:27.291316 systemd[1]: sshd@14-139.178.70.106:22-147.75.109.163:49752.service: Deactivated successfully. Jan 13 20:44:27.292295 systemd[1]: session-16.scope: Deactivated successfully. Jan 13 20:44:27.293111 systemd-logind[1532]: Session 16 logged out. Waiting for processes to exit. Jan 13 20:44:27.298891 systemd[1]: Started sshd@15-139.178.70.106:22-147.75.109.163:37296.service - OpenSSH per-connection server daemon (147.75.109.163:37296). Jan 13 20:44:27.299725 systemd-logind[1532]: Removed session 16. Jan 13 20:44:27.334212 sshd[6029]: Accepted publickey for core from 147.75.109.163 port 37296 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:27.335791 sshd-session[6029]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:27.340134 systemd-logind[1532]: New session 17 of user core. Jan 13 20:44:27.343816 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 13 20:44:27.679637 sshd[6031]: Connection closed by 147.75.109.163 port 37296 Jan 13 20:44:27.682843 sshd-session[6029]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:27.686583 systemd[1]: Started sshd@16-139.178.70.106:22-147.75.109.163:37312.service - OpenSSH per-connection server daemon (147.75.109.163:37312). 
Jan 13 20:44:27.687875 systemd[1]: sshd@15-139.178.70.106:22-147.75.109.163:37296.service: Deactivated successfully. Jan 13 20:44:27.689342 systemd[1]: session-17.scope: Deactivated successfully. Jan 13 20:44:27.690209 systemd-logind[1532]: Session 17 logged out. Waiting for processes to exit. Jan 13 20:44:27.691437 systemd-logind[1532]: Removed session 17. Jan 13 20:44:27.742465 sshd[6038]: Accepted publickey for core from 147.75.109.163 port 37312 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:27.743450 sshd-session[6038]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:27.746963 systemd-logind[1532]: New session 18 of user core. Jan 13 20:44:27.756833 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 13 20:44:27.950170 systemd[1]: run-containerd-runc-k8s.io-cc7f14a9972e674efbce89d3466cd282b10c6a017ce3de24f64051180dd02a56-runc.f00zvr.mount: Deactivated successfully. Jan 13 20:44:29.090171 sshd[6042]: Connection closed by 147.75.109.163 port 37312 Jan 13 20:44:29.093704 sshd-session[6038]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:29.098274 systemd[1]: Started sshd@17-139.178.70.106:22-147.75.109.163:37320.service - OpenSSH per-connection server daemon (147.75.109.163:37320). Jan 13 20:44:29.114833 systemd[1]: sshd@16-139.178.70.106:22-147.75.109.163:37312.service: Deactivated successfully. Jan 13 20:44:29.115920 systemd[1]: session-18.scope: Deactivated successfully. Jan 13 20:44:29.119395 systemd-logind[1532]: Session 18 logged out. Waiting for processes to exit. Jan 13 20:44:29.120958 systemd-logind[1532]: Removed session 18. 
Jan 13 20:44:29.173851 sshd[6075]: Accepted publickey for core from 147.75.109.163 port 37320 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:29.174404 sshd-session[6075]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:29.177520 systemd-logind[1532]: New session 19 of user core. Jan 13 20:44:29.180814 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 13 20:44:29.478524 sshd[6081]: Connection closed by 147.75.109.163 port 37320 Jan 13 20:44:29.479512 sshd-session[6075]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:29.485654 systemd[1]: sshd@17-139.178.70.106:22-147.75.109.163:37320.service: Deactivated successfully. Jan 13 20:44:29.487523 systemd[1]: session-19.scope: Deactivated successfully. Jan 13 20:44:29.489193 systemd-logind[1532]: Session 19 logged out. Waiting for processes to exit. Jan 13 20:44:29.495078 systemd[1]: Started sshd@18-139.178.70.106:22-147.75.109.163:37328.service - OpenSSH per-connection server daemon (147.75.109.163:37328). Jan 13 20:44:29.499161 systemd-logind[1532]: Removed session 19. Jan 13 20:44:29.525573 sshd[6090]: Accepted publickey for core from 147.75.109.163 port 37328 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:29.526075 sshd-session[6090]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:29.528500 systemd-logind[1532]: New session 20 of user core. Jan 13 20:44:29.545000 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 13 20:44:29.634896 sshd[6092]: Connection closed by 147.75.109.163 port 37328 Jan 13 20:44:29.634583 sshd-session[6090]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:29.636141 systemd[1]: sshd@18-139.178.70.106:22-147.75.109.163:37328.service: Deactivated successfully. Jan 13 20:44:29.637282 systemd[1]: session-20.scope: Deactivated successfully. 
Jan 13 20:44:29.638257 systemd-logind[1532]: Session 20 logged out. Waiting for processes to exit. Jan 13 20:44:29.639135 systemd-logind[1532]: Removed session 20. Jan 13 20:44:34.643370 systemd[1]: Started sshd@19-139.178.70.106:22-147.75.109.163:37330.service - OpenSSH per-connection server daemon (147.75.109.163:37330). Jan 13 20:44:34.728704 sshd[6125]: Accepted publickey for core from 147.75.109.163 port 37330 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:34.729429 sshd-session[6125]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:34.733129 systemd-logind[1532]: New session 21 of user core. Jan 13 20:44:34.737818 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 13 20:44:34.914803 sshd[6127]: Connection closed by 147.75.109.163 port 37330 Jan 13 20:44:34.915497 sshd-session[6125]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:34.917123 systemd[1]: sshd@19-139.178.70.106:22-147.75.109.163:37330.service: Deactivated successfully. Jan 13 20:44:34.918983 systemd[1]: session-21.scope: Deactivated successfully. Jan 13 20:44:34.920134 systemd-logind[1532]: Session 21 logged out. Waiting for processes to exit. Jan 13 20:44:34.920974 systemd-logind[1532]: Removed session 21. Jan 13 20:44:39.924517 systemd[1]: Started sshd@20-139.178.70.106:22-147.75.109.163:38170.service - OpenSSH per-connection server daemon (147.75.109.163:38170). Jan 13 20:44:39.977718 sshd[6141]: Accepted publickey for core from 147.75.109.163 port 38170 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:39.978755 sshd-session[6141]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:39.982411 systemd-logind[1532]: New session 22 of user core. Jan 13 20:44:39.991858 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 13 20:44:40.106393 sshd[6143]: Connection closed by 147.75.109.163 port 38170 Jan 13 20:44:40.108926 systemd[1]: sshd@20-139.178.70.106:22-147.75.109.163:38170.service: Deactivated successfully. Jan 13 20:44:40.106868 sshd-session[6141]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:40.110057 systemd[1]: session-22.scope: Deactivated successfully. Jan 13 20:44:40.110524 systemd-logind[1532]: Session 22 logged out. Waiting for processes to exit. Jan 13 20:44:40.111106 systemd-logind[1532]: Removed session 22. Jan 13 20:44:45.115224 systemd[1]: Started sshd@21-139.178.70.106:22-147.75.109.163:38180.service - OpenSSH per-connection server daemon (147.75.109.163:38180). Jan 13 20:44:45.148183 sshd[6174]: Accepted publickey for core from 147.75.109.163 port 38180 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:44:45.148885 sshd-session[6174]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:44:45.151457 systemd-logind[1532]: New session 23 of user core. Jan 13 20:44:45.153889 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 13 20:44:45.248504 sshd[6176]: Connection closed by 147.75.109.163 port 38180 Jan 13 20:44:45.249082 sshd-session[6174]: pam_unix(sshd:session): session closed for user core Jan 13 20:44:45.251637 systemd[1]: sshd@21-139.178.70.106:22-147.75.109.163:38180.service: Deactivated successfully. Jan 13 20:44:45.252846 systemd[1]: session-23.scope: Deactivated successfully. Jan 13 20:44:45.253423 systemd-logind[1532]: Session 23 logged out. Waiting for processes to exit. Jan 13 20:44:45.254469 systemd-logind[1532]: Removed session 23.