Jan 13 20:43:34.739803 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 18:58:40 -00 2025
Jan 13 20:43:34.739821 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 20:43:34.739828 kernel: Disabled fast string operations
Jan 13 20:43:34.739832 kernel: BIOS-provided physical RAM map:
Jan 13 20:43:34.739836 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 13 20:43:34.739840 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 13 20:43:34.739846 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 13 20:43:34.739851 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 13 20:43:34.739855 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 13 20:43:34.739859 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 13 20:43:34.739864 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 13 20:43:34.739868 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 13 20:43:34.739872 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 13 20:43:34.739877 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 13 20:43:34.739883 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 13 20:43:34.739888 kernel: NX (Execute Disable) protection: active
Jan 13 20:43:34.739893 kernel: APIC: Static calls initialized
Jan 13 20:43:34.739898 kernel: SMBIOS 2.7 present.
Jan 13 20:43:34.739903 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 13 20:43:34.739908 kernel: vmware: hypercall mode: 0x00
Jan 13 20:43:34.739913 kernel: Hypervisor detected: VMware
Jan 13 20:43:34.739918 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 13 20:43:34.739924 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 13 20:43:34.739929 kernel: vmware: using clock offset of 2521513259 ns
Jan 13 20:43:34.739934 kernel: tsc: Detected 3408.000 MHz processor
Jan 13 20:43:34.739939 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 20:43:34.739944 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 20:43:34.739949 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 13 20:43:34.739954 kernel: total RAM covered: 3072M
Jan 13 20:43:34.739959 kernel: Found optimal setting for mtrr clean up
Jan 13 20:43:34.739967 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 13 20:43:34.739972 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 13 20:43:34.739978 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 20:43:34.739983 kernel: Using GB pages for direct mapping
Jan 13 20:43:34.739988 kernel: ACPI: Early table checksum verification disabled
Jan 13 20:43:34.739993 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 13 20:43:34.739998 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 13 20:43:34.740012 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 13 20:43:34.740018 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 13 20:43:34.740023 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:43:34.740031 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:43:34.740037 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 13 20:43:34.740042 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 13 20:43:34.740047 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 13 20:43:34.740053 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 13 20:43:34.740058 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 13 20:43:34.740064 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 13 20:43:34.740070 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 13 20:43:34.740075 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 13 20:43:34.740080 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:43:34.740085 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:43:34.740090 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 13 20:43:34.740096 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 13 20:43:34.740101 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 13 20:43:34.740106 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 13 20:43:34.740113 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 13 20:43:34.740118 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 13 20:43:34.740124 kernel: system APIC only can use physical flat
Jan 13 20:43:34.740129 kernel: APIC: Switched APIC routing to: physical flat
Jan 13 20:43:34.740134 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 20:43:34.740139 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 20:43:34.740145 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 20:43:34.740150 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 20:43:34.740155 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 20:43:34.740160 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 20:43:34.740167 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 20:43:34.740172 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 20:43:34.740177 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 13 20:43:34.740182 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 13 20:43:34.740187 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 13 20:43:34.740192 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 13 20:43:34.740197 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 13 20:43:34.740202 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 13 20:43:34.740207 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 13 20:43:34.740213 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 13 20:43:34.740219 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 13 20:43:34.740224 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 13 20:43:34.740229 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 13 20:43:34.740234 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 13 20:43:34.740239 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 13 20:43:34.740244 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 13 20:43:34.740250 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 13 20:43:34.740255 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 13 20:43:34.740260 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 13 20:43:34.740265 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 13 20:43:34.740271 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 13 20:43:34.740276 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 13 20:43:34.740281 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 13 20:43:34.740287 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 13 20:43:34.740292 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 13 20:43:34.740298 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 13 20:43:34.740303 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 13 20:43:34.740308 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 13 20:43:34.740313 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 13 20:43:34.740318 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 13 20:43:34.740324 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 13 20:43:34.740330 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 13 20:43:34.740335 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 13 20:43:34.740342 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 13 20:43:34.740350 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 13 20:43:34.740355 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 13 20:43:34.740360 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 13 20:43:34.740365 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 13 20:43:34.740370 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 13 20:43:34.740377 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 13 20:43:34.740387 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 13 20:43:34.740396 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 13 20:43:34.740402 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 13 20:43:34.740407 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 13 20:43:34.740412 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 13 20:43:34.740418 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 13 20:43:34.740423 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 13 20:43:34.740428 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 13 20:43:34.740433 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 13 20:43:34.740438 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 13 20:43:34.740443 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 13 20:43:34.740450 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 13 20:43:34.740455 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 13 20:43:34.740464 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 13 20:43:34.740470 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 13 20:43:34.740476 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 13 20:43:34.740484 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 13 20:43:34.740492 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 13 20:43:34.740500 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 13 20:43:34.740509 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 13 20:43:34.740518 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 13 20:43:34.740525 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 13 20:43:34.740533 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 13 20:43:34.740540 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 13 20:43:34.740548 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 13 20:43:34.740556 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 13 20:43:34.740563 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 13 20:43:34.740571 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 13 20:43:34.740579 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 13 20:43:34.740588 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 13 20:43:34.740596 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 13 20:43:34.740604 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 13 20:43:34.740612 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 13 20:43:34.740619 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 13 20:43:34.740625 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 13 20:43:34.740630 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 13 20:43:34.740635 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 13 20:43:34.740640 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 13 20:43:34.740646 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 13 20:43:34.740653 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 13 20:43:34.740658 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 13 20:43:34.740664 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 13 20:43:34.740669 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 13 20:43:34.740675 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 13 20:43:34.740680 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 13 20:43:34.740685 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 13 20:43:34.740691 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 13 20:43:34.740696 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 13 20:43:34.740702 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 13 20:43:34.740707 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 13 20:43:34.740714 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 13 20:43:34.740720 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 13 20:43:34.740725 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 13 20:43:34.740730 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 13 20:43:34.740736 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 13 20:43:34.740742 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 13 20:43:34.740747 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 13 20:43:34.740752 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 13 20:43:34.740758 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 13 20:43:34.740763 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 13 20:43:34.740835 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 13 20:43:34.740842 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 13 20:43:34.740848 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 13 20:43:34.740853 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 13 20:43:34.740859 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 13 20:43:34.740864 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 13 20:43:34.740869 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 13 20:43:34.740875 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 13 20:43:34.740880 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 13 20:43:34.740886 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 13 20:43:34.740893 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 13 20:43:34.740899 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 13 20:43:34.740904 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 13 20:43:34.740910 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 13 20:43:34.740915 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 13 20:43:34.740921 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 13 20:43:34.740926 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 13 20:43:34.740932 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 13 20:43:34.740937 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 13 20:43:34.740943 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 13 20:43:34.740949 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 13 20:43:34.740955 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 13 20:43:34.740960 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 13 20:43:34.740966 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 13 20:43:34.740972 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 13 20:43:34.740977 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 13 20:43:34.740983 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 13 20:43:34.740989 kernel: Zone ranges:
Jan 13 20:43:34.740994 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 13 20:43:34.741000 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 13 20:43:34.741007 kernel: Normal empty
Jan 13 20:43:34.741012 kernel: Movable zone start for each node
Jan 13 20:43:34.741018 kernel: Early memory node ranges
Jan 13 20:43:34.741023 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 13 20:43:34.741029 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 13 20:43:34.741034 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 13 20:43:34.741040 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 13 20:43:34.741046 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 13 20:43:34.741051 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 13 20:43:34.741058 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 13 20:43:34.741064 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 13 20:43:34.741070 kernel: system APIC only can use physical flat
Jan 13 20:43:34.741075 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 13 20:43:34.741081 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 13 20:43:34.741087 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jan 13 20:43:34.741092 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Jan 13 20:43:34.741098 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Jan 13 20:43:34.741103 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Jan 13 20:43:34.741109 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Jan 13 20:43:34.741116 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Jan 13 20:43:34.741121 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Jan 13 20:43:34.741127 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Jan 13 20:43:34.741132 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Jan 13 20:43:34.741137 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Jan 13 20:43:34.741143 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Jan 13 20:43:34.741148 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Jan 13 20:43:34.741154 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Jan 13 20:43:34.741159 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Jan 13 20:43:34.741166 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Jan 13 20:43:34.741171 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Jan 13 20:43:34.741177 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Jan 13 20:43:34.741182 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Jan 13 20:43:34.741188 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Jan 13 20:43:34.741193 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Jan 13 20:43:34.741199 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Jan 13 20:43:34.741204 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Jan 13 20:43:34.741210 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Jan 13 20:43:34.741215 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Jan 13 20:43:34.741222 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Jan 13 20:43:34.741228 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Jan 13 20:43:34.741233 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Jan 13 20:43:34.741239 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Jan 13 20:43:34.741244 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Jan 13 20:43:34.741250 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Jan 13 20:43:34.741255 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Jan 13 20:43:34.741261 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Jan 13 20:43:34.741266 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Jan 13 20:43:34.741272 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Jan 13 20:43:34.741278 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Jan 13 20:43:34.741284 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Jan 13 20:43:34.741290 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Jan 13 20:43:34.741295 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Jan 13 20:43:34.741301 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Jan 13 20:43:34.741306 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Jan 13 20:43:34.741312 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Jan 13 20:43:34.741317 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Jan 13 20:43:34.741323 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Jan 13 20:43:34.741330 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Jan 13 20:43:34.741336 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Jan 13 20:43:34.741341 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Jan 13 20:43:34.741347 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Jan 13 20:43:34.741352 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Jan 13 20:43:34.741358 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Jan 13 20:43:34.741363 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Jan 13 20:43:34.741369 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Jan 13 20:43:34.741374 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Jan 13 20:43:34.741379 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Jan 13 20:43:34.741386 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Jan 13 20:43:34.741392 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Jan 13 20:43:34.741397 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Jan 13 20:43:34.741403 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Jan 13 20:43:34.741408 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Jan 13 20:43:34.741414 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Jan 13 20:43:34.741419 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Jan 13 20:43:34.741425 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Jan 13 20:43:34.741430 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Jan 13 20:43:34.741435 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Jan 13 20:43:34.741442 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Jan 13 20:43:34.741448 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Jan 13 20:43:34.741453 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Jan 13 20:43:34.741459 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Jan 13 20:43:34.741464 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Jan 13 20:43:34.741470 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Jan 13 20:43:34.741475 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Jan 13 20:43:34.741481 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Jan 13 20:43:34.741487 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Jan 13 20:43:34.741493 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Jan 13 20:43:34.741499 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Jan 13 20:43:34.741504 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Jan 13 20:43:34.741509 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Jan 13 20:43:34.741515 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Jan 13 20:43:34.741520 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Jan 13 20:43:34.741526 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Jan 13 20:43:34.741531 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Jan 13 20:43:34.741537 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Jan 13 20:43:34.741542 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Jan 13 20:43:34.741549 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Jan 13 20:43:34.741554 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Jan 13 20:43:34.741560 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Jan 13 20:43:34.741565 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Jan 13 20:43:34.741571 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Jan 13 20:43:34.741577 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Jan 13 20:43:34.741582 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Jan 13 20:43:34.741588 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Jan 13 20:43:34.741593 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Jan 13 20:43:34.741599 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Jan 13 20:43:34.741605 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Jan 13 20:43:34.741610 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Jan 13 20:43:34.741616 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Jan 13 20:43:34.741621 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Jan 13 20:43:34.741627 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Jan 13 20:43:34.741633 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Jan 13 20:43:34.741638 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Jan 13 20:43:34.741644 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Jan 13 20:43:34.741650 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Jan 13 20:43:34.741655 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Jan 13 20:43:34.741662 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Jan 13 20:43:34.741668 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Jan 13 20:43:34.741673 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Jan 13 20:43:34.741683 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Jan 13 20:43:34.741689 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Jan 13 20:43:34.741694 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Jan 13 20:43:34.741700 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Jan 13 20:43:34.741705 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Jan 13 20:43:34.741711 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Jan 13 20:43:34.741718 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Jan 13 20:43:34.741723 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Jan 13 20:43:34.741729 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Jan 13 20:43:34.741734 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Jan 13 20:43:34.741740 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Jan 13 20:43:34.741745 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Jan 13 20:43:34.741750 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Jan 13 20:43:34.741756 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Jan 13 20:43:34.741761 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Jan 13 20:43:34.741767 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Jan 13 20:43:34.741783 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Jan 13 20:43:34.741789 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Jan 13 20:43:34.741805 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Jan 13 20:43:34.741811 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Jan 13 20:43:34.741816 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Jan 13 20:43:34.741822 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Jan 13 20:43:34.741827 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Jan 13 20:43:34.741833 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 13 20:43:34.741839 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Jan 13 20:43:34.741846 kernel: TSC deadline timer available
Jan 13 20:43:34.741852 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Jan 13 20:43:34.741857 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Jan 13 20:43:34.741863 kernel: Booting paravirtualized kernel on VMware hypervisor
Jan 13 20:43:34.741869 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 13 20:43:34.741875 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Jan 13 20:43:34.741880 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 13 20:43:34.741886 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 13 20:43:34.741892 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Jan 13 20:43:34.741899 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Jan 13 20:43:34.741904 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Jan 13 20:43:34.741910 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Jan 13 20:43:34.741915 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Jan 13 20:43:34.741928 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Jan 13 20:43:34.741935 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Jan 13 20:43:34.741943 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Jan 13 20:43:34.741948 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Jan 13 20:43:34.741954 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Jan 13 20:43:34.741961 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Jan 13 20:43:34.741967 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Jan 13 20:43:34.741973 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Jan 13 20:43:34.741979 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Jan 13 20:43:34.741984 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Jan 13 20:43:34.741990 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Jan 13 20:43:34.741997 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 20:43:34.742003 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 13 20:43:34.742010 kernel: random: crng init done
Jan 13 20:43:34.742016 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Jan 13 20:43:34.742022 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Jan 13 20:43:34.742028 kernel: printk: log_buf_len min size: 262144 bytes
Jan 13 20:43:34.742034 kernel: printk: log_buf_len: 1048576 bytes
Jan 13 20:43:34.742039 kernel: printk: early log buf free: 239648(91%)
Jan 13 20:43:34.742045 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 13 20:43:34.742051 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 13 20:43:34.742057 kernel: Fallback order for Node 0: 0
Jan 13 20:43:34.742064 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Jan 13 20:43:34.742070 kernel: Policy zone: DMA32
Jan 13 20:43:34.742076 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 13 20:43:34.742082 kernel: Memory: 1934348K/2096628K available (14336K kernel code, 2299K rwdata, 22800K rodata, 43320K init, 1756K bss, 162020K reserved, 0K cma-reserved)
Jan 13 20:43:34.742090 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Jan 13 20:43:34.742097 kernel: ftrace: allocating 37890 entries in 149 pages
Jan 13 20:43:34.742103 kernel: ftrace: allocated 149 pages with 4 groups
Jan 13 20:43:34.742109 kernel: Dynamic Preempt: voluntary
Jan 13 20:43:34.742115 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 13 20:43:34.742121 kernel: rcu: RCU event tracing is enabled.
Jan 13 20:43:34.742127 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Jan 13 20:43:34.742133 kernel: Trampoline variant of Tasks RCU enabled.
Jan 13 20:43:34.742139 kernel: Rude variant of Tasks RCU enabled.
Jan 13 20:43:34.742145 kernel: Tracing variant of Tasks RCU enabled.
Jan 13 20:43:34.742151 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 13 20:43:34.742158 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Jan 13 20:43:34.742164 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Jan 13 20:43:34.742170 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Jan 13 20:43:34.742176 kernel: Console: colour VGA+ 80x25
Jan 13 20:43:34.742182 kernel: printk: console [tty0] enabled
Jan 13 20:43:34.742188 kernel: printk: console [ttyS0] enabled
Jan 13 20:43:34.742194 kernel: ACPI: Core revision 20230628
Jan 13 20:43:34.742200 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Jan 13 20:43:34.742206 kernel: APIC: Switch to symmetric I/O mode setup
Jan 13 20:43:34.742213 kernel: x2apic enabled
Jan 13 20:43:34.742219 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 13 20:43:34.742225 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jan 13 20:43:34.742231 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Jan 13 20:43:34.742237 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Jan 13 20:43:34.742243 kernel: Disabled fast string operations
Jan 13 20:43:34.742248 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Jan 13 20:43:34.742254 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Jan 13 20:43:34.742261 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 13 20:43:34.742268 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Jan 13 20:43:34.742274 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Jan 13 20:43:34.742280 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 13 20:43:34.742286 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 13 20:43:34.742292 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 13 20:43:34.742298 kernel: RETBleed: Mitigation: Enhanced IBRS
Jan 13 20:43:34.742304 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 13 20:43:34.742310 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 13 20:43:34.742316 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Jan 13 20:43:34.742323 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 13 20:43:34.742329 kernel: GDS: Unknown: Dependent on hypervisor status
Jan 13 20:43:34.742335 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 13 20:43:34.742341 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 13 20:43:34.742347 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 13 20:43:34.742354 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 13 20:43:34.742360 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 13 20:43:34.742366 kernel: Freeing SMP alternatives memory: 32K
Jan 13 20:43:34.742372 kernel: pid_max: default: 131072 minimum: 1024
Jan 13 20:43:34.742379 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 13 20:43:34.742385 kernel: landlock: Up and running.
Jan 13 20:43:34.742390 kernel: SELinux: Initializing.
Jan 13 20:43:34.742396 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 20:43:34.742402 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 20:43:34.742408 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Jan 13 20:43:34.742415 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:43:34.742421 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:43:34.742427 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:43:34.742434 kernel: Performance Events: Skylake events, core PMU driver.
Jan 13 20:43:34.742440 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Jan 13 20:43:34.742447 kernel: core: CPUID marked event: 'instructions' unavailable
Jan 13 20:43:34.742452 kernel: core: CPUID marked event: 'bus cycles' unavailable
Jan 13 20:43:34.742458 kernel: core: CPUID marked event: 'cache references' unavailable
Jan 13 20:43:34.742464 kernel: core: CPUID marked event: 'cache misses' unavailable
Jan 13 20:43:34.742469 kernel: core: CPUID marked event: 'branch instructions' unavailable
Jan 13 20:43:34.742475 kernel: core: CPUID marked event: 'branch misses' unavailable
Jan 13 20:43:34.742482 kernel: ... version: 1
Jan 13 20:43:34.742488 kernel: ... bit width: 48
Jan 13 20:43:34.742494 kernel: ... generic registers: 4
Jan 13 20:43:34.742500 kernel: ... value mask: 0000ffffffffffff
Jan 13 20:43:34.742506 kernel: ...
max period: 000000007fffffff Jan 13 20:43:34.742512 kernel: ... fixed-purpose events: 0 Jan 13 20:43:34.742518 kernel: ... event mask: 000000000000000f Jan 13 20:43:34.742524 kernel: signal: max sigframe size: 1776 Jan 13 20:43:34.742529 kernel: rcu: Hierarchical SRCU implementation. Jan 13 20:43:34.742536 kernel: rcu: Max phase no-delay instances is 400. Jan 13 20:43:34.742543 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 20:43:34.742548 kernel: smp: Bringing up secondary CPUs ... Jan 13 20:43:34.742554 kernel: smpboot: x86: Booting SMP configuration: Jan 13 20:43:34.742560 kernel: .... node #0, CPUs: #1 Jan 13 20:43:34.742566 kernel: Disabled fast string operations Jan 13 20:43:34.742572 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 20:43:34.742578 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 20:43:34.742584 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 20:43:34.742590 kernel: smpboot: Max logical packages: 128 Jan 13 20:43:34.742597 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 20:43:34.742603 kernel: devtmpfs: initialized Jan 13 20:43:34.742609 kernel: x86/mm: Memory block size: 128MB Jan 13 20:43:34.742615 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 20:43:34.742621 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 20:43:34.742627 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 20:43:34.742633 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 20:43:34.742639 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 20:43:34.742645 kernel: audit: initializing netlink subsys (disabled) Jan 13 20:43:34.742652 kernel: audit: type=2000 audit(1736801013.066:1): state=initialized audit_enabled=0 res=1 Jan 13 20:43:34.742658 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 20:43:34.742664 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 20:43:34.742669 kernel: cpuidle: using governor menu Jan 13 20:43:34.742675 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 20:43:34.742681 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 20:43:34.742687 kernel: dca service started, version 1.12.1 Jan 13 20:43:34.742693 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 20:43:34.742699 kernel: PCI: Using configuration type 1 for base access Jan 13 20:43:34.742706 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 20:43:34.742714 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 20:43:34.742720 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 20:43:34.742726 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 20:43:34.742731 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 20:43:34.742737 kernel: ACPI: Added _OSI(Module Device) Jan 13 20:43:34.742743 kernel: ACPI: Added _OSI(Processor Device) Jan 13 20:43:34.742749 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 20:43:34.742755 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 20:43:34.742762 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 20:43:34.742912 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 20:43:34.742922 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 20:43:34.742929 kernel: ACPI: Interpreter enabled Jan 13 20:43:34.742934 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 20:43:34.742940 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 20:43:34.742946 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 20:43:34.742952 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 20:43:34.742958 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 20:43:34.742966 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 20:43:34.743048 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 20:43:34.743104 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 20:43:34.743154 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 20:43:34.743163 kernel: PCI host bridge to bus 0000:00 Jan 13 20:43:34.743213 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 20:43:34.743261 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 20:43:34.743306 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 20:43:34.743350 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 20:43:34.743623 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 20:43:34.743671 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 20:43:34.743732 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 20:43:34.743812 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 20:43:34.743874 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 20:43:34.743930 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 20:43:34.743981 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 20:43:34.744032 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 20:43:34.744082 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 20:43:34.744132 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 20:43:34.744183 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 20:43:34.744241 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 20:43:34.744291 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 20:43:34.744341 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 20:43:34.744396 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 20:43:34.744446 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 20:43:34.744496 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 20:43:34.744552 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 20:43:34.744602 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 20:43:34.744651 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 20:43:34.744715 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 20:43:34.744766 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 20:43:34.744826 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 20:43:34.744880 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 20:43:34.746824 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.746887 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.746948 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747001 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747055 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747107 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747164 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747214 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747269 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747319 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747373 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747424 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 13 20:43:34.747481 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747532 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747586 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747636 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747702 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747754 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747833 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747886 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747940 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747991 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748045 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748096 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748153 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748204 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748258 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748309 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748362 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748413 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748470 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748521 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748574 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748625 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748693 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748749 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748814 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748865 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748919 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748971 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749024 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749075 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749132 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749184 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749238 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749289 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749344 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749396 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749450 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749504 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749558 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749609 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749679 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749748 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749881 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749936 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749990 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.750040 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.750094 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 13 
20:43:34.750144 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.750198 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.750251 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.750304 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.750355 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.750464 kernel: pci_bus 0000:01: extended config space not accessible Jan 13 20:43:34.750522 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:43:34.750598 kernel: pci_bus 0000:02: extended config space not accessible Jan 13 20:43:34.752715 kernel: acpiphp: Slot [32] registered Jan 13 20:43:34.752723 kernel: acpiphp: Slot [33] registered Jan 13 20:43:34.752729 kernel: acpiphp: Slot [34] registered Jan 13 20:43:34.752735 kernel: acpiphp: Slot [35] registered Jan 13 20:43:34.752740 kernel: acpiphp: Slot [36] registered Jan 13 20:43:34.752746 kernel: acpiphp: Slot [37] registered Jan 13 20:43:34.752752 kernel: acpiphp: Slot [38] registered Jan 13 20:43:34.752758 kernel: acpiphp: Slot [39] registered Jan 13 20:43:34.752764 kernel: acpiphp: Slot [40] registered Jan 13 20:43:34.752840 kernel: acpiphp: Slot [41] registered Jan 13 20:43:34.752851 kernel: acpiphp: Slot [42] registered Jan 13 20:43:34.752857 kernel: acpiphp: Slot [43] registered Jan 13 20:43:34.752862 kernel: acpiphp: Slot [44] registered Jan 13 20:43:34.752868 kernel: acpiphp: Slot [45] registered Jan 13 20:43:34.752874 kernel: acpiphp: Slot [46] registered Jan 13 20:43:34.752880 kernel: acpiphp: Slot [47] registered Jan 13 20:43:34.752886 kernel: acpiphp: Slot [48] registered Jan 13 20:43:34.752892 kernel: acpiphp: Slot [49] registered Jan 13 20:43:34.752898 kernel: acpiphp: Slot [50] registered Jan 13 20:43:34.752905 kernel: acpiphp: Slot [51] registered Jan 13 20:43:34.752911 kernel: acpiphp: Slot [52] registered Jan 13 20:43:34.752917 kernel: acpiphp: Slot [53] registered 
Jan 13 20:43:34.752922 kernel: acpiphp: Slot [54] registered Jan 13 20:43:34.752928 kernel: acpiphp: Slot [55] registered Jan 13 20:43:34.752934 kernel: acpiphp: Slot [56] registered Jan 13 20:43:34.752940 kernel: acpiphp: Slot [57] registered Jan 13 20:43:34.752946 kernel: acpiphp: Slot [58] registered Jan 13 20:43:34.752952 kernel: acpiphp: Slot [59] registered Jan 13 20:43:34.752958 kernel: acpiphp: Slot [60] registered Jan 13 20:43:34.752965 kernel: acpiphp: Slot [61] registered Jan 13 20:43:34.752971 kernel: acpiphp: Slot [62] registered Jan 13 20:43:34.752977 kernel: acpiphp: Slot [63] registered Jan 13 20:43:34.753041 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 13 20:43:34.753096 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:43:34.753147 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:43:34.753198 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:43:34.753247 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 13 20:43:34.753301 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 13 20:43:34.753351 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 13 20:43:34.753400 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 13 20:43:34.753450 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 13 20:43:34.753532 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 13 20:43:34.753591 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 13 20:43:34.753643 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 13 20:43:34.753698 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:43:34.753750 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 13 
20:43:34.754844 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 13 20:43:34.754900 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:43:34.754952 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:43:34.755002 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:43:34.755055 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:43:34.755105 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:43:34.755158 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:43:34.755209 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:43:34.755260 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:43:34.755312 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:43:34.755362 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:43:34.755414 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:43:34.755465 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:43:34.755518 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:43:34.755569 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:43:34.755621 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:43:34.755671 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:43:34.755721 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:43:34.756816 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:43:34.756905 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:43:34.756957 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:43:34.757009 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:43:34.757059 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 13 20:43:34.757110 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:43:34.757161 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:43:34.757214 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:43:34.757264 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:43:34.757320 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 13 20:43:34.757373 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 13 20:43:34.757424 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 13 20:43:34.757475 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 13 20:43:34.757526 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 13 20:43:34.757577 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:43:34.757631 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 13 20:43:34.757687 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 13 20:43:34.757739 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 13 20:43:34.759809 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:43:34.759862 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:43:34.759913 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 13 20:43:34.759964 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:43:34.760015 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:43:34.760068 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:43:34.760119 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:43:34.760171 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:43:34.760221 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:43:34.760271 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:43:34.760321 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:43:34.760373 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:43:34.760425 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:43:34.760476 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:43:34.760528 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:43:34.760578 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:43:34.760629 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:43:34.760683 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:43:34.760734 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:43:34.760797 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:43:34.760853 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:43:34.760904 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:43:34.760955 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:43:34.761006 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:43:34.761056 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:43:34.761107 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:43:34.761158 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:43:34.761208 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:43:34.761261 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:43:34.761312 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:43:34.761364 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:43:34.761415 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:43:34.761465 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:43:34.761515 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:43:34.761567 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:43:34.761617 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:43:34.761670 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:43:34.761724 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:43:34.762796 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:43:34.762852 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:43:34.762902 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:43:34.762952 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:43:34.763002 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:43:34.763055 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:43:34.763105 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:43:34.763155 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:43:34.763205 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:43:34.763256 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:43:34.763305 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:43:34.763354 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:43:34.763405 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:43:34.763457 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:43:34.763506 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:43:34.763557 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:43:34.763607 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:43:34.763657 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:43:34.763706 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:43:34.763757 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:43:34.765572 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:43:34.765629 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:43:34.765680 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:43:34.765732 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:43:34.765808 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:43:34.765860 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:43:34.765910 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:43:34.765960 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:43:34.766010 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:43:34.766065 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 
20:43:34.766114 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:43:34.766164 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:43:34.766216 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:43:34.766265 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:43:34.766315 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:43:34.766365 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:43:34.766416 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:43:34.766468 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:43:34.766519 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:43:34.766569 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:43:34.766618 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:43:34.766627 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jan 13 20:43:34.766633 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jan 13 20:43:34.766639 kernel: ACPI: PCI: Interrupt link LNKB disabled Jan 13 20:43:34.766646 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 13 20:43:34.766653 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jan 13 20:43:34.766659 kernel: iommu: Default domain type: Translated Jan 13 20:43:34.766665 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 13 20:43:34.766672 kernel: PCI: Using ACPI for IRQ routing Jan 13 20:43:34.766678 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 13 20:43:34.766684 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jan 13 20:43:34.766690 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jan 13 20:43:34.766740 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jan 13 20:43:34.766804 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Jan 13 20:43:34.766859 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 13 20:43:34.766868 kernel: vgaarb: loaded Jan 13 20:43:34.766874 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jan 13 20:43:34.766880 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jan 13 20:43:34.766886 kernel: clocksource: Switched to clocksource tsc-early Jan 13 20:43:34.766892 kernel: VFS: Disk quotas dquot_6.6.0 Jan 13 20:43:34.766898 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 13 20:43:34.766904 kernel: pnp: PnP ACPI init Jan 13 20:43:34.766956 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jan 13 20:43:34.767006 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jan 13 20:43:34.767052 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jan 13 20:43:34.767102 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jan 13 20:43:34.767152 kernel: pnp 00:06: [dma 2] Jan 13 20:43:34.767202 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jan 13 20:43:34.767248 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jan 13 20:43:34.767296 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jan 13 20:43:34.767305 kernel: pnp: PnP ACPI: found 8 devices Jan 13 20:43:34.767311 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 13 20:43:34.767317 kernel: NET: Registered PF_INET protocol family Jan 13 20:43:34.767323 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 13 20:43:34.767329 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jan 13 20:43:34.767335 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 13 20:43:34.767342 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 13 20:43:34.767347 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jan 13 20:43:34.767355 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jan 13 20:43:34.767361 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:43:34.767367 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:43:34.767373 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 13 20:43:34.767379 kernel: NET: Registered PF_XDP protocol family Jan 13 20:43:34.767430 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jan 13 20:43:34.767482 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 13 20:43:34.767532 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 13 20:43:34.767586 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 13 20:43:34.767637 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 13 20:43:34.767698 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 13 20:43:34.767750 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 13 20:43:34.767831 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 13 20:43:34.767888 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 13 20:43:34.767939 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 13 20:43:34.767990 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 13 20:43:34.768041 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 13 20:43:34.768374 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 13 
20:43:34.768502 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 13 20:43:34.768787 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 13 20:43:34.768848 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 13 20:43:34.768923 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 13 20:43:34.768976 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 13 20:43:34.769027 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 13 20:43:34.769078 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 13 20:43:34.769132 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 13 20:43:34.769183 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 13 20:43:34.769234 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jan 13 20:43:34.769284 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:43:34.769334 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:43:34.769384 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769437 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.769489 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769539 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.769589 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769638 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.769688 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769737 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 
13 20:43:34.769818 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769870 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.769923 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769972 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770022 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770071 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770120 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770169 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770219 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770268 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770320 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770370 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770419 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770468 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770517 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770567 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770617 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770666 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770722 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770779 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770830 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770880 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770930 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770979 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771029 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771079 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771132 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771182 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771231 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771281 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771330 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771380 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771430 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771480 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771533 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771582 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771643 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771708 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771761 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771867 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771917 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771979 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772034 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772087 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772137 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Jan 13 20:43:34.772186 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772236 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772286 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772336 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772384 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772434 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772484 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772533 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772586 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772636 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772685 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772735 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772804 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772856 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772906 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772956 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773005 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773058 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773132 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773203 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773257 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773307 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773356 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773406 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773455 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773505 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773556 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773610 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773661 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773719 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773826 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773885 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:43:34.773936 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jan 13 20:43:34.773986 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:43:34.774050 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:43:34.774098 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:43:34.774153 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Jan 13 20:43:34.774202 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:43:34.774250 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:43:34.774298 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:43:34.774346 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:43:34.774395 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:43:34.774443 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:43:34.774491 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:43:34.774542 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 
20:43:34.774591 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:43:34.774640 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:43:34.774695 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:43:34.774744 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:43:34.774857 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:43:34.774910 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:43:34.774960 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:43:34.775010 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:43:34.775063 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:43:34.775112 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:43:34.775164 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:43:34.775213 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:43:34.775263 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:43:34.775312 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:43:34.775363 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 13 20:43:34.775412 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:43:34.775462 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:43:34.775511 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:43:34.775561 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:43:34.775613 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 13 20:43:34.775664 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:43:34.775714 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:43:34.775765 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 13 20:43:34.777841 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:43:34.777902 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:43:34.777955 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:43:34.778006 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:43:34.778057 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:43:34.778110 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:43:34.778161 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:43:34.778211 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:43:34.778262 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:43:34.778312 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:43:34.778364 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:43:34.778414 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:43:34.778464 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:43:34.778514 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:43:34.778564 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:43:34.778615 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:43:34.778665 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:43:34.778721 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:43:34.778784 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:43:34.778841 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:43:34.778891 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:43:34.778941 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:43:34.778991 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:43:34.779041 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:43:34.779092 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:43:34.779143 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:43:34.779192 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:43:34.779243 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:43:34.779295 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:43:34.779348 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:43:34.779397 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:43:34.779447 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:43:34.779498 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:43:34.779548 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:43:34.779598 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:43:34.779648 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:43:34.779698 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:43:34.779749 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:43:34.779829 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:43:34.779880 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:43:34.779931 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:43:34.779980 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:43:34.780030 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:43:34.780080 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:43:34.780129 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
20:43:34.780179 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:43:34.780228 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:43:34.780278 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:43:34.780331 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:43:34.780382 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:43:34.780431 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:43:34.780481 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:43:34.780531 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:43:34.780581 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:43:34.780630 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:43:34.780683 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:43:34.780734 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:43:34.780856 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:43:34.780910 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:43:34.780960 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:43:34.781011 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:43:34.781061 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:43:34.781111 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:43:34.781161 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:43:34.781211 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:43:34.781263 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 20:43:34.781642 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:43:34.781704 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 20:43:34.781757 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:43:34.782002 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:43:34.782056 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:43:34.782108 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:43:34.782158 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:43:34.782207 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:43:34.782257 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:43:34.782306 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:43:34.782358 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:43:34.782406 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:43:34.782451 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:43:34.782494 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:43:34.782537 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:43:34.782580 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:43:34.782627 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 20:43:34.782673 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 20:43:34.782738 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:43:34.782858 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:43:34.782906 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:43:34.782951 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:43:34.782997 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:43:34.783042 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:43:34.783094 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 20:43:34.783143 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 20:43:34.783208 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:43:34.783808 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 20:43:34.783864 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 20:43:34.783913 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:43:34.783965 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 20:43:34.784013 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 20:43:34.784062 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:43:34.784112 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 20:43:34.784159 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:43:34.784208 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 20:43:34.784255 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:43:34.784305 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 20:43:34.784353 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:43:34.784403 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 20:43:34.784449 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:43:34.784502 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 20:43:34.784556 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:43:34.784611 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 20:43:34.784661 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 20:43:34.784711 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:43:34.784762 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 20:43:34.784844 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 20:43:34.784891 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:43:34.784941 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 20:43:34.784989 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 20:43:34.785041 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:43:34.785091 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 20:43:34.785137 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:43:34.785187 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 20:43:34.785234 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:43:34.785283 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 20:43:34.785332 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:43:34.785382 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 20:43:34.785428 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:43:34.785477 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 20:43:34.785523 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:43:34.785574 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 20:43:34.785622 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 20:43:34.785668 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:43:34.785722 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 20:43:34.786805 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 20:43:34.786875 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:43:34.786931 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 20:43:34.786979 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 20:43:34.787028 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:43:34.787080 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 20:43:34.787128 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:43:34.787177 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 20:43:34.787224 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:43:34.787274 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 20:43:34.787323 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:43:34.787374 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 20:43:34.787421 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:43:34.787471 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 20:43:34.787518 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:43:34.787570 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 20:43:34.787620 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 20:43:34.787667 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:43:34.787729 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 20:43:34.787970 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 20:43:34.788023 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:43:34.788074 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 20:43:34.788122 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:43:34.788176 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 20:43:34.788223 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:43:34.788275 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 20:43:34.788323 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:43:34.788373 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 20:43:34.788420 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:43:34.788472 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 20:43:34.788519 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:43:34.788569 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 20:43:34.788616 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:43:34.788671 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 20:43:34.788686 kernel: PCI: CLS 32 bytes, default 64 Jan 13 20:43:34.788695 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 20:43:34.788702 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:43:34.788709 kernel: clocksource: Switched to clocksource tsc Jan 13 20:43:34.788715 kernel: Initialise system trusted keyrings Jan 13 20:43:34.788722 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 20:43:34.788728 kernel: Key type asymmetric registered Jan 13 20:43:34.788735 kernel: Asymmetric key parser 'x509' registered Jan 13 20:43:34.788741 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 20:43:34.788747 kernel: io scheduler mq-deadline registered Jan 13 20:43:34.788755 kernel: io scheduler kyber registered Jan 13 20:43:34.788762 kernel: io scheduler bfq registered Jan 13 20:43:34.790456 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 20:43:34.790519 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.790576 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 20:43:34.790630 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.790687 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 20:43:34.790741 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.790835 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 20:43:34.790888 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.790941 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 20:43:34.790992 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791044 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 20:43:34.791095 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791151 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 20:43:34.791202 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791255 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 20:43:34.791305 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791358 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 20:43:34.791412 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791464 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 20:43:34.791515 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791567 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 20:43:34.791618 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791671 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 20:43:34.791729 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791806 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 20:43:34.791862 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791915 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 20:43:34.791967 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.792371 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 20:43:34.792434 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.792489 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 20:43:34.792543 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.792597 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 20:43:34.792650 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.792703 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 20:43:34.792758 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.793869 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 20:43:34.793929 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.793985 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 20:43:34.794039 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794096 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 20:43:34.794150 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794203 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 20:43:34.794254 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794306 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 20:43:34.794358 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794410 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 20:43:34.794464 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794517 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 20:43:34.794569 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794623 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 20:43:34.794678 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794732 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 20:43:34.795806 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.795872 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 20:43:34.795929 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.795984 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 20:43:34.796038 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.796096 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 20:43:34.796148 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.796201 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 20:43:34.796253 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.796305 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 20:43:34.796356 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.796368 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 20:43:34.796374 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 20:43:34.796381 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 20:43:34.796388 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 20:43:34.796394 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 20:43:34.796401 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 20:43:34.796452 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 20:43:34.796500 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T20:43:34 UTC (1736801014) Jan 13 20:43:34.796548 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 20:43:34.796557 kernel: intel_pstate: CPU model not supported Jan 13 20:43:34.796564 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 20:43:34.796571 kernel: NET: Registered PF_INET6 protocol family Jan 13 20:43:34.796577 kernel: Segment Routing with IPv6 Jan 13 20:43:34.796584 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 20:43:34.796590 kernel: NET: Registered PF_PACKET protocol family Jan 13 20:43:34.796596 kernel: Key type dns_resolver registered Jan 13 20:43:34.796605 kernel: IPI shorthand broadcast: enabled Jan 13 20:43:34.796611 kernel: sched_clock: Marking stable (866003432, 222486605)->(1145558823, -57068786) Jan 13 20:43:34.796619 kernel: registered taskstats version 1 Jan 13 20:43:34.796625 kernel: Loading compiled-in X.509 certificates Jan 13 20:43:34.796631 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: ede78b3e719729f95eaaf7cb6a5289b567f6ee3e' Jan 13 20:43:34.796638 kernel: Key type .fscrypt registered Jan 13 20:43:34.796644 kernel: Key type fscrypt-provisioning registered Jan 13 20:43:34.796650 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 20:43:34.796657 kernel: ima: Allocated hash algorithm: sha1 Jan 13 20:43:34.796665 kernel: ima: No architecture policies found Jan 13 20:43:34.796671 kernel: clk: Disabling unused clocks Jan 13 20:43:34.796678 kernel: Freeing unused kernel image (initmem) memory: 43320K Jan 13 20:43:34.796684 kernel: Write protecting the kernel read-only data: 38912k Jan 13 20:43:34.796690 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K Jan 13 20:43:34.796697 kernel: Run /init as init process Jan 13 20:43:34.796703 kernel: with arguments: Jan 13 20:43:34.796709 kernel: /init Jan 13 20:43:34.796716 kernel: with environment: Jan 13 20:43:34.796723 kernel: HOME=/ Jan 13 20:43:34.796729 kernel: TERM=linux Jan 13 20:43:34.796735 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 20:43:34.796743 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:43:34.796751 systemd[1]: Detected virtualization vmware. Jan 13 20:43:34.797177 systemd[1]: Detected architecture x86-64. Jan 13 20:43:34.797186 systemd[1]: Running in initrd. Jan 13 20:43:34.797193 systemd[1]: No hostname configured, using default hostname. Jan 13 20:43:34.797202 systemd[1]: Hostname set to . Jan 13 20:43:34.797208 systemd[1]: Initializing machine ID from random generator. Jan 13 20:43:34.797215 systemd[1]: Queued start job for default target initrd.target. Jan 13 20:43:34.797221 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:43:34.797228 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 20:43:34.797235 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 20:43:34.797242 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:43:34.797249 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 20:43:34.797257 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 20:43:34.797265 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 20:43:34.797271 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 20:43:34.797278 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:43:34.797285 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:43:34.797292 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:43:34.797298 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:43:34.797307 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:43:34.797313 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:43:34.797319 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:43:34.797326 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:43:34.797333 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 20:43:34.797339 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 20:43:34.797346 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:43:34.797353 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:43:34.797361 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 20:43:34.797368 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:43:34.797374 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 20:43:34.797381 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:43:34.797388 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 20:43:34.797394 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 20:43:34.797401 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:43:34.797724 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:43:34.797736 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:43:34.797758 systemd-journald[217]: Collecting audit messages is disabled. Jan 13 20:43:34.797784 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 20:43:34.797791 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:43:34.797798 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 20:43:34.797808 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:43:34.797815 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 20:43:34.797822 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:43:34.797829 kernel: Bridge firewalling registered Jan 13 20:43:34.797837 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:43:34.797843 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:43:34.797850 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:43:34.797857 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 13 20:43:34.797864 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:43:34.797871 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:43:34.797877 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:43:34.797885 systemd-journald[217]: Journal started Jan 13 20:43:34.797901 systemd-journald[217]: Runtime Journal (/run/log/journal/5e3b0bbb7bbf4bfda61a998dad51be59) is 4.8M, max 38.6M, 33.8M free. Jan 13 20:43:34.752809 systemd-modules-load[218]: Inserted module 'overlay' Jan 13 20:43:34.773713 systemd-modules-load[218]: Inserted module 'br_netfilter' Jan 13 20:43:34.800048 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:43:34.800401 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:43:34.804939 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 20:43:34.808135 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 20:43:34.812395 dracut-cmdline[248]: dracut-dracut-053 Jan 13 20:43:34.812700 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:43:34.813865 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 13 20:43:34.816645 dracut-cmdline[248]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5 Jan 13 20:43:34.842835 systemd-resolved[260]: Positive Trust Anchors: Jan 13 20:43:34.842842 systemd-resolved[260]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 20:43:34.842866 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:43:34.844430 systemd-resolved[260]: Defaulting to hostname 'linux'. Jan 13 20:43:34.845272 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:43:34.845419 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:43:34.863781 kernel: SCSI subsystem initialized Jan 13 20:43:34.869797 kernel: Loading iSCSI transport class v2.0-870. Jan 13 20:43:34.875780 kernel: iscsi: registered transport (tcp) Jan 13 20:43:34.889784 kernel: iscsi: registered transport (qla4xxx) Jan 13 20:43:34.889818 kernel: QLogic iSCSI HBA Driver Jan 13 20:43:34.909259 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 20:43:34.913872 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 20:43:34.928182 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 13 20:43:34.928212 kernel: device-mapper: uevent: version 1.0.3 Jan 13 20:43:34.929240 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 13 20:43:34.960789 kernel: raid6: avx2x4 gen() 47320 MB/s Jan 13 20:43:34.976788 kernel: raid6: avx2x2 gen() 52062 MB/s Jan 13 20:43:34.994040 kernel: raid6: avx2x1 gen() 42997 MB/s Jan 13 20:43:34.994084 kernel: raid6: using algorithm avx2x2 gen() 52062 MB/s Jan 13 20:43:35.011986 kernel: raid6: .... xor() 30694 MB/s, rmw enabled Jan 13 20:43:35.012028 kernel: raid6: using avx2x2 recovery algorithm Jan 13 20:43:35.025783 kernel: xor: automatically using best checksumming function avx Jan 13 20:43:35.116808 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 20:43:35.122317 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:43:35.127862 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:43:35.135109 systemd-udevd[435]: Using default interface naming scheme 'v255'. Jan 13 20:43:35.137566 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:43:35.144861 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 20:43:35.151395 dracut-pre-trigger[438]: rd.md=0: removing MD RAID activation Jan 13 20:43:35.167323 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:43:35.171840 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 20:43:35.241574 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:43:35.244872 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 20:43:35.251573 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 20:43:35.252346 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 13 20:43:35.252641 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:43:35.252906 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 20:43:35.256881 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 13 20:43:35.265286 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:43:35.306800 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 13 20:43:35.317256 kernel: vmw_pvscsi: using 64bit dma Jan 13 20:43:35.317278 kernel: vmw_pvscsi: max_id: 16 Jan 13 20:43:35.317286 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 13 20:43:35.319846 kernel: cryptd: max_cpu_qlen set to 1000 Jan 13 20:43:35.319864 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 13 20:43:35.321802 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 13 20:43:35.321818 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 13 20:43:35.321827 kernel: vmw_pvscsi: using MSI-X Jan 13 20:43:35.329639 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 13 20:43:35.329680 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 13 20:43:35.339359 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 13 20:43:35.339445 kernel: AVX2 version of gcm_enc/dec engaged. Jan 13 20:43:35.339459 kernel: AES CTR mode by8 optimization enabled Jan 13 20:43:35.339467 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 13 20:43:35.339537 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 13 20:43:35.334828 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:43:35.334906 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:43:35.335073 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Jan 13 20:43:35.335166 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 20:43:35.335228 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:43:35.335335 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:43:35.340612 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:43:35.343811 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 13 20:43:35.349791 kernel: libata version 3.00 loaded. Jan 13 20:43:35.365434 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 13 20:43:35.366353 kernel: scsi host1: ata_piix Jan 13 20:43:35.368120 kernel: scsi host2: ata_piix Jan 13 20:43:35.368188 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 13 20:43:35.368197 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 13 20:43:35.368948 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 13 20:43:35.377371 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 13 20:43:35.377464 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 13 20:43:35.377557 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 13 20:43:35.377672 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 13 20:43:35.377755 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:43:35.377767 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 13 20:43:35.374578 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:43:35.378891 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:43:35.389101 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 13 20:43:35.535795 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 13 20:43:35.539835 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 13 20:43:35.560781 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 13 20:43:35.568673 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 13 20:43:35.568691 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 13 20:43:35.570778 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (491) Jan 13 20:43:35.577252 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 20:43:35.579740 kernel: BTRFS: device fsid 7f507843-6957-466b-8fb7-5bee228b170a devid 1 transid 44 /dev/sda3 scanned by (udev-worker) (498) Jan 13 20:43:35.582645 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 13 20:43:35.585544 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 13 20:43:35.587687 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 13 20:43:35.587831 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 13 20:43:35.596859 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 13 20:43:35.622786 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:43:36.629846 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:43:36.629892 disk-uuid[595]: The operation has completed successfully. Jan 13 20:43:36.673198 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 20:43:36.673259 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 20:43:36.677957 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Jan 13 20:43:36.679711 sh[612]: Success Jan 13 20:43:36.687844 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 13 20:43:36.729908 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 20:43:36.735522 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 20:43:36.735828 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 20:43:36.750848 kernel: BTRFS info (device dm-0): first mount of filesystem 7f507843-6957-466b-8fb7-5bee228b170a Jan 13 20:43:36.750871 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:43:36.750880 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 20:43:36.753196 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 20:43:36.753209 kernel: BTRFS info (device dm-0): using free space tree Jan 13 20:43:36.760785 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 13 20:43:36.762366 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 13 20:43:36.776963 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 13 20:43:36.778125 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 13 20:43:36.800795 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:43:36.800817 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:43:36.800826 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:43:36.819119 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:43:36.823252 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 20:43:36.824779 kernel: BTRFS info (device sda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:43:36.827893 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Jan 13 20:43:36.832000 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 20:43:36.843534 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:43:36.848845 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 13 20:43:36.904623 ignition[671]: Ignition 2.20.0 Jan 13 20:43:36.904632 ignition[671]: Stage: fetch-offline Jan 13 20:43:36.904651 ignition[671]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:43:36.904656 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:43:36.904706 ignition[671]: parsed url from cmdline: "" Jan 13 20:43:36.904708 ignition[671]: no config URL provided Jan 13 20:43:36.904710 ignition[671]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 20:43:36.904715 ignition[671]: no config at "/usr/lib/ignition/user.ign" Jan 13 20:43:36.905115 ignition[671]: config successfully fetched Jan 13 20:43:36.905131 ignition[671]: parsing config with SHA512: f6fbbbb403358a4cc64f616ca18a0b3fbdba56da6ce872139f7626b5376dbbb3f8db144313a4c698bf4717e921d94884fc5f2a85c82a0784fb194e291d8da999 Jan 13 20:43:36.907550 unknown[671]: fetched base config from "system" Jan 13 20:43:36.907825 ignition[671]: fetch-offline: fetch-offline passed Jan 13 20:43:36.907556 unknown[671]: fetched user config from "vmware" Jan 13 20:43:36.907865 ignition[671]: Ignition finished successfully Jan 13 20:43:36.908366 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:43:36.910213 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:43:36.913852 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Jan 13 20:43:36.925041 systemd-networkd[806]: lo: Link UP Jan 13 20:43:36.925047 systemd-networkd[806]: lo: Gained carrier Jan 13 20:43:36.925703 systemd-networkd[806]: Enumeration completed Jan 13 20:43:36.925961 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:43:36.925983 systemd-networkd[806]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jan 13 20:43:36.926090 systemd[1]: Reached target network.target - Network. Jan 13 20:43:36.926179 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jan 13 20:43:36.930578 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 20:43:36.930678 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 20:43:36.929019 systemd-networkd[806]: ens192: Link UP Jan 13 20:43:36.929022 systemd-networkd[806]: ens192: Gained carrier Jan 13 20:43:36.940125 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 20:43:36.945693 ignition[808]: Ignition 2.20.0 Jan 13 20:43:36.945700 ignition[808]: Stage: kargs Jan 13 20:43:36.945801 ignition[808]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:43:36.945807 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:43:36.946277 ignition[808]: kargs: kargs passed Jan 13 20:43:36.946302 ignition[808]: Ignition finished successfully Jan 13 20:43:36.947213 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 20:43:36.952033 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Jan 13 20:43:36.957811 ignition[815]: Ignition 2.20.0 Jan 13 20:43:36.957818 ignition[815]: Stage: disks Jan 13 20:43:36.957925 ignition[815]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:43:36.957931 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:43:36.958440 ignition[815]: disks: disks passed Jan 13 20:43:36.958470 ignition[815]: Ignition finished successfully Jan 13 20:43:36.959036 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 20:43:36.959362 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 20:43:36.959458 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 20:43:36.959554 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:43:36.959636 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:43:36.959717 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:43:36.965934 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 20:43:36.976030 systemd-fsck[823]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 20:43:36.976830 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 20:43:36.979828 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 13 20:43:37.030243 kernel: EXT4-fs (sda9): mounted filesystem 59ba8ffc-e6b0-4bb4-a36e-13a47bd6ad99 r/w with ordered data mode. Quota mode: none. Jan 13 20:43:37.030172 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 20:43:37.030628 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 20:43:37.039939 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:43:37.041215 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Jan 13 20:43:37.041503 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 20:43:37.041528 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 20:43:37.041542 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:43:37.044292 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 20:43:37.047801 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (831) Jan 13 20:43:37.050426 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 20:43:37.050440 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:43:37.050448 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:43:37.051852 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 13 20:43:37.053834 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:43:37.054085 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 20:43:37.076500 initrd-setup-root[855]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 20:43:37.078783 initrd-setup-root[862]: cut: /sysroot/etc/group: No such file or directory Jan 13 20:43:37.080986 initrd-setup-root[869]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 20:43:37.083306 initrd-setup-root[876]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 20:43:37.132580 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 20:43:37.139899 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 20:43:37.141241 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Jan 13 20:43:37.145860 kernel: BTRFS info (device sda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968
Jan 13 20:43:37.161110 ignition[944]: INFO : Ignition 2.20.0
Jan 13 20:43:37.161110 ignition[944]: INFO : Stage: mount
Jan 13 20:43:37.161561 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:37.161561 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:37.162405 ignition[944]: INFO : mount: mount passed
Jan 13 20:43:37.162405 ignition[944]: INFO : Ignition finished successfully
Jan 13 20:43:37.161887 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 13 20:43:37.162282 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 13 20:43:37.165938 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 13 20:43:37.749242 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 13 20:43:37.753893 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 20:43:37.766224 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (955)
Jan 13 20:43:37.766250 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968
Jan 13 20:43:37.766267 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:43:37.766278 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:43:37.769782 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:43:37.771021 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:43:37.786609 ignition[972]: INFO : Ignition 2.20.0
Jan 13 20:43:37.786609 ignition[972]: INFO : Stage: files
Jan 13 20:43:37.787116 ignition[972]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:37.787116 ignition[972]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:37.787390 ignition[972]: DEBUG : files: compiled without relabeling support, skipping
Jan 13 20:43:37.788010 ignition[972]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 13 20:43:37.788010 ignition[972]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 13 20:43:37.790039 ignition[972]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 13 20:43:37.790198 ignition[972]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 13 20:43:37.790351 ignition[972]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 13 20:43:37.790260 unknown[972]: wrote ssh authorized keys file for user: core
Jan 13 20:43:37.792405 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:43:37.792653 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jan 13 20:43:37.833584 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 13 20:43:37.976453 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1
Jan 13 20:43:38.460293 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 13 20:43:38.699724 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:43:38.699999 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:43:38.699999 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:43:38.699999 ignition[972]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Jan 13 20:43:38.699999 ignition[972]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:43:38.760851 systemd-networkd[806]: ens192: Gained IPv6LL
Jan 13 20:43:38.780487 ignition[972]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: files passed
Jan 13 20:43:38.783477 ignition[972]: INFO : Ignition finished successfully
Jan 13 20:43:38.784885 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 13 20:43:38.789915 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 13 20:43:38.791821 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 13 20:43:38.792505 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 13 20:43:38.792724 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 13 20:43:38.800089 initrd-setup-root-after-ignition[1002]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:43:38.800089 initrd-setup-root-after-ignition[1002]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:43:38.800483 initrd-setup-root-after-ignition[1006]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:43:38.801111 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:43:38.801416 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 13 20:43:38.804854 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 13 20:43:38.816465 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 13 20:43:38.816520 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 13 20:43:38.816812 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 13 20:43:38.816918 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 13 20:43:38.817104 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 13 20:43:38.817500 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 13 20:43:38.826067 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:43:38.829924 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 13 20:43:38.835292 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:43:38.835448 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:43:38.835653 systemd[1]: Stopped target timers.target - Timer Units.
Jan 13 20:43:38.835860 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 13 20:43:38.835915 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:43:38.836243 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 13 20:43:38.836382 systemd[1]: Stopped target basic.target - Basic System.
Jan 13 20:43:38.836547 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 13 20:43:38.836722 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:43:38.836922 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 13 20:43:38.837312 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 13 20:43:38.837498 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:43:38.837692 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 13 20:43:38.837891 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 13 20:43:38.838068 systemd[1]: Stopped target swap.target - Swaps.
Jan 13 20:43:38.838220 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 13 20:43:38.838277 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:43:38.838509 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:43:38.838638 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:43:38.838851 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 13 20:43:38.838893 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:43:38.839028 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 13 20:43:38.839082 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:43:38.839326 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 13 20:43:38.839384 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:43:38.839626 systemd[1]: Stopped target paths.target - Path Units.
Jan 13 20:43:38.839757 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 13 20:43:38.844860 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:43:38.845010 systemd[1]: Stopped target slices.target - Slice Units.
Jan 13 20:43:38.845200 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 13 20:43:38.845367 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 13 20:43:38.845428 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 20:43:38.845630 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 13 20:43:38.845672 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 20:43:38.845906 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 13 20:43:38.845962 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:43:38.846198 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 13 20:43:38.846250 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 13 20:43:38.852994 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 13 20:43:38.854244 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 13 20:43:38.854336 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 13 20:43:38.854418 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:43:38.854615 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 13 20:43:38.854689 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:43:38.857399 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 13 20:43:38.857453 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 13 20:43:38.860691 ignition[1026]: INFO : Ignition 2.20.0
Jan 13 20:43:38.864082 ignition[1026]: INFO : Stage: umount
Jan 13 20:43:38.864082 ignition[1026]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:38.864082 ignition[1026]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:38.864082 ignition[1026]: INFO : umount: umount passed
Jan 13 20:43:38.864082 ignition[1026]: INFO : Ignition finished successfully
Jan 13 20:43:38.863147 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 13 20:43:38.863208 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 13 20:43:38.863505 systemd[1]: Stopped target network.target - Network.
Jan 13 20:43:38.863674 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 13 20:43:38.863702 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 13 20:43:38.863811 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 13 20:43:38.863833 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 13 20:43:38.864007 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 13 20:43:38.864027 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 13 20:43:38.864153 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 13 20:43:38.864174 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 13 20:43:38.864376 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 13 20:43:38.864515 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 13 20:43:38.871367 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 13 20:43:38.871425 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 13 20:43:38.872680 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 13 20:43:38.873095 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 13 20:43:38.873142 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 13 20:43:38.874550 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 13 20:43:38.874578 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:43:38.878912 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 13 20:43:38.878995 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 13 20:43:38.879020 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:43:38.879136 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jan 13 20:43:38.879158 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 20:43:38.879264 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 13 20:43:38.879284 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:43:38.879379 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 13 20:43:38.879399 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:43:38.879494 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 13 20:43:38.879514 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:43:38.879654 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:43:38.885839 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 13 20:43:38.885895 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 13 20:43:38.890173 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 13 20:43:38.890245 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:43:38.890535 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 13 20:43:38.890564 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:43:38.890760 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 13 20:43:38.890815 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:43:38.890961 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 13 20:43:38.890983 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:43:38.891238 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 13 20:43:38.891258 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:43:38.891532 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:43:38.891554 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:43:38.896128 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 13 20:43:38.896238 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 13 20:43:38.896265 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:43:38.896378 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 13 20:43:38.896400 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 20:43:38.896509 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 13 20:43:38.896530 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:43:38.896636 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:43:38.896655 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:43:38.898842 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 13 20:43:38.898892 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 13 20:43:38.929521 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 13 20:43:38.929574 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 13 20:43:38.929845 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 13 20:43:38.929950 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 13 20:43:38.929975 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 13 20:43:38.937940 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 13 20:43:38.944824 systemd[1]: Switching root.
Jan 13 20:43:38.974330 systemd-journald[217]: Journal stopped
Jan 13 20:43:34.739803 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 18:58:40 -00 2025
Jan 13 20:43:34.739821 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 20:43:34.739828 kernel: Disabled fast string operations
Jan 13 20:43:34.739832 kernel: BIOS-provided physical RAM map:
Jan 13 20:43:34.739836 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 13 20:43:34.739840 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 13 20:43:34.739846 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 13 20:43:34.739851 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 13 20:43:34.739855 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 13 20:43:34.739859 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 13 20:43:34.739864 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 13 20:43:34.739868 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 13 20:43:34.739872 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 13 20:43:34.739877 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 13 20:43:34.739883 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 13 20:43:34.739888 kernel: NX (Execute Disable) protection: active
Jan 13 20:43:34.739893 kernel: APIC: Static calls initialized
Jan 13 20:43:34.739898 kernel: SMBIOS 2.7 present.
Jan 13 20:43:34.739903 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 13 20:43:34.739908 kernel: vmware: hypercall mode: 0x00
Jan 13 20:43:34.739913 kernel: Hypervisor detected: VMware
Jan 13 20:43:34.739918 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 13 20:43:34.739924 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 13 20:43:34.739929 kernel: vmware: using clock offset of 2521513259 ns
Jan 13 20:43:34.739934 kernel: tsc: Detected 3408.000 MHz processor
Jan 13 20:43:34.739939 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 20:43:34.739944 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 20:43:34.739949 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 13 20:43:34.739954 kernel: total RAM covered: 3072M
Jan 13 20:43:34.739959 kernel: Found optimal setting for mtrr clean up
Jan 13 20:43:34.739967 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 13 20:43:34.739972 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 13 20:43:34.739978 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 20:43:34.739983 kernel: Using GB pages for direct mapping
Jan 13 20:43:34.739988 kernel: ACPI: Early table checksum verification disabled
Jan 13 20:43:34.739993 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 13 20:43:34.739998 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 13 20:43:34.740012 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 13 20:43:34.740018 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 13 20:43:34.740023 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:43:34.740031 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:43:34.740037 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 13 20:43:34.740042 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 13 20:43:34.740047 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 13 20:43:34.740053 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 13 20:43:34.740058 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 13 20:43:34.740064 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 13 20:43:34.740070 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 13 20:43:34.740075 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 13 20:43:34.740080 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:43:34.740085 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:43:34.740090 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 13 20:43:34.740096 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 13 20:43:34.740101 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 13 20:43:34.740106 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 13 20:43:34.740113 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 13 20:43:34.740118 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 13 20:43:34.740124 kernel: system APIC only can use physical flat
Jan 13 20:43:34.740129 kernel: APIC: Switched APIC routing to: physical flat
Jan 13 20:43:34.740134 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 20:43:34.740139 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 20:43:34.740145 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 20:43:34.740150 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 20:43:34.740155 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 20:43:34.740160 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 20:43:34.740167 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 20:43:34.740172 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 20:43:34.740177 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 13 20:43:34.740182 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 13 20:43:34.740187 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 13 20:43:34.740192 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 13 20:43:34.740197 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 13 20:43:34.740202 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 13 20:43:34.740207 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 13 20:43:34.740213 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 13 20:43:34.740219 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 13 20:43:34.740224 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 13 20:43:34.740229 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 13 20:43:34.740234 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 13 20:43:34.740239 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 13 20:43:34.740244 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 13 20:43:34.740250 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 13 20:43:34.740255 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 13 20:43:34.740260 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 13 20:43:34.740265 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 13 20:43:34.740271 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 13 20:43:34.740276 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 13 20:43:34.740281 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 13 20:43:34.740287 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 13 20:43:34.740292 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 13 20:43:34.740298 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 13 20:43:34.740303 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 13 20:43:34.740308 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 13 20:43:34.740313 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 13 20:43:34.740318 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 13 20:43:34.740324 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 13 20:43:34.740330 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 13 20:43:34.740335 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 13 20:43:34.740342 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 13 20:43:34.740350 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 13 20:43:34.740355 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 13 20:43:34.740360 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 13 20:43:34.740365 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 13 20:43:34.740370 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 13 20:43:34.740377 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 13 20:43:34.740387 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 13 20:43:34.740396 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 13 20:43:34.740402 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 13 20:43:34.740407 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 13 20:43:34.740412 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 13 20:43:34.740418 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 13 20:43:34.740423 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 13 20:43:34.740428 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 13 20:43:34.740433 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 13 20:43:34.740438 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 13 20:43:34.740443 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 13 20:43:34.740450 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 13 20:43:34.740455 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 13 20:43:34.740464 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 13 20:43:34.740470 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 13 20:43:34.740476 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 13 20:43:34.740484 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 13 20:43:34.740492 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 13 20:43:34.740500 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 13 20:43:34.740509 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 13 20:43:34.740518 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 13 20:43:34.740525 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 13 20:43:34.740533 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 13 20:43:34.740540 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 13 20:43:34.740548 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 13 20:43:34.740556 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 13 20:43:34.740563 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 13 20:43:34.740571 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 13 20:43:34.740579 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 13 20:43:34.740588 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 13 20:43:34.740596 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 13 20:43:34.740604 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 13 20:43:34.740612 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 13 20:43:34.740619 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 13 20:43:34.740625 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 13 20:43:34.740630 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 13 20:43:34.740635 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 13 20:43:34.740640 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 13 20:43:34.740646 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 13 20:43:34.740653 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 13 20:43:34.740658 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 13 20:43:34.740664 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 13 20:43:34.740669 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 13 20:43:34.740675 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 13 20:43:34.740680 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 13 20:43:34.740685 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 13 20:43:34.740691 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 13 20:43:34.740696 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 13 20:43:34.740702 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 13 20:43:34.740707 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 13 20:43:34.740714 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 13 20:43:34.740720 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 13 20:43:34.740725 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 13 20:43:34.740730 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 13 20:43:34.740736 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 13 20:43:34.740742 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 13 20:43:34.740747 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 13 20:43:34.740752 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 13 20:43:34.740758 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 13 20:43:34.740763 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 13 20:43:34.740835 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 13 20:43:34.740842 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 13 20:43:34.740848 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 13 20:43:34.740853 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 13 20:43:34.740859 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 13 20:43:34.740864 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 13 20:43:34.740869 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 13 20:43:34.740875 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 13 20:43:34.740880 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 13 20:43:34.740886 kernel: SRAT: PXM 0
-> APIC 0xe6 -> Node 0 Jan 13 20:43:34.740893 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0 Jan 13 20:43:34.740899 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0 Jan 13 20:43:34.740904 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0 Jan 13 20:43:34.740910 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0 Jan 13 20:43:34.740915 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0 Jan 13 20:43:34.740921 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0 Jan 13 20:43:34.740926 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0 Jan 13 20:43:34.740932 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0 Jan 13 20:43:34.740937 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0 Jan 13 20:43:34.740943 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0 Jan 13 20:43:34.740949 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0 Jan 13 20:43:34.740955 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0 Jan 13 20:43:34.740960 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Jan 13 20:43:34.740966 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Jan 13 20:43:34.740972 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Jan 13 20:43:34.740977 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] Jan 13 20:43:34.740983 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff] Jan 13 20:43:34.740989 kernel: Zone ranges: Jan 13 20:43:34.740994 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 13 20:43:34.741000 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Jan 13 20:43:34.741007 kernel: Normal empty Jan 13 20:43:34.741012 kernel: Movable zone start for each node Jan 13 20:43:34.741018 kernel: Early memory node ranges Jan 13 20:43:34.741023 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Jan 13 20:43:34.741029 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Jan 13 20:43:34.741034 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Jan 13 20:43:34.741040 kernel: Initmem setup node 0 [mem 
0x0000000000001000-0x000000007fffffff] Jan 13 20:43:34.741046 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 13 20:43:34.741051 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Jan 13 20:43:34.741058 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Jan 13 20:43:34.741064 kernel: ACPI: PM-Timer IO Port: 0x1008 Jan 13 20:43:34.741070 kernel: system APIC only can use physical flat Jan 13 20:43:34.741075 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Jan 13 20:43:34.741081 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Jan 13 20:43:34.741087 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Jan 13 20:43:34.741092 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Jan 13 20:43:34.741098 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Jan 13 20:43:34.741103 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Jan 13 20:43:34.741109 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Jan 13 20:43:34.741116 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Jan 13 20:43:34.741121 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Jan 13 20:43:34.741127 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Jan 13 20:43:34.741132 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Jan 13 20:43:34.741137 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Jan 13 20:43:34.741143 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Jan 13 20:43:34.741148 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Jan 13 20:43:34.741154 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Jan 13 20:43:34.741159 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Jan 13 20:43:34.741166 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Jan 13 20:43:34.741171 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Jan 13 20:43:34.741177 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) 
Jan 13 20:43:34.741182 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Jan 13 20:43:34.741188 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Jan 13 20:43:34.741193 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Jan 13 20:43:34.741199 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Jan 13 20:43:34.741204 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Jan 13 20:43:34.741210 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Jan 13 20:43:34.741215 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Jan 13 20:43:34.741222 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Jan 13 20:43:34.741228 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Jan 13 20:43:34.741233 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Jan 13 20:43:34.741239 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Jan 13 20:43:34.741244 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Jan 13 20:43:34.741250 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Jan 13 20:43:34.741255 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Jan 13 20:43:34.741261 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Jan 13 20:43:34.741266 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Jan 13 20:43:34.741272 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Jan 13 20:43:34.741278 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Jan 13 20:43:34.741284 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Jan 13 20:43:34.741290 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Jan 13 20:43:34.741295 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Jan 13 20:43:34.741301 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Jan 13 20:43:34.741306 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Jan 13 20:43:34.741312 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Jan 13 
20:43:34.741317 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Jan 13 20:43:34.741323 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Jan 13 20:43:34.741330 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Jan 13 20:43:34.741336 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Jan 13 20:43:34.741341 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Jan 13 20:43:34.741347 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Jan 13 20:43:34.741352 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Jan 13 20:43:34.741358 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Jan 13 20:43:34.741363 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Jan 13 20:43:34.741369 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Jan 13 20:43:34.741374 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Jan 13 20:43:34.741379 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Jan 13 20:43:34.741386 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Jan 13 20:43:34.741392 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Jan 13 20:43:34.741397 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Jan 13 20:43:34.741403 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Jan 13 20:43:34.741408 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Jan 13 20:43:34.741414 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Jan 13 20:43:34.741419 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Jan 13 20:43:34.741425 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Jan 13 20:43:34.741430 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Jan 13 20:43:34.741435 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Jan 13 20:43:34.741442 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Jan 13 20:43:34.741448 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Jan 13 
20:43:34.741453 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Jan 13 20:43:34.741459 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Jan 13 20:43:34.741464 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Jan 13 20:43:34.741470 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Jan 13 20:43:34.741475 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Jan 13 20:43:34.741481 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Jan 13 20:43:34.741487 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Jan 13 20:43:34.741493 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Jan 13 20:43:34.741499 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Jan 13 20:43:34.741504 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Jan 13 20:43:34.741509 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Jan 13 20:43:34.741515 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Jan 13 20:43:34.741520 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Jan 13 20:43:34.741526 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Jan 13 20:43:34.741531 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Jan 13 20:43:34.741537 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Jan 13 20:43:34.741542 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Jan 13 20:43:34.741549 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Jan 13 20:43:34.741554 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Jan 13 20:43:34.741560 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Jan 13 20:43:34.741565 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Jan 13 20:43:34.741571 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Jan 13 20:43:34.741577 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Jan 13 20:43:34.741582 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Jan 13 
20:43:34.741588 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Jan 13 20:43:34.741593 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Jan 13 20:43:34.741599 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Jan 13 20:43:34.741605 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Jan 13 20:43:34.741610 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Jan 13 20:43:34.741616 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Jan 13 20:43:34.741621 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Jan 13 20:43:34.741627 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Jan 13 20:43:34.741633 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Jan 13 20:43:34.741638 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Jan 13 20:43:34.741644 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Jan 13 20:43:34.741650 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Jan 13 20:43:34.741655 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Jan 13 20:43:34.741662 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Jan 13 20:43:34.741668 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Jan 13 20:43:34.741673 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Jan 13 20:43:34.741683 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Jan 13 20:43:34.741689 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Jan 13 20:43:34.741694 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Jan 13 20:43:34.741700 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Jan 13 20:43:34.741705 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Jan 13 20:43:34.741711 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Jan 13 20:43:34.741718 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Jan 13 20:43:34.741723 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Jan 13 
20:43:34.741729 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Jan 13 20:43:34.741734 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Jan 13 20:43:34.741740 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Jan 13 20:43:34.741745 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Jan 13 20:43:34.741750 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Jan 13 20:43:34.741756 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Jan 13 20:43:34.741761 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Jan 13 20:43:34.741767 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Jan 13 20:43:34.741783 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Jan 13 20:43:34.741789 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Jan 13 20:43:34.741805 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Jan 13 20:43:34.741811 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Jan 13 20:43:34.741816 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Jan 13 20:43:34.741822 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Jan 13 20:43:34.741827 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Jan 13 20:43:34.741833 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 13 20:43:34.741839 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Jan 13 20:43:34.741846 kernel: TSC deadline timer available Jan 13 20:43:34.741852 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Jan 13 20:43:34.741857 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Jan 13 20:43:34.741863 kernel: Booting paravirtualized kernel on VMware hypervisor Jan 13 20:43:34.741869 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 13 20:43:34.741875 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Jan 13 20:43:34.741880 kernel: 
percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Jan 13 20:43:34.741886 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Jan 13 20:43:34.741892 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Jan 13 20:43:34.741899 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Jan 13 20:43:34.741904 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Jan 13 20:43:34.741910 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Jan 13 20:43:34.741915 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Jan 13 20:43:34.741928 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Jan 13 20:43:34.741935 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Jan 13 20:43:34.741943 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Jan 13 20:43:34.741948 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Jan 13 20:43:34.741954 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Jan 13 20:43:34.741961 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Jan 13 20:43:34.741967 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Jan 13 20:43:34.741973 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Jan 13 20:43:34.741979 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Jan 13 20:43:34.741984 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Jan 13 20:43:34.741990 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Jan 13 20:43:34.741997 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5 Jan 13 20:43:34.742003 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Jan 13 20:43:34.742010 kernel: random: crng init done Jan 13 20:43:34.742016 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Jan 13 20:43:34.742022 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Jan 13 20:43:34.742028 kernel: printk: log_buf_len min size: 262144 bytes Jan 13 20:43:34.742034 kernel: printk: log_buf_len: 1048576 bytes Jan 13 20:43:34.742039 kernel: printk: early log buf free: 239648(91%) Jan 13 20:43:34.742045 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 13 20:43:34.742051 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 13 20:43:34.742057 kernel: Fallback order for Node 0: 0 Jan 13 20:43:34.742064 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Jan 13 20:43:34.742070 kernel: Policy zone: DMA32 Jan 13 20:43:34.742076 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 13 20:43:34.742082 kernel: Memory: 1934348K/2096628K available (14336K kernel code, 2299K rwdata, 22800K rodata, 43320K init, 1756K bss, 162020K reserved, 0K cma-reserved) Jan 13 20:43:34.742090 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Jan 13 20:43:34.742097 kernel: ftrace: allocating 37890 entries in 149 pages Jan 13 20:43:34.742103 kernel: ftrace: allocated 149 pages with 4 groups Jan 13 20:43:34.742109 kernel: Dynamic Preempt: voluntary Jan 13 20:43:34.742115 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 13 20:43:34.742121 kernel: rcu: RCU event tracing is enabled. Jan 13 20:43:34.742127 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Jan 13 20:43:34.742133 kernel: Trampoline variant of Tasks RCU enabled. Jan 13 20:43:34.742139 kernel: Rude variant of Tasks RCU enabled. Jan 13 20:43:34.742145 kernel: Tracing variant of Tasks RCU enabled. Jan 13 20:43:34.742151 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Jan 13 20:43:34.742158 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Jan 13 20:43:34.742164 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Jan 13 20:43:34.742170 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Jan 13 20:43:34.742176 kernel: Console: colour VGA+ 80x25 Jan 13 20:43:34.742182 kernel: printk: console [tty0] enabled Jan 13 20:43:34.742188 kernel: printk: console [ttyS0] enabled Jan 13 20:43:34.742194 kernel: ACPI: Core revision 20230628 Jan 13 20:43:34.742200 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Jan 13 20:43:34.742206 kernel: APIC: Switch to symmetric I/O mode setup Jan 13 20:43:34.742213 kernel: x2apic enabled Jan 13 20:43:34.742219 kernel: APIC: Switched APIC routing to: physical x2apic Jan 13 20:43:34.742225 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 13 20:43:34.742231 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:43:34.742237 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Jan 13 20:43:34.742243 kernel: Disabled fast string operations Jan 13 20:43:34.742248 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Jan 13 20:43:34.742254 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Jan 13 20:43:34.742261 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 13 20:43:34.742268 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Jan 13 20:43:34.742274 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Jan 13 20:43:34.742280 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 13 20:43:34.742286 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Jan 13 20:43:34.742292 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 13 20:43:34.742298 kernel: RETBleed: Mitigation: Enhanced IBRS Jan 13 20:43:34.742304 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 13 20:43:34.742310 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 13 20:43:34.742316 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Jan 13 20:43:34.742323 kernel: SRBDS: Unknown: Dependent on hypervisor status Jan 13 20:43:34.742329 kernel: GDS: Unknown: Dependent on hypervisor status Jan 13 20:43:34.742335 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 13 20:43:34.742341 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 13 20:43:34.742347 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 13 20:43:34.742354 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 13 20:43:34.742360 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
Jan 13 20:43:34.742366 kernel: Freeing SMP alternatives memory: 32K Jan 13 20:43:34.742372 kernel: pid_max: default: 131072 minimum: 1024 Jan 13 20:43:34.742379 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Jan 13 20:43:34.742385 kernel: landlock: Up and running. Jan 13 20:43:34.742390 kernel: SELinux: Initializing. Jan 13 20:43:34.742396 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:43:34.742402 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:43:34.742408 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Jan 13 20:43:34.742415 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:43:34.742421 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:43:34.742427 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:43:34.742434 kernel: Performance Events: Skylake events, core PMU driver. Jan 13 20:43:34.742440 kernel: core: CPUID marked event: 'cpu cycles' unavailable Jan 13 20:43:34.742447 kernel: core: CPUID marked event: 'instructions' unavailable Jan 13 20:43:34.742452 kernel: core: CPUID marked event: 'bus cycles' unavailable Jan 13 20:43:34.742458 kernel: core: CPUID marked event: 'cache references' unavailable Jan 13 20:43:34.742464 kernel: core: CPUID marked event: 'cache misses' unavailable Jan 13 20:43:34.742469 kernel: core: CPUID marked event: 'branch instructions' unavailable Jan 13 20:43:34.742475 kernel: core: CPUID marked event: 'branch misses' unavailable Jan 13 20:43:34.742482 kernel: ... version: 1 Jan 13 20:43:34.742488 kernel: ... bit width: 48 Jan 13 20:43:34.742494 kernel: ... generic registers: 4 Jan 13 20:43:34.742500 kernel: ... value mask: 0000ffffffffffff Jan 13 20:43:34.742506 kernel: ... 
max period: 000000007fffffff Jan 13 20:43:34.742512 kernel: ... fixed-purpose events: 0 Jan 13 20:43:34.742518 kernel: ... event mask: 000000000000000f Jan 13 20:43:34.742524 kernel: signal: max sigframe size: 1776 Jan 13 20:43:34.742529 kernel: rcu: Hierarchical SRCU implementation. Jan 13 20:43:34.742536 kernel: rcu: Max phase no-delay instances is 400. Jan 13 20:43:34.742543 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 20:43:34.742548 kernel: smp: Bringing up secondary CPUs ... Jan 13 20:43:34.742554 kernel: smpboot: x86: Booting SMP configuration: Jan 13 20:43:34.742560 kernel: .... node #0, CPUs: #1 Jan 13 20:43:34.742566 kernel: Disabled fast string operations Jan 13 20:43:34.742572 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 20:43:34.742578 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 20:43:34.742584 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 20:43:34.742590 kernel: smpboot: Max logical packages: 128 Jan 13 20:43:34.742597 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 20:43:34.742603 kernel: devtmpfs: initialized Jan 13 20:43:34.742609 kernel: x86/mm: Memory block size: 128MB Jan 13 20:43:34.742615 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 20:43:34.742621 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 20:43:34.742627 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 20:43:34.742633 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 20:43:34.742639 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 20:43:34.742645 kernel: audit: initializing netlink subsys (disabled) Jan 13 20:43:34.742652 kernel: audit: type=2000 audit(1736801013.066:1): state=initialized audit_enabled=0 res=1 Jan 13 20:43:34.742658 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 20:43:34.742664 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 20:43:34.742669 kernel: cpuidle: using governor menu Jan 13 20:43:34.742675 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 20:43:34.742681 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 20:43:34.742687 kernel: dca service started, version 1.12.1 Jan 13 20:43:34.742693 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 20:43:34.742699 kernel: PCI: Using configuration type 1 for base access Jan 13 20:43:34.742706 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 20:43:34.742714 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 20:43:34.742720 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 20:43:34.742726 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 20:43:34.742731 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 20:43:34.742737 kernel: ACPI: Added _OSI(Module Device) Jan 13 20:43:34.742743 kernel: ACPI: Added _OSI(Processor Device) Jan 13 20:43:34.742749 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 20:43:34.742755 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 20:43:34.742762 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 20:43:34.742912 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 20:43:34.742922 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 20:43:34.742929 kernel: ACPI: Interpreter enabled Jan 13 20:43:34.742934 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 20:43:34.742940 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 20:43:34.742946 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 20:43:34.742952 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 20:43:34.742958 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 20:43:34.742966 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 20:43:34.743048 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 20:43:34.743104 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 20:43:34.743154 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 20:43:34.743163 kernel: PCI host bridge to bus 0000:00 Jan 13 20:43:34.743213 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 20:43:34.743261 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 20:43:34.743306 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 20:43:34.743350 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 20:43:34.743623 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 20:43:34.743671 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 20:43:34.743732 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 20:43:34.743812 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 20:43:34.743874 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 20:43:34.743930 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 20:43:34.743981 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 20:43:34.744032 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 20:43:34.744082 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 20:43:34.744132 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 20:43:34.744183 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 20:43:34.744241 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 20:43:34.744291 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 20:43:34.744341 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 20:43:34.744396 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 20:43:34.744446 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 20:43:34.744496 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 20:43:34.744552 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 20:43:34.744602 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 20:43:34.744651 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 20:43:34.744715 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 20:43:34.744766 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 20:43:34.744826 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 20:43:34.744880 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 20:43:34.746824 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.746887 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.746948 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747001 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747055 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747107 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747164 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747214 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747269 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747319 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747373 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747424 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 13 20:43:34.747481 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747532 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747586 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747636 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747702 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747754 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747833 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747886 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.747940 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.747991 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748045 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748096 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748153 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748204 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748258 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748309 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748362 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748413 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748470 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748521 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748574 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748625 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748693 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748749 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748814 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748865 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.748919 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.748971 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749024 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749075 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749132 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749184 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749238 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749289 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749344 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749396 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749450 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749504 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749558 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749609 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749679 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749748 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749881 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.749936 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.749990 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.750040 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.750094 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 13 
20:43:34.750144 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.750198 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.750251 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.750304 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:43:34.750355 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 13 20:43:34.750464 kernel: pci_bus 0000:01: extended config space not accessible Jan 13 20:43:34.750522 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:43:34.750598 kernel: pci_bus 0000:02: extended config space not accessible Jan 13 20:43:34.752715 kernel: acpiphp: Slot [32] registered Jan 13 20:43:34.752723 kernel: acpiphp: Slot [33] registered Jan 13 20:43:34.752729 kernel: acpiphp: Slot [34] registered Jan 13 20:43:34.752735 kernel: acpiphp: Slot [35] registered Jan 13 20:43:34.752740 kernel: acpiphp: Slot [36] registered Jan 13 20:43:34.752746 kernel: acpiphp: Slot [37] registered Jan 13 20:43:34.752752 kernel: acpiphp: Slot [38] registered Jan 13 20:43:34.752758 kernel: acpiphp: Slot [39] registered Jan 13 20:43:34.752764 kernel: acpiphp: Slot [40] registered Jan 13 20:43:34.752840 kernel: acpiphp: Slot [41] registered Jan 13 20:43:34.752851 kernel: acpiphp: Slot [42] registered Jan 13 20:43:34.752857 kernel: acpiphp: Slot [43] registered Jan 13 20:43:34.752862 kernel: acpiphp: Slot [44] registered Jan 13 20:43:34.752868 kernel: acpiphp: Slot [45] registered Jan 13 20:43:34.752874 kernel: acpiphp: Slot [46] registered Jan 13 20:43:34.752880 kernel: acpiphp: Slot [47] registered Jan 13 20:43:34.752886 kernel: acpiphp: Slot [48] registered Jan 13 20:43:34.752892 kernel: acpiphp: Slot [49] registered Jan 13 20:43:34.752898 kernel: acpiphp: Slot [50] registered Jan 13 20:43:34.752905 kernel: acpiphp: Slot [51] registered Jan 13 20:43:34.752911 kernel: acpiphp: Slot [52] registered Jan 13 20:43:34.752917 kernel: acpiphp: Slot [53] registered 
Jan 13 20:43:34.752922 kernel: acpiphp: Slot [54] registered Jan 13 20:43:34.752928 kernel: acpiphp: Slot [55] registered Jan 13 20:43:34.752934 kernel: acpiphp: Slot [56] registered Jan 13 20:43:34.752940 kernel: acpiphp: Slot [57] registered Jan 13 20:43:34.752946 kernel: acpiphp: Slot [58] registered Jan 13 20:43:34.752952 kernel: acpiphp: Slot [59] registered Jan 13 20:43:34.752958 kernel: acpiphp: Slot [60] registered Jan 13 20:43:34.752965 kernel: acpiphp: Slot [61] registered Jan 13 20:43:34.752971 kernel: acpiphp: Slot [62] registered Jan 13 20:43:34.752977 kernel: acpiphp: Slot [63] registered Jan 13 20:43:34.753041 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 13 20:43:34.753096 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:43:34.753147 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:43:34.753198 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:43:34.753247 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 13 20:43:34.753301 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 13 20:43:34.753351 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 13 20:43:34.753400 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 13 20:43:34.753450 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 13 20:43:34.753532 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 13 20:43:34.753591 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 13 20:43:34.753643 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 13 20:43:34.753698 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:43:34.753750 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 13 
20:43:34.754844 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 13 20:43:34.754900 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:43:34.754952 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:43:34.755002 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:43:34.755055 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:43:34.755105 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:43:34.755158 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:43:34.755209 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:43:34.755260 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:43:34.755312 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:43:34.755362 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:43:34.755414 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:43:34.755465 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:43:34.755518 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:43:34.755569 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:43:34.755621 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:43:34.755671 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:43:34.755721 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:43:34.756816 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:43:34.756905 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:43:34.756957 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:43:34.757009 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:43:34.757059 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 13 20:43:34.757110 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:43:34.757161 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:43:34.757214 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:43:34.757264 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:43:34.757320 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 13 20:43:34.757373 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 13 20:43:34.757424 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 13 20:43:34.757475 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 13 20:43:34.757526 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 13 20:43:34.757577 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:43:34.757631 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 13 20:43:34.757687 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 13 20:43:34.757739 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 13 20:43:34.759809 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:43:34.759862 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:43:34.759913 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 13 20:43:34.759964 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:43:34.760015 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:43:34.760068 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:43:34.760119 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:43:34.760171 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:43:34.760221 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:43:34.760271 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:43:34.760321 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:43:34.760373 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:43:34.760425 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:43:34.760476 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:43:34.760528 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:43:34.760578 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:43:34.760629 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:43:34.760683 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:43:34.760734 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:43:34.760797 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:43:34.760853 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:43:34.760904 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:43:34.760955 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:43:34.761006 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:43:34.761056 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:43:34.761107 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:43:34.761158 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:43:34.761208 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:43:34.761261 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:43:34.761312 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:43:34.761364 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:43:34.761415 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:43:34.761465 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:43:34.761515 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:43:34.761567 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:43:34.761617 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:43:34.761670 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:43:34.761724 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:43:34.762796 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:43:34.762852 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:43:34.762902 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:43:34.762952 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:43:34.763002 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:43:34.763055 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:43:34.763105 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:43:34.763155 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:43:34.763205 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:43:34.763256 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:43:34.763305 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:43:34.763354 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:43:34.763405 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:43:34.763457 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:43:34.763506 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:43:34.763557 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:43:34.763607 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:43:34.763657 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:43:34.763706 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:43:34.763757 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:43:34.765572 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:43:34.765629 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:43:34.765680 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:43:34.765732 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:43:34.765808 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:43:34.765860 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:43:34.765910 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:43:34.765960 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:43:34.766010 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:43:34.766065 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 
20:43:34.766114 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:43:34.766164 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:43:34.766216 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:43:34.766265 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:43:34.766315 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:43:34.766365 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:43:34.766416 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:43:34.766468 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:43:34.766519 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:43:34.766569 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:43:34.766618 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:43:34.766627 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jan 13 20:43:34.766633 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jan 13 20:43:34.766639 kernel: ACPI: PCI: Interrupt link LNKB disabled Jan 13 20:43:34.766646 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 13 20:43:34.766653 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jan 13 20:43:34.766659 kernel: iommu: Default domain type: Translated Jan 13 20:43:34.766665 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 13 20:43:34.766672 kernel: PCI: Using ACPI for IRQ routing Jan 13 20:43:34.766678 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 13 20:43:34.766684 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jan 13 20:43:34.766690 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jan 13 20:43:34.766740 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jan 13 20:43:34.766804 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Jan 13 20:43:34.766859 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 13 20:43:34.766868 kernel: vgaarb: loaded Jan 13 20:43:34.766874 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jan 13 20:43:34.766880 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jan 13 20:43:34.766886 kernel: clocksource: Switched to clocksource tsc-early Jan 13 20:43:34.766892 kernel: VFS: Disk quotas dquot_6.6.0 Jan 13 20:43:34.766898 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 13 20:43:34.766904 kernel: pnp: PnP ACPI init Jan 13 20:43:34.766956 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jan 13 20:43:34.767006 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jan 13 20:43:34.767052 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jan 13 20:43:34.767102 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jan 13 20:43:34.767152 kernel: pnp 00:06: [dma 2] Jan 13 20:43:34.767202 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jan 13 20:43:34.767248 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jan 13 20:43:34.767296 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jan 13 20:43:34.767305 kernel: pnp: PnP ACPI: found 8 devices Jan 13 20:43:34.767311 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 13 20:43:34.767317 kernel: NET: Registered PF_INET protocol family Jan 13 20:43:34.767323 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 13 20:43:34.767329 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jan 13 20:43:34.767335 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 13 20:43:34.767342 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 13 20:43:34.767347 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jan 13 20:43:34.767355 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jan 13 20:43:34.767361 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:43:34.767367 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:43:34.767373 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 13 20:43:34.767379 kernel: NET: Registered PF_XDP protocol family Jan 13 20:43:34.767430 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jan 13 20:43:34.767482 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 13 20:43:34.767532 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 13 20:43:34.767586 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 13 20:43:34.767637 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 13 20:43:34.767698 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 13 20:43:34.767750 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 13 20:43:34.767831 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 13 20:43:34.767888 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 13 20:43:34.767939 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 13 20:43:34.767990 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 13 20:43:34.768041 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 13 20:43:34.768374 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 13 
20:43:34.768502 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 13 20:43:34.768787 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 13 20:43:34.768848 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 13 20:43:34.768923 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 13 20:43:34.768976 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 13 20:43:34.769027 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 13 20:43:34.769078 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 13 20:43:34.769132 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 13 20:43:34.769183 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 13 20:43:34.769234 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jan 13 20:43:34.769284 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:43:34.769334 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:43:34.769384 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769437 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.769489 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769539 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.769589 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769638 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.769688 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769737 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 
13 20:43:34.769818 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769870 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.769923 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.769972 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770022 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770071 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770120 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770169 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770219 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770268 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770320 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770370 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770419 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770468 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770517 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770567 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770617 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770666 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770722 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770779 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770830 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770880 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.770930 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.770979 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771029 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771079 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771132 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771182 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771231 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771281 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771330 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771380 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771430 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771480 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771533 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771582 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771643 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771708 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771761 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771867 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.771917 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.771979 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772034 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772087 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772137 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Jan 13 20:43:34.772186 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772236 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772286 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772336 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772384 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772434 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772484 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772533 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772586 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772636 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772685 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772735 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772804 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772856 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.772906 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.772956 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773005 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773058 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773132 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773203 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773257 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773307 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773356 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773406 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773455 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773505 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773556 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773610 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773661 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773719 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:43:34.773826 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:43:34.773885 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:43:34.773936 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jan 13 20:43:34.773986 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:43:34.774050 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:43:34.774098 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:43:34.774153 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Jan 13 20:43:34.774202 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:43:34.774250 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:43:34.774298 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:43:34.774346 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:43:34.774395 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:43:34.774443 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:43:34.774491 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:43:34.774542 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 
20:43:34.774591 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:43:34.774640 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:43:34.774695 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:43:34.774744 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:43:34.774857 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:43:34.774910 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:43:34.774960 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:43:34.775010 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:43:34.775063 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:43:34.775112 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:43:34.775164 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:43:34.775213 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:43:34.775263 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:43:34.775312 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:43:34.775363 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 13 20:43:34.775412 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:43:34.775462 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:43:34.775511 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:43:34.775561 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:43:34.775613 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 13 20:43:34.775664 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:43:34.775714 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:43:34.775765 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 13 20:43:34.777841 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:43:34.777902 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:43:34.777955 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:43:34.778006 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:43:34.778057 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:43:34.778110 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:43:34.778161 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:43:34.778211 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:43:34.778262 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:43:34.778312 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:43:34.778364 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:43:34.778414 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:43:34.778464 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:43:34.778514 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:43:34.778564 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:43:34.778615 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:43:34.778665 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:43:34.778721 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:43:34.778784 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:43:34.778841 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:43:34.778891 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:43:34.778941 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:43:34.778991 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:43:34.779041 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:43:34.779092 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:43:34.779143 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:43:34.779192 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:43:34.779243 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:43:34.779295 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:43:34.779348 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:43:34.779397 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:43:34.779447 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:43:34.779498 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:43:34.779548 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:43:34.779598 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:43:34.779648 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:43:34.779698 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:43:34.779749 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:43:34.779829 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:43:34.779880 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:43:34.779931 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:43:34.779980 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:43:34.780030 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:43:34.780080 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:43:34.780129 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
20:43:34.780179 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:43:34.780228 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:43:34.780278 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:43:34.780331 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:43:34.780382 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:43:34.780431 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:43:34.780481 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:43:34.780531 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:43:34.780581 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:43:34.780630 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:43:34.780683 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:43:34.780734 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:43:34.780856 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:43:34.780910 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:43:34.780960 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:43:34.781011 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:43:34.781061 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:43:34.781111 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:43:34.781161 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:43:34.781211 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:43:34.781263 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 20:43:34.781642 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:43:34.781704 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 20:43:34.781757 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:43:34.782002 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:43:34.782056 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:43:34.782108 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:43:34.782158 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:43:34.782207 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:43:34.782257 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:43:34.782306 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:43:34.782358 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:43:34.782406 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:43:34.782451 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:43:34.782494 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:43:34.782537 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:43:34.782580 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:43:34.782627 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 20:43:34.782673 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 20:43:34.782738 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:43:34.782858 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:43:34.782906 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:43:34.782951 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:43:34.782997 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:43:34.783042 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:43:34.783094 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 20:43:34.783143 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 20:43:34.783208 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:43:34.783808 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 20:43:34.783864 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 20:43:34.783913 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:43:34.783965 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 20:43:34.784013 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 20:43:34.784062 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:43:34.784112 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 20:43:34.784159 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:43:34.784208 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 20:43:34.784255 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:43:34.784305 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 20:43:34.784353 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:43:34.784403 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 20:43:34.784449 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:43:34.784502 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 20:43:34.784556 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:43:34.784611 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 20:43:34.784661 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 20:43:34.784711 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:43:34.784762 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 20:43:34.784844 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 20:43:34.784891 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:43:34.784941 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 20:43:34.784989 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 20:43:34.785041 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:43:34.785091 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 20:43:34.785137 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:43:34.785187 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 20:43:34.785234 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:43:34.785283 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 20:43:34.785332 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:43:34.785382 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 20:43:34.785428 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:43:34.785477 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 20:43:34.785523 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:43:34.785574 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 20:43:34.785622 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 20:43:34.785668 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:43:34.785722 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 20:43:34.786805 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 20:43:34.786875 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:43:34.786931 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 20:43:34.786979 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 20:43:34.787028 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:43:34.787080 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 20:43:34.787128 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:43:34.787177 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 20:43:34.787224 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:43:34.787274 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 20:43:34.787323 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:43:34.787374 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 20:43:34.787421 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:43:34.787471 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 20:43:34.787518 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:43:34.787570 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 20:43:34.787620 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 20:43:34.787667 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:43:34.787729 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 20:43:34.787970 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 20:43:34.788023 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:43:34.788074 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 20:43:34.788122 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:43:34.788176 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 20:43:34.788223 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:43:34.788275 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 20:43:34.788323 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:43:34.788373 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 20:43:34.788420 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:43:34.788472 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 20:43:34.788519 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:43:34.788569 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 20:43:34.788616 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:43:34.788671 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 20:43:34.788686 kernel: PCI: CLS 32 bytes, default 64 Jan 13 20:43:34.788695 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 20:43:34.788702 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:43:34.788709 kernel: clocksource: Switched to clocksource tsc Jan 13 20:43:34.788715 kernel: Initialise system trusted keyrings Jan 13 20:43:34.788722 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 20:43:34.788728 kernel: Key type asymmetric registered Jan 13 20:43:34.788735 kernel: Asymmetric key parser 'x509' registered Jan 13 20:43:34.788741 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 20:43:34.788747 kernel: io scheduler mq-deadline registered Jan 13 20:43:34.788755 kernel: io scheduler kyber registered Jan 13 20:43:34.788762 kernel: io scheduler bfq registered Jan 13 20:43:34.790456 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 20:43:34.790519 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.790576 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 20:43:34.790630 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.790687 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 20:43:34.790741 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.790835 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 20:43:34.790888 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.790941 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 20:43:34.790992 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791044 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 20:43:34.791095 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791151 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 20:43:34.791202 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791255 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 20:43:34.791305 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791358 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 20:43:34.791412 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791464 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 20:43:34.791515 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791567 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 20:43:34.791618 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791671 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 20:43:34.791729 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791806 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 20:43:34.791862 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.791915 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 20:43:34.791967 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.792371 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 20:43:34.792434 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.792489 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 20:43:34.792543 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.792597 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 20:43:34.792650 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.792703 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 20:43:34.792758 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.793869 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 20:43:34.793929 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.793985 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 20:43:34.794039 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794096 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 20:43:34.794150 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794203 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 20:43:34.794254 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794306 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 20:43:34.794358 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794410 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 20:43:34.794464 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794517 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 20:43:34.794569 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794623 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 20:43:34.794678 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.794732 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 20:43:34.795806 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.795872 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 20:43:34.795929 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.795984 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 20:43:34.796038 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.796096 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 20:43:34.796148 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.796201 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 20:43:34.796253 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.796305 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 20:43:34.796356 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:43:34.796368 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 20:43:34.796374 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 20:43:34.796381 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 20:43:34.796388 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 20:43:34.796394 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 20:43:34.796401 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 20:43:34.796452 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 20:43:34.796500 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T20:43:34 UTC (1736801014) Jan 13 20:43:34.796548 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 20:43:34.796557 kernel: intel_pstate: CPU model not supported Jan 13 20:43:34.796564 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 20:43:34.796571 kernel: NET: Registered PF_INET6 protocol family Jan 13 20:43:34.796577 kernel: Segment Routing with IPv6 Jan 13 20:43:34.796584 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 20:43:34.796590 kernel: NET: Registered PF_PACKET protocol family Jan 13 20:43:34.796596 kernel: Key type dns_resolver registered Jan 13 20:43:34.796605 kernel: IPI shorthand broadcast: enabled Jan 13 20:43:34.796611 kernel: sched_clock: Marking stable (866003432, 222486605)->(1145558823, -57068786) Jan 13 20:43:34.796619 kernel: registered taskstats version 1 Jan 13 20:43:34.796625 kernel: Loading compiled-in X.509 certificates Jan 13 20:43:34.796631 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: ede78b3e719729f95eaaf7cb6a5289b567f6ee3e' Jan 13 20:43:34.796638 kernel: Key type .fscrypt registered Jan 13 20:43:34.796644 kernel: Key type fscrypt-provisioning registered Jan 13 20:43:34.796650 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 20:43:34.796657 kernel: ima: Allocated hash algorithm: sha1 Jan 13 20:43:34.796665 kernel: ima: No architecture policies found Jan 13 20:43:34.796671 kernel: clk: Disabling unused clocks Jan 13 20:43:34.796678 kernel: Freeing unused kernel image (initmem) memory: 43320K Jan 13 20:43:34.796684 kernel: Write protecting the kernel read-only data: 38912k Jan 13 20:43:34.796690 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K Jan 13 20:43:34.796697 kernel: Run /init as init process Jan 13 20:43:34.796703 kernel: with arguments: Jan 13 20:43:34.796709 kernel: /init Jan 13 20:43:34.796716 kernel: with environment: Jan 13 20:43:34.796723 kernel: HOME=/ Jan 13 20:43:34.796729 kernel: TERM=linux Jan 13 20:43:34.796735 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 20:43:34.796743 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:43:34.796751 systemd[1]: Detected virtualization vmware. Jan 13 20:43:34.797177 systemd[1]: Detected architecture x86-64. Jan 13 20:43:34.797186 systemd[1]: Running in initrd. Jan 13 20:43:34.797193 systemd[1]: No hostname configured, using default hostname. Jan 13 20:43:34.797202 systemd[1]: Hostname set to . Jan 13 20:43:34.797208 systemd[1]: Initializing machine ID from random generator. Jan 13 20:43:34.797215 systemd[1]: Queued start job for default target initrd.target. Jan 13 20:43:34.797221 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:43:34.797228 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 20:43:34.797235 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 20:43:34.797242 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:43:34.797249 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 20:43:34.797257 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 20:43:34.797265 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 20:43:34.797271 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 20:43:34.797278 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:43:34.797285 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:43:34.797292 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:43:34.797298 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:43:34.797307 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:43:34.797313 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:43:34.797319 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:43:34.797326 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:43:34.797333 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 20:43:34.797339 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 20:43:34.797346 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:43:34.797353 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:43:34.797361 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 20:43:34.797368 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:43:34.797374 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 20:43:34.797381 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:43:34.797388 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 20:43:34.797394 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 20:43:34.797401 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:43:34.797724 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:43:34.797736 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:43:34.797758 systemd-journald[217]: Collecting audit messages is disabled. Jan 13 20:43:34.797784 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 20:43:34.797791 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:43:34.797798 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 20:43:34.797808 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:43:34.797815 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 20:43:34.797822 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:43:34.797829 kernel: Bridge firewalling registered Jan 13 20:43:34.797837 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:43:34.797843 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:43:34.797850 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:43:34.797857 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 13 20:43:34.797864 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 13 20:43:34.797871 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:43:34.797877 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:43:34.797885 systemd-journald[217]: Journal started
Jan 13 20:43:34.797901 systemd-journald[217]: Runtime Journal (/run/log/journal/5e3b0bbb7bbf4bfda61a998dad51be59) is 4.8M, max 38.6M, 33.8M free.
Jan 13 20:43:34.752809 systemd-modules-load[218]: Inserted module 'overlay'
Jan 13 20:43:34.773713 systemd-modules-load[218]: Inserted module 'br_netfilter'
Jan 13 20:43:34.800048 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 13 20:43:34.800401 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:43:34.804939 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 13 20:43:34.808135 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 13 20:43:34.812395 dracut-cmdline[248]: dracut-dracut-053
Jan 13 20:43:34.812700 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:43:34.813865 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 13 20:43:34.816645 dracut-cmdline[248]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 20:43:34.842835 systemd-resolved[260]: Positive Trust Anchors:
Jan 13 20:43:34.842842 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 20:43:34.842866 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 13 20:43:34.844430 systemd-resolved[260]: Defaulting to hostname 'linux'.
Jan 13 20:43:34.845272 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 13 20:43:34.845419 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:43:34.863781 kernel: SCSI subsystem initialized
Jan 13 20:43:34.869797 kernel: Loading iSCSI transport class v2.0-870.
Jan 13 20:43:34.875780 kernel: iscsi: registered transport (tcp)
Jan 13 20:43:34.889784 kernel: iscsi: registered transport (qla4xxx)
Jan 13 20:43:34.889818 kernel: QLogic iSCSI HBA Driver
Jan 13 20:43:34.909259 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:43:34.913872 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 13 20:43:34.928182 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 13 20:43:34.928212 kernel: device-mapper: uevent: version 1.0.3
Jan 13 20:43:34.929240 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 13 20:43:34.960789 kernel: raid6: avx2x4 gen() 47320 MB/s
Jan 13 20:43:34.976788 kernel: raid6: avx2x2 gen() 52062 MB/s
Jan 13 20:43:34.994040 kernel: raid6: avx2x1 gen() 42997 MB/s
Jan 13 20:43:34.994084 kernel: raid6: using algorithm avx2x2 gen() 52062 MB/s
Jan 13 20:43:35.011986 kernel: raid6: .... xor() 30694 MB/s, rmw enabled
Jan 13 20:43:35.012028 kernel: raid6: using avx2x2 recovery algorithm
Jan 13 20:43:35.025783 kernel: xor: automatically using best checksumming function avx
Jan 13 20:43:35.116808 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 13 20:43:35.122317 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:43:35.127862 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:43:35.135109 systemd-udevd[435]: Using default interface naming scheme 'v255'.
Jan 13 20:43:35.137566 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:43:35.144861 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 13 20:43:35.151395 dracut-pre-trigger[438]: rd.md=0: removing MD RAID activation
Jan 13 20:43:35.167323 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:43:35.171840 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 13 20:43:35.241574 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:43:35.244872 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 13 20:43:35.251573 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:43:35.252346 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:43:35.252641 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:43:35.252906 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 13 20:43:35.256881 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 13 20:43:35.265286 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:43:35.306800 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Jan 13 20:43:35.317256 kernel: vmw_pvscsi: using 64bit dma
Jan 13 20:43:35.317278 kernel: vmw_pvscsi: max_id: 16
Jan 13 20:43:35.317286 kernel: vmw_pvscsi: setting ring_pages to 8
Jan 13 20:43:35.319846 kernel: cryptd: max_cpu_qlen set to 1000
Jan 13 20:43:35.319864 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI
Jan 13 20:43:35.321802 kernel: vmw_pvscsi: enabling reqCallThreshold
Jan 13 20:43:35.321818 kernel: vmw_pvscsi: driver-based request coalescing enabled
Jan 13 20:43:35.321827 kernel: vmw_pvscsi: using MSI-X
Jan 13 20:43:35.329639 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Jan 13 20:43:35.329680 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Jan 13 20:43:35.339359 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Jan 13 20:43:35.339445 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 13 20:43:35.339459 kernel: AES CTR mode by8 optimization enabled
Jan 13 20:43:35.339467 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Jan 13 20:43:35.339537 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Jan 13 20:43:35.334828 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:43:35.334906 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:43:35.335073 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 20:43:35.335166 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:43:35.335228 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:43:35.335335 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:43:35.340612 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:43:35.343811 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Jan 13 20:43:35.349791 kernel: libata version 3.00 loaded.
Jan 13 20:43:35.365434 kernel: ata_piix 0000:00:07.1: version 2.13
Jan 13 20:43:35.366353 kernel: scsi host1: ata_piix
Jan 13 20:43:35.368120 kernel: scsi host2: ata_piix
Jan 13 20:43:35.368188 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14
Jan 13 20:43:35.368197 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15
Jan 13 20:43:35.368948 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Jan 13 20:43:35.377371 kernel: sd 0:0:0:0: [sda] Write Protect is off
Jan 13 20:43:35.377464 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Jan 13 20:43:35.377557 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Jan 13 20:43:35.377672 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Jan 13 20:43:35.377755 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 13 20:43:35.377767 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Jan 13 20:43:35.374578 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:43:35.378891 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 20:43:35.389101 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:43:35.535795 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Jan 13 20:43:35.539835 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Jan 13 20:43:35.560781 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Jan 13 20:43:35.568673 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 13 20:43:35.568691 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Jan 13 20:43:35.570778 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (491)
Jan 13 20:43:35.577252 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Jan 13 20:43:35.579740 kernel: BTRFS: device fsid 7f507843-6957-466b-8fb7-5bee228b170a devid 1 transid 44 /dev/sda3 scanned by (udev-worker) (498)
Jan 13 20:43:35.582645 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Jan 13 20:43:35.585544 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Jan 13 20:43:35.587687 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Jan 13 20:43:35.587831 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Jan 13 20:43:35.596859 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 13 20:43:35.622786 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 13 20:43:36.629846 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 13 20:43:36.629892 disk-uuid[595]: The operation has completed successfully.
Jan 13 20:43:36.673198 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 13 20:43:36.673259 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 13 20:43:36.677957 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jan 13 20:43:36.679711 sh[612]: Success
Jan 13 20:43:36.687844 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Jan 13 20:43:36.729908 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jan 13 20:43:36.735522 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jan 13 20:43:36.735828 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jan 13 20:43:36.750848 kernel: BTRFS info (device dm-0): first mount of filesystem 7f507843-6957-466b-8fb7-5bee228b170a
Jan 13 20:43:36.750871 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:43:36.750880 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Jan 13 20:43:36.753196 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 13 20:43:36.753209 kernel: BTRFS info (device dm-0): using free space tree
Jan 13 20:43:36.760785 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Jan 13 20:43:36.762366 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jan 13 20:43:36.776963 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Jan 13 20:43:36.778125 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 13 20:43:36.800795 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968
Jan 13 20:43:36.800817 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:43:36.800826 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:43:36.819119 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:43:36.823252 systemd[1]: mnt-oem.mount: Deactivated successfully.
Jan 13 20:43:36.824779 kernel: BTRFS info (device sda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968
Jan 13 20:43:36.827893 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 13 20:43:36.832000 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 13 20:43:36.843534 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 20:43:36.848845 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 13 20:43:36.904623 ignition[671]: Ignition 2.20.0
Jan 13 20:43:36.904632 ignition[671]: Stage: fetch-offline
Jan 13 20:43:36.904651 ignition[671]: no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:36.904656 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:36.904706 ignition[671]: parsed url from cmdline: ""
Jan 13 20:43:36.904708 ignition[671]: no config URL provided
Jan 13 20:43:36.904710 ignition[671]: reading system config file "/usr/lib/ignition/user.ign"
Jan 13 20:43:36.904715 ignition[671]: no config at "/usr/lib/ignition/user.ign"
Jan 13 20:43:36.905115 ignition[671]: config successfully fetched
Jan 13 20:43:36.905131 ignition[671]: parsing config with SHA512: f6fbbbb403358a4cc64f616ca18a0b3fbdba56da6ce872139f7626b5376dbbb3f8db144313a4c698bf4717e921d94884fc5f2a85c82a0784fb194e291d8da999
Jan 13 20:43:36.907550 unknown[671]: fetched base config from "system"
Jan 13 20:43:36.907825 ignition[671]: fetch-offline: fetch-offline passed
Jan 13 20:43:36.907556 unknown[671]: fetched user config from "vmware"
Jan 13 20:43:36.907865 ignition[671]: Ignition finished successfully
Jan 13 20:43:36.908366 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:43:36.910213 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:43:36.913852 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 13 20:43:36.925041 systemd-networkd[806]: lo: Link UP
Jan 13 20:43:36.925047 systemd-networkd[806]: lo: Gained carrier
Jan 13 20:43:36.925703 systemd-networkd[806]: Enumeration completed
Jan 13 20:43:36.925961 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 13 20:43:36.925983 systemd-networkd[806]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Jan 13 20:43:36.926090 systemd[1]: Reached target network.target - Network.
Jan 13 20:43:36.926179 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Jan 13 20:43:36.930578 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Jan 13 20:43:36.930678 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Jan 13 20:43:36.929019 systemd-networkd[806]: ens192: Link UP
Jan 13 20:43:36.929022 systemd-networkd[806]: ens192: Gained carrier
Jan 13 20:43:36.940125 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jan 13 20:43:36.945693 ignition[808]: Ignition 2.20.0
Jan 13 20:43:36.945700 ignition[808]: Stage: kargs
Jan 13 20:43:36.945801 ignition[808]: no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:36.945807 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:36.946277 ignition[808]: kargs: kargs passed
Jan 13 20:43:36.946302 ignition[808]: Ignition finished successfully
Jan 13 20:43:36.947213 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jan 13 20:43:36.952033 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jan 13 20:43:36.957811 ignition[815]: Ignition 2.20.0
Jan 13 20:43:36.957818 ignition[815]: Stage: disks
Jan 13 20:43:36.957925 ignition[815]: no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:36.957931 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:36.958440 ignition[815]: disks: disks passed
Jan 13 20:43:36.958470 ignition[815]: Ignition finished successfully
Jan 13 20:43:36.959036 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jan 13 20:43:36.959362 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jan 13 20:43:36.959458 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jan 13 20:43:36.959554 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 13 20:43:36.959636 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 13 20:43:36.959717 systemd[1]: Reached target basic.target - Basic System.
Jan 13 20:43:36.965934 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jan 13 20:43:36.976030 systemd-fsck[823]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Jan 13 20:43:36.976830 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jan 13 20:43:36.979828 systemd[1]: Mounting sysroot.mount - /sysroot...
Jan 13 20:43:37.030243 kernel: EXT4-fs (sda9): mounted filesystem 59ba8ffc-e6b0-4bb4-a36e-13a47bd6ad99 r/w with ordered data mode. Quota mode: none.
Jan 13 20:43:37.030172 systemd[1]: Mounted sysroot.mount - /sysroot.
Jan 13 20:43:37.030628 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jan 13 20:43:37.039939 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 20:43:37.041215 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jan 13 20:43:37.041503 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Jan 13 20:43:37.041528 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jan 13 20:43:37.041542 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:43:37.044292 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jan 13 20:43:37.047801 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (831)
Jan 13 20:43:37.050426 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968
Jan 13 20:43:37.050440 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:43:37.050448 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:43:37.051852 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jan 13 20:43:37.053834 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:43:37.054085 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:43:37.076500 initrd-setup-root[855]: cut: /sysroot/etc/passwd: No such file or directory
Jan 13 20:43:37.078783 initrd-setup-root[862]: cut: /sysroot/etc/group: No such file or directory
Jan 13 20:43:37.080986 initrd-setup-root[869]: cut: /sysroot/etc/shadow: No such file or directory
Jan 13 20:43:37.083306 initrd-setup-root[876]: cut: /sysroot/etc/gshadow: No such file or directory
Jan 13 20:43:37.132580 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 13 20:43:37.139899 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 13 20:43:37.141241 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 13 20:43:37.145860 kernel: BTRFS info (device sda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968
Jan 13 20:43:37.161110 ignition[944]: INFO : Ignition 2.20.0
Jan 13 20:43:37.161110 ignition[944]: INFO : Stage: mount
Jan 13 20:43:37.161561 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:37.161561 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:37.162405 ignition[944]: INFO : mount: mount passed
Jan 13 20:43:37.162405 ignition[944]: INFO : Ignition finished successfully
Jan 13 20:43:37.161887 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 13 20:43:37.162282 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 13 20:43:37.165938 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 13 20:43:37.749242 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 13 20:43:37.753893 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 20:43:37.766224 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (955)
Jan 13 20:43:37.766250 kernel: BTRFS info (device sda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968
Jan 13 20:43:37.766267 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:43:37.766278 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:43:37.769782 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:43:37.771021 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:43:37.786609 ignition[972]: INFO : Ignition 2.20.0
Jan 13 20:43:37.786609 ignition[972]: INFO : Stage: files
Jan 13 20:43:37.787116 ignition[972]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:37.787116 ignition[972]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:37.787390 ignition[972]: DEBUG : files: compiled without relabeling support, skipping
Jan 13 20:43:37.788010 ignition[972]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 13 20:43:37.788010 ignition[972]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 13 20:43:37.790039 ignition[972]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 13 20:43:37.790198 ignition[972]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 13 20:43:37.790351 ignition[972]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 13 20:43:37.790260 unknown[972]: wrote ssh authorized keys file for user: core
Jan 13 20:43:37.792405 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:43:37.792653 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jan 13 20:43:37.833584 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 13 20:43:37.976453 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:43:37.976799 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:43:37.978147 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1
Jan 13 20:43:38.460293 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 13 20:43:38.699724 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:43:38.699999 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:43:38.699999 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:43:38.699999 ignition[972]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Jan 13 20:43:38.699999 ignition[972]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Jan 13 20:43:38.700601 ignition[972]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:43:38.760851 systemd-networkd[806]: ens192: Gained IPv6LL
Jan 13 20:43:38.780487 ignition[972]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:43:38.783477 ignition[972]: INFO : files: files passed
Jan 13 20:43:38.783477 ignition[972]: INFO : Ignition finished successfully
Jan 13 20:43:38.784885 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 13 20:43:38.789915 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 13 20:43:38.791821 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 13 20:43:38.792505 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 13 20:43:38.792724 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 13 20:43:38.800089 initrd-setup-root-after-ignition[1002]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:43:38.800089 initrd-setup-root-after-ignition[1002]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:43:38.800483 initrd-setup-root-after-ignition[1006]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:43:38.801111 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:43:38.801416 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 13 20:43:38.804854 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 13 20:43:38.816465 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 13 20:43:38.816520 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 13 20:43:38.816812 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 13 20:43:38.816918 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 13 20:43:38.817104 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 13 20:43:38.817500 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 13 20:43:38.826067 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:43:38.829924 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 13 20:43:38.835292 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:43:38.835448 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:43:38.835653 systemd[1]: Stopped target timers.target - Timer Units.
Jan 13 20:43:38.835860 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 13 20:43:38.835915 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:43:38.836243 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 13 20:43:38.836382 systemd[1]: Stopped target basic.target - Basic System.
Jan 13 20:43:38.836547 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 13 20:43:38.836722 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:43:38.836922 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 13 20:43:38.837312 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 13 20:43:38.837498 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:43:38.837692 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 13 20:43:38.837891 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 13 20:43:38.838068 systemd[1]: Stopped target swap.target - Swaps.
Jan 13 20:43:38.838220 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 13 20:43:38.838277 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:43:38.838509 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:43:38.838638 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:43:38.838851 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 13 20:43:38.838893 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:43:38.839028 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 13 20:43:38.839082 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:43:38.839326 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 13 20:43:38.839384 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:43:38.839626 systemd[1]: Stopped target paths.target - Path Units.
Jan 13 20:43:38.839757 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 13 20:43:38.844860 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:43:38.845010 systemd[1]: Stopped target slices.target - Slice Units.
Jan 13 20:43:38.845200 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 13 20:43:38.845367 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 13 20:43:38.845428 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 20:43:38.845630 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 13 20:43:38.845672 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 20:43:38.845906 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 13 20:43:38.845962 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:43:38.846198 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 13 20:43:38.846250 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 13 20:43:38.852994 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 13 20:43:38.854244 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 13 20:43:38.854336 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 13 20:43:38.854418 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:43:38.854615 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 13 20:43:38.854689 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:43:38.857399 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 13 20:43:38.857453 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 13 20:43:38.860691 ignition[1026]: INFO : Ignition 2.20.0
Jan 13 20:43:38.864082 ignition[1026]: INFO : Stage: umount
Jan 13 20:43:38.864082 ignition[1026]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:43:38.864082 ignition[1026]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:43:38.864082 ignition[1026]: INFO : umount: umount passed
Jan 13 20:43:38.864082 ignition[1026]: INFO : Ignition finished successfully
Jan 13 20:43:38.863147 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 13 20:43:38.863208 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 13 20:43:38.863505 systemd[1]: Stopped target network.target - Network.
Jan 13 20:43:38.863674 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 13 20:43:38.863702 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 13 20:43:38.863811 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 13 20:43:38.863833 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 13 20:43:38.864007 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 13 20:43:38.864027 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 13 20:43:38.864153 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 13 20:43:38.864174 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 13 20:43:38.864376 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 13 20:43:38.864515 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 13 20:43:38.871367 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 13 20:43:38.871425 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 13 20:43:38.872680 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 13 20:43:38.873095 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 13 20:43:38.873142 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 13 20:43:38.874550 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 13 20:43:38.874578 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:43:38.878912 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 13 20:43:38.878995 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 13 20:43:38.879020 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:43:38.879136 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jan 13 20:43:38.879158 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 20:43:38.879264 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 13 20:43:38.879284 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:43:38.879379 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 13 20:43:38.879399 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:43:38.879494 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 13 20:43:38.879514 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:43:38.879654 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:43:38.885839 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 13 20:43:38.885895 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 13 20:43:38.890173 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 13 20:43:38.890245 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:43:38.890535 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 13 20:43:38.890564 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:43:38.890760 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 13 20:43:38.890815 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:43:38.890961 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 13 20:43:38.890983 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:43:38.891238 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 13 20:43:38.891258 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:43:38.891532 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:43:38.891554 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:43:38.896128 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 13 20:43:38.896238 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 13 20:43:38.896265 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:43:38.896378 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 13 20:43:38.896400 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 20:43:38.896509 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 13 20:43:38.896530 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:43:38.896636 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:43:38.896655 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:43:38.898842 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 13 20:43:38.898892 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 13 20:43:38.929521 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 13 20:43:38.929574 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 13 20:43:38.929845 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 13 20:43:38.929950 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 13 20:43:38.929975 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 13 20:43:38.937940 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 13 20:43:38.944824 systemd[1]: Switching root.
Jan 13 20:43:38.974330 systemd-journald[217]: Journal stopped
Jan 13 20:43:40.015346 systemd-journald[217]: Received SIGTERM from PID 1 (systemd).
Jan 13 20:43:40.015367 kernel: SELinux: policy capability network_peer_controls=1
Jan 13 20:43:40.015376 kernel: SELinux: policy capability open_perms=1
Jan 13 20:43:40.015381 kernel: SELinux: policy capability extended_socket_class=1
Jan 13 20:43:40.015386 kernel: SELinux: policy capability always_check_network=0
Jan 13 20:43:40.015392 kernel: SELinux: policy capability cgroup_seclabel=1
Jan 13 20:43:40.015399 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jan 13 20:43:40.015405 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jan 13 20:43:40.015410 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jan 13 20:43:40.015416 kernel: audit: type=1403 audit(1736801019.507:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 13 20:43:40.015422 systemd[1]: Successfully loaded SELinux policy in 30.105ms.
Jan 13 20:43:40.015429 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 6.096ms.
Jan 13 20:43:40.015436 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 13 20:43:40.015444 systemd[1]: Detected virtualization vmware.
Jan 13 20:43:40.015450 systemd[1]: Detected architecture x86-64.
Jan 13 20:43:40.015456 systemd[1]: Detected first boot.
Jan 13 20:43:40.015463 systemd[1]: Initializing machine ID from random generator.
Jan 13 20:43:40.015470 zram_generator::config[1071]: No configuration found.
Jan 13 20:43:40.015477 systemd[1]: Populated /etc with preset unit settings.
Jan 13 20:43:40.015486 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 13 20:43:40.015493 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Jan 13 20:43:40.015499 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 13 20:43:40.015505 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 13 20:43:40.015511 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 13 20:43:40.015517 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 13 20:43:40.015525 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 13 20:43:40.015532 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 13 20:43:40.015538 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 13 20:43:40.015545 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 13 20:43:40.015551 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 13 20:43:40.015558 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 13 20:43:40.015564 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 13 20:43:40.015572 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:43:40.015578 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:43:40.015585 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 13 20:43:40.015591 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 13 20:43:40.015598 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 13 20:43:40.015604 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 13 20:43:40.015610 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jan 13 20:43:40.015617 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:43:40.015626 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jan 13 20:43:40.015634 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jan 13 20:43:40.015641 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jan 13 20:43:40.015647 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 13 20:43:40.015654 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:43:40.015660 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 13 20:43:40.015667 systemd[1]: Reached target slices.target - Slice Units.
Jan 13 20:43:40.015674 systemd[1]: Reached target swap.target - Swaps.
Jan 13 20:43:40.015681 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 13 20:43:40.015687 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 13 20:43:40.015694 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:43:40.015700 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:43:40.015708 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:43:40.015734 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 13 20:43:40.015740 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 13 20:43:40.015747 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 13 20:43:40.015788 systemd[1]: Mounting media.mount - External Media Directory...
Jan 13 20:43:40.015799 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:43:40.015806 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 13 20:43:40.015813 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 13 20:43:40.015822 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 13 20:43:40.015829 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 13 20:43:40.015836 systemd[1]: Reached target machines.target - Containers.
Jan 13 20:43:40.015843 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 13 20:43:40.015849 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Jan 13 20:43:40.015856 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 13 20:43:40.015862 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 13 20:43:40.015869 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 20:43:40.015875 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 20:43:40.015883 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 20:43:40.015890 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 13 20:43:40.015897 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 20:43:40.015904 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 13 20:43:40.015911 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 13 20:43:40.015917 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jan 13 20:43:40.015924 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jan 13 20:43:40.015931 systemd[1]: Stopped systemd-fsck-usr.service.
Jan 13 20:43:40.015939 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 13 20:43:40.015946 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 13 20:43:40.015952 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 13 20:43:40.015959 kernel: loop: module loaded
Jan 13 20:43:40.015965 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 13 20:43:40.015971 kernel: fuse: init (API version 7.39)
Jan 13 20:43:40.015978 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 13 20:43:40.015984 systemd[1]: verity-setup.service: Deactivated successfully.
Jan 13 20:43:40.015991 systemd[1]: Stopped verity-setup.service.
Jan 13 20:43:40.015999 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:43:40.016006 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 13 20:43:40.016012 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 13 20:43:40.016019 systemd[1]: Mounted media.mount - External Media Directory.
Jan 13 20:43:40.016026 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 13 20:43:40.016032 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 13 20:43:40.016039 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 13 20:43:40.016055 systemd-journald[1154]: Collecting audit messages is disabled.
Jan 13 20:43:40.016072 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:43:40.016080 systemd-journald[1154]: Journal started
Jan 13 20:43:40.016096 systemd-journald[1154]: Runtime Journal (/run/log/journal/d202d20f356c4a6f9579aa22ce7ddf6b) is 4.8M, max 38.6M, 33.8M free.
Jan 13 20:43:39.849671 systemd[1]: Queued start job for default target multi-user.target.
Jan 13 20:43:39.872357 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jan 13 20:43:39.872547 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 13 20:43:40.016562 jq[1138]: true
Jan 13 20:43:40.016844 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 13 20:43:40.018064 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 13 20:43:40.018150 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 13 20:43:40.018393 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 20:43:40.018465 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 20:43:40.019282 kernel: ACPI: bus type drm_connector registered
Jan 13 20:43:40.018932 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 13 20:43:40.019001 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 13 20:43:40.019522 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 20:43:40.019595 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 20:43:40.019832 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 13 20:43:40.019902 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 13 20:43:40.020984 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 20:43:40.021059 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 20:43:40.021274 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:43:40.021481 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 13 20:43:40.021691 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 13 20:43:40.025596 jq[1170]: true
Jan 13 20:43:40.034578 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 13 20:43:40.039849 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jan 13 20:43:40.042798 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jan 13 20:43:40.042908 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jan 13 20:43:40.042926 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 13 20:43:40.043553 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Jan 13 20:43:40.045032 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jan 13 20:43:40.048098 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jan 13 20:43:40.048242 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 20:43:40.050848 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jan 13 20:43:40.051884 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jan 13 20:43:40.052006 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 13 20:43:40.056477 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jan 13 20:43:40.056600 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 13 20:43:40.058160 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 13 20:43:40.059923 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jan 13 20:43:40.067856 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 13 20:43:40.069623 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jan 13 20:43:40.069897 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jan 13 20:43:40.070031 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jan 13 20:43:40.081082 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jan 13 20:43:40.096385 systemd-journald[1154]: Time spent on flushing to /var/log/journal/d202d20f356c4a6f9579aa22ce7ddf6b is 96.052ms for 1834 entries.
Jan 13 20:43:40.096385 systemd-journald[1154]: System Journal (/var/log/journal/d202d20f356c4a6f9579aa22ce7ddf6b) is 8.0M, max 584.8M, 576.8M free.
Jan 13 20:43:40.201196 systemd-journald[1154]: Received client request to flush runtime journal.
Jan 13 20:43:40.201228 kernel: loop0: detected capacity change from 0 to 2960
Jan 13 20:43:40.100407 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jan 13 20:43:40.133905 ignition[1181]: Ignition 2.20.0
Jan 13 20:43:40.100565 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jan 13 20:43:40.134068 ignition[1181]: deleting config from guestinfo properties
Jan 13 20:43:40.105497 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Jan 13 20:43:40.180887 ignition[1181]: Successfully deleted config
Jan 13 20:43:40.184639 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Jan 13 20:43:40.185123 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:43:40.199159 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:43:40.201909 systemd-tmpfiles[1209]: ACLs are not supported, ignoring.
Jan 13 20:43:40.201918 systemd-tmpfiles[1209]: ACLs are not supported, ignoring.
Jan 13 20:43:40.205954 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Jan 13 20:43:40.206470 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jan 13 20:43:40.206872 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jan 13 20:43:40.207106 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Jan 13 20:43:40.209406 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 20:43:40.209816 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jan 13 20:43:40.217763 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jan 13 20:43:40.226454 udevadm[1229]: systemd-udev-settle.service is deprecated.
Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Jan 13 20:43:40.235787 kernel: loop1: detected capacity change from 0 to 211296
Jan 13 20:43:40.248461 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jan 13 20:43:40.254870 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 13 20:43:40.272548 systemd-tmpfiles[1236]: ACLs are not supported, ignoring.
Jan 13 20:43:40.272562 systemd-tmpfiles[1236]: ACLs are not supported, ignoring.
Jan 13 20:43:40.276897 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:43:40.278845 kernel: loop2: detected capacity change from 0 to 141000
Jan 13 20:43:40.332851 kernel: loop3: detected capacity change from 0 to 138184
Jan 13 20:43:40.379783 kernel: loop4: detected capacity change from 0 to 2960
Jan 13 20:43:40.406791 kernel: loop5: detected capacity change from 0 to 211296
Jan 13 20:43:40.435158 kernel: loop6: detected capacity change from 0 to 141000
Jan 13 20:43:40.457589 kernel: loop7: detected capacity change from 0 to 138184
Jan 13 20:43:40.483500 (sd-merge)[1243]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Jan 13 20:43:40.484044 (sd-merge)[1243]: Merged extensions into '/usr'.
Jan 13 20:43:40.490717 systemd[1]: Reloading requested from client PID 1208 ('systemd-sysext') (unit systemd-sysext.service)...
Jan 13 20:43:40.490807 systemd[1]: Reloading...
Jan 13 20:43:40.534842 zram_generator::config[1265]: No configuration found.
Jan 13 20:43:40.605844 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 13 20:43:40.622643 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 20:43:40.652572 systemd[1]: Reloading finished in 161 ms.
Jan 13 20:43:40.675423 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jan 13 20:43:40.675734 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jan 13 20:43:40.687654 systemd[1]: Starting ensure-sysext.service...
Jan 13 20:43:40.688680 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 13 20:43:40.691845 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:43:40.700836 systemd[1]: Reloading requested from client PID 1325 ('systemctl') (unit ensure-sysext.service)...
Jan 13 20:43:40.700846 systemd[1]: Reloading...
Jan 13 20:43:40.702720 ldconfig[1203]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jan 13 20:43:40.709537 systemd-tmpfiles[1326]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jan 13 20:43:40.709873 systemd-tmpfiles[1326]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jan 13 20:43:40.710422 systemd-tmpfiles[1326]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jan 13 20:43:40.710639 systemd-tmpfiles[1326]: ACLs are not supported, ignoring.
Jan 13 20:43:40.710720 systemd-tmpfiles[1326]: ACLs are not supported, ignoring.
Jan 13 20:43:40.713048 systemd-tmpfiles[1326]: Detected autofs mount point /boot during canonicalization of boot.
Jan 13 20:43:40.713136 systemd-tmpfiles[1326]: Skipping /boot
Jan 13 20:43:40.716031 systemd-udevd[1327]: Using default interface naming scheme 'v255'.
Jan 13 20:43:40.719763 systemd-tmpfiles[1326]: Detected autofs mount point /boot during canonicalization of boot.
Jan 13 20:43:40.719822 systemd-tmpfiles[1326]: Skipping /boot
Jan 13 20:43:40.765800 zram_generator::config[1352]: No configuration found.
Jan 13 20:43:40.826806 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 44 scanned by (udev-worker) (1359)
Jan 13 20:43:40.829785 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Jan 13 20:43:40.840800 kernel: ACPI: button: Power Button [PWRF]
Jan 13 20:43:40.876503 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 13 20:43:40.894854 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 20:43:40.908777 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Jan 13 20:43:40.913783 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Jan 13 20:43:40.914809 kernel: Guest personality initialized and is active
Jan 13 20:43:40.915824 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jan 13 20:43:40.915861 kernel: Initialized host personality
Jan 13 20:43:40.938550 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Jan 13 20:43:40.938641 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Jan 13 20:43:40.939026 systemd[1]: Reloading finished in 237 ms.
Jan 13 20:43:40.947787 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3
Jan 13 20:43:40.952551 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:43:40.952960 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jan 13 20:43:40.961357 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:43:40.968333 (udev-worker)[1364]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Jan 13 20:43:40.973781 kernel: mousedev: PS/2 mouse device common for all mice
Jan 13 20:43:40.976216 systemd[1]: Finished ensure-sysext.service.
Jan 13 20:43:40.982626 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:43:40.996903 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 13 20:43:41.003727 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jan 13 20:43:41.006482 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 20:43:41.008851 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 20:43:41.009918 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 20:43:41.013654 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 20:43:41.013908 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 20:43:41.015892 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jan 13 20:43:41.016841 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jan 13 20:43:41.018962 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 13 20:43:41.020843 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 13 20:43:41.023849 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Jan 13 20:43:41.026906 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jan 13 20:43:41.027843 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:43:41.028104 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:43:41.029791 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Jan 13 20:43:41.030040 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:43:41.030127 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:43:41.030342 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 20:43:41.030414 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 20:43:41.030625 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 20:43:41.030738 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 20:43:41.031004 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:43:41.031090 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:43:41.043291 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Jan 13 20:43:41.043682 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 20:43:41.043713 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 20:43:41.046348 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 13 20:43:41.047037 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 13 20:43:41.055408 lvm[1470]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. 
Jan 13 20:43:41.059406 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 13 20:43:41.067712 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 13 20:43:41.073886 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 13 20:43:41.083601 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 13 20:43:41.083929 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Jan 13 20:43:41.084169 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 13 20:43:41.085316 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:43:41.088933 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 13 20:43:41.090027 augenrules[1495]: No rules Jan 13 20:43:41.090842 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 13 20:43:41.091028 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 20:43:41.091123 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 20:43:41.093247 lvm[1494]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 13 20:43:41.098012 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 13 20:43:41.113001 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 13 20:43:41.125837 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:43:41.141416 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 13 20:43:41.141591 systemd[1]: Reached target time-set.target - System Time Set. 
Jan 13 20:43:41.153338 systemd-networkd[1457]: lo: Link UP Jan 13 20:43:41.153342 systemd-networkd[1457]: lo: Gained carrier Jan 13 20:43:41.154209 systemd-networkd[1457]: Enumeration completed Jan 13 20:43:41.154313 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:43:41.155451 systemd-networkd[1457]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Jan 13 20:43:41.157826 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 20:43:41.157940 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 20:43:41.158056 systemd-networkd[1457]: ens192: Link UP Jan 13 20:43:41.158210 systemd-networkd[1457]: ens192: Gained carrier Jan 13 20:43:41.159849 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 13 20:43:41.161397 systemd-timesyncd[1461]: Network configuration changed, trying to establish connection. Jan 13 20:43:41.164708 systemd-resolved[1458]: Positive Trust Anchors: Jan 13 20:43:41.164719 systemd-resolved[1458]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 20:43:41.164742 systemd-resolved[1458]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:43:41.167005 systemd-resolved[1458]: Defaulting to hostname 'linux'. Jan 13 20:43:41.167912 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:43:41.168099 systemd[1]: Reached target network.target - Network. 
Jan 13 20:43:41.168227 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:43:41.168378 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:43:41.168534 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 13 20:43:41.168789 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 13 20:43:41.168977 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 13 20:43:41.169129 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 13 20:43:41.169240 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 13 20:43:41.169346 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 13 20:43:41.169360 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:43:41.169449 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:43:41.170333 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 13 20:43:41.171379 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 13 20:43:41.177774 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 13 20:43:41.178263 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 13 20:43:41.178413 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:43:41.178572 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:43:41.178701 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 13 20:43:41.178716 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 13 20:43:41.179473 systemd[1]: Starting containerd.service - containerd container runtime... 
Jan 13 20:43:41.181860 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 13 20:43:41.183657 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 13 20:43:41.185859 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 13 20:43:41.186614 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 13 20:43:41.187877 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 13 20:43:41.188933 jq[1519]: false Jan 13 20:43:41.190120 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 13 20:43:41.192031 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 13 20:43:41.193894 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 13 20:43:41.196013 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 13 20:43:41.196308 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 13 20:43:41.196877 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 13 20:43:41.197184 systemd[1]: Starting update-engine.service - Update Engine... Jan 13 20:43:41.200382 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 13 20:43:41.201327 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Jan 13 20:43:41.202554 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 13 20:43:41.202657 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 13 20:43:41.212036 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. 
Jan 13 20:43:41.212137 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 13 20:43:41.222485 jq[1530]: true Jan 13 20:43:41.224996 systemd[1]: motdgen.service: Deactivated successfully. Jan 13 20:43:41.225110 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 13 20:43:41.229316 tar[1540]: linux-amd64/helm Jan 13 20:43:41.229779 extend-filesystems[1520]: Found loop4 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found loop5 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found loop6 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found loop7 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found sda Jan 13 20:43:41.232098 extend-filesystems[1520]: Found sda1 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found sda2 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found sda3 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found usr Jan 13 20:43:41.232098 extend-filesystems[1520]: Found sda4 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found sda6 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found sda7 Jan 13 20:43:41.232098 extend-filesystems[1520]: Found sda9 Jan 13 20:43:41.232098 extend-filesystems[1520]: Checking size of /dev/sda9 Jan 13 20:43:41.240716 dbus-daemon[1518]: [system] SELinux support is enabled Jan 13 20:43:41.244033 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 13 20:43:41.245561 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 13 20:43:41.245578 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Jan 13 20:43:41.245900 update_engine[1529]: I20250113 20:43:41.245687 1529 main.cc:92] Flatcar Update Engine starting Jan 13 20:43:41.246814 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 13 20:43:41.246828 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 13 20:43:41.248421 extend-filesystems[1520]: Old size kept for /dev/sda9 Jan 13 20:43:41.248421 extend-filesystems[1520]: Found sr0 Jan 13 20:43:41.251850 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Jan 13 20:43:41.252111 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 13 20:43:41.252208 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 13 20:43:41.255834 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Jan 13 20:43:41.256403 update_engine[1529]: I20250113 20:43:41.256304 1529 update_check_scheduler.cc:74] Next update check in 5m46s Jan 13 20:43:41.258540 systemd[1]: Started update-engine.service - Update Engine. Jan 13 20:43:41.261902 (ntainerd)[1547]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 13 20:43:41.264193 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 13 20:43:41.269140 jq[1544]: true Jan 13 20:43:41.278779 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 44 scanned by (udev-worker) (1369) Jan 13 20:43:41.286925 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. 
Jan 13 20:43:41.312485 unknown[1556]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Jan 13 20:43:41.317681 unknown[1556]: Core dump limit set to -1 Jan 13 20:43:41.329780 kernel: NET: Registered PF_VSOCK protocol family Jan 13 20:43:41.337079 systemd-logind[1526]: Watching system buttons on /dev/input/event1 (Power Button) Jan 13 20:43:41.338863 systemd-logind[1526]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 13 20:43:41.339425 systemd-logind[1526]: New seat seat0. Jan 13 20:43:41.340474 systemd[1]: Started systemd-logind.service - User Login Management. Jan 13 20:43:41.362200 bash[1581]: Updated "/home/core/.ssh/authorized_keys" Jan 13 20:43:41.363015 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 13 20:43:41.363898 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Jan 13 20:43:41.431830 locksmithd[1561]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 13 20:43:41.492795 sshd_keygen[1552]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 13 20:43:41.521586 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 13 20:43:41.525952 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 13 20:43:41.536159 systemd[1]: issuegen.service: Deactivated successfully. Jan 13 20:43:41.536482 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 13 20:43:41.539654 containerd[1547]: time="2025-01-13T20:43:41.539621236Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Jan 13 20:43:41.542975 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 13 20:43:41.556978 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Jan 13 20:43:41.563763 containerd[1547]: time="2025-01-13T20:43:41.563743508Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:43:41.564841 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 13 20:43:41.564936 containerd[1547]: time="2025-01-13T20:43:41.564920842Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:43:41.565104 containerd[1547]: time="2025-01-13T20:43:41.565096063Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Jan 13 20:43:41.565141 containerd[1547]: time="2025-01-13T20:43:41.565134666Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 13 20:43:41.565251 containerd[1547]: time="2025-01-13T20:43:41.565243199Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 13 20:43:41.565285 containerd[1547]: time="2025-01-13T20:43:41.565278757Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Jan 13 20:43:41.565356 containerd[1547]: time="2025-01-13T20:43:41.565346687Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:43:41.565388 containerd[1547]: time="2025-01-13T20:43:41.565382738Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:43:41.565582 containerd[1547]: time="2025-01-13T20:43:41.565571792Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." 
error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:43:41.565651 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 13 20:43:41.566508 systemd[1]: Reached target getty.target - Login Prompts. Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.566959248Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.566972191Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.566978286Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.567022758Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.567127649Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.567187812Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.567195757Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." 
type=io.containerd.content.v1 Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.567243393Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Jan 13 20:43:41.567426 containerd[1547]: time="2025-01-13T20:43:41.567268793Z" level=info msg="metadata content store policy set" policy=shared Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577107614Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577139370Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577156598Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577196221Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577204225Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577340290Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577530671Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577599336Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577608727Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." 
type=io.containerd.sandbox.store.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577616541Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577624927Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577632383Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577638928Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Jan 13 20:43:41.578779 containerd[1547]: time="2025-01-13T20:43:41.577645926Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577653819Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577660691Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577667481Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577673415Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577689633Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." 
type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577698319Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577704933Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577711578Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577718233Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577724940Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577730928Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577737672Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577744739Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.578996 containerd[1547]: time="2025-01-13T20:43:41.577752847Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577759534Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577765776Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." 
type=io.containerd.grpc.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577781492Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577790852Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577803446Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577833059Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577839183Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577877879Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577904789Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577910632Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577916677Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577921502Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." 
type=io.containerd.grpc.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577927822Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Jan 13 20:43:41.579187 containerd[1547]: time="2025-01-13T20:43:41.577933278Z" level=info msg="NRI interface is disabled by configuration." Jan 13 20:43:41.579426 containerd[1547]: time="2025-01-13T20:43:41.577939595Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Jan 13 20:43:41.579445 containerd[1547]: time="2025-01-13T20:43:41.578142009Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] 
Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 13 20:43:41.579445 containerd[1547]: time="2025-01-13T20:43:41.578220566Z" level=info msg="Connect containerd service" Jan 13 20:43:41.579445 containerd[1547]: time="2025-01-13T20:43:41.578241208Z" level=info msg="using legacy CRI server" Jan 13 20:43:41.579445 containerd[1547]: time="2025-01-13T20:43:41.578245911Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 13 20:43:41.579445 containerd[1547]: time="2025-01-13T20:43:41.578373528Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 13 20:43:41.579445 containerd[1547]: time="2025-01-13T20:43:41.578658280Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: 
failed to load cni config" Jan 13 20:43:41.579710 containerd[1547]: time="2025-01-13T20:43:41.579686684Z" level=info msg="Start subscribing containerd event" Jan 13 20:43:41.579754 containerd[1547]: time="2025-01-13T20:43:41.579745671Z" level=info msg="Start recovering state" Jan 13 20:43:41.579822 containerd[1547]: time="2025-01-13T20:43:41.579814455Z" level=info msg="Start event monitor" Jan 13 20:43:41.579857 containerd[1547]: time="2025-01-13T20:43:41.579850970Z" level=info msg="Start snapshots syncer" Jan 13 20:43:41.579887 containerd[1547]: time="2025-01-13T20:43:41.579880995Z" level=info msg="Start cni network conf syncer for default" Jan 13 20:43:41.579913 containerd[1547]: time="2025-01-13T20:43:41.579908432Z" level=info msg="Start streaming server" Jan 13 20:43:41.580080 containerd[1547]: time="2025-01-13T20:43:41.580072555Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 13 20:43:41.580189 containerd[1547]: time="2025-01-13T20:43:41.580175515Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 13 20:43:41.580257 containerd[1547]: time="2025-01-13T20:43:41.580250687Z" level=info msg="containerd successfully booted in 0.042872s" Jan 13 20:43:41.580300 systemd[1]: Started containerd.service - containerd container runtime. Jan 13 20:43:41.698452 tar[1540]: linux-amd64/LICENSE Jan 13 20:43:41.698581 tar[1540]: linux-amd64/README.md Jan 13 20:43:41.705975 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 13 20:43:42.856933 systemd-networkd[1457]: ens192: Gained IPv6LL Jan 13 20:43:42.857218 systemd-timesyncd[1461]: Network configuration changed, trying to establish connection. Jan 13 20:43:42.858287 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 13 20:43:42.859193 systemd[1]: Reached target network-online.target - Network is Online. Jan 13 20:43:42.864987 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... 
Jan 13 20:43:42.866480 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:43:42.869564 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 13 20:43:42.882914 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 13 20:43:42.901364 systemd[1]: coreos-metadata.service: Deactivated successfully. Jan 13 20:43:42.901476 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Jan 13 20:43:42.902122 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 13 20:43:43.561905 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:43:43.562247 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 13 20:43:43.562817 systemd[1]: Startup finished in 948ms (kernel) + 4.883s (initrd) + 4.083s (userspace) = 9.915s. Jan 13 20:43:43.567129 (kubelet)[1698]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:43:43.570048 agetty[1655]: failed to open credentials directory Jan 13 20:43:43.570871 agetty[1649]: failed to open credentials directory Jan 13 20:43:43.588425 login[1649]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Jan 13 20:43:43.590283 login[1655]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 20:43:43.596516 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 13 20:43:43.601922 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 13 20:43:43.604802 systemd-logind[1526]: New session 2 of user core. Jan 13 20:43:43.609685 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 13 20:43:43.612970 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Jan 13 20:43:43.616589 (systemd)[1705]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 13 20:43:43.679394 systemd[1705]: Queued start job for default target default.target. Jan 13 20:43:43.681973 systemd[1705]: Created slice app.slice - User Application Slice. Jan 13 20:43:43.682004 systemd[1705]: Reached target paths.target - Paths. Jan 13 20:43:43.682017 systemd[1705]: Reached target timers.target - Timers. Jan 13 20:43:43.682879 systemd[1705]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 13 20:43:43.690348 systemd[1705]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 13 20:43:43.690382 systemd[1705]: Reached target sockets.target - Sockets. Jan 13 20:43:43.690391 systemd[1705]: Reached target basic.target - Basic System. Jan 13 20:43:43.690412 systemd[1705]: Reached target default.target - Main User Target. Jan 13 20:43:43.690427 systemd[1705]: Startup finished in 70ms. Jan 13 20:43:43.690590 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 13 20:43:43.695865 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 13 20:43:44.112373 kubelet[1698]: E0113 20:43:44.112322 1698 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:43:44.113656 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:43:44.113733 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:43:44.588923 login[1649]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 20:43:44.592500 systemd-logind[1526]: New session 1 of user core. Jan 13 20:43:44.601967 systemd[1]: Started session-1.scope - Session 1 of User core. 
Jan 13 20:43:54.364393 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 13 20:43:54.372003 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:43:54.646296 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:43:54.649673 (kubelet)[1748]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:43:54.761327 kubelet[1748]: E0113 20:43:54.761287 1748 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:43:54.764023 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:43:54.764112 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:44:05.014640 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 13 20:44:05.020948 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:44:05.247732 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:44:05.250254 (kubelet)[1765]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:44:05.303507 kubelet[1765]: E0113 20:44:05.303441 1765 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:44:05.304658 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:44:05.304733 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:45:25.894543 systemd-resolved[1458]: Clock change detected. Flushing caches. Jan 13 20:45:25.894603 systemd-timesyncd[1461]: Contacted time server 23.168.24.210:123 (2.flatcar.pool.ntp.org). Jan 13 20:45:25.894632 systemd-timesyncd[1461]: Initial clock synchronization to Mon 2025-01-13 20:45:25.894506 UTC. Jan 13 20:45:28.240824 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 13 20:45:28.251512 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:45:28.491502 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:45:28.494384 (kubelet)[1781]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:45:28.522872 kubelet[1781]: E0113 20:45:28.522839 1781 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:45:28.524093 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:45:28.524167 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:45:34.094506 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 13 20:45:34.102639 systemd[1]: Started sshd@0-139.178.70.110:22-147.75.109.163:35214.service - OpenSSH per-connection server daemon (147.75.109.163:35214). Jan 13 20:45:34.141636 sshd[1791]: Accepted publickey for core from 147.75.109.163 port 35214 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:45:34.142438 sshd-session[1791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:45:34.145334 systemd-logind[1526]: New session 3 of user core. Jan 13 20:45:34.156471 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 13 20:45:34.217468 systemd[1]: Started sshd@1-139.178.70.110:22-147.75.109.163:35220.service - OpenSSH per-connection server daemon (147.75.109.163:35220). Jan 13 20:45:34.246560 sshd[1796]: Accepted publickey for core from 147.75.109.163 port 35220 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:45:34.247168 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:45:34.250580 systemd-logind[1526]: New session 4 of user core. 
Jan 13 20:45:34.255415 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 13 20:45:34.303541 sshd[1798]: Connection closed by 147.75.109.163 port 35220 Jan 13 20:45:34.303780 sshd-session[1796]: pam_unix(sshd:session): session closed for user core Jan 13 20:45:34.309213 systemd[1]: sshd@1-139.178.70.110:22-147.75.109.163:35220.service: Deactivated successfully. Jan 13 20:45:34.310038 systemd[1]: session-4.scope: Deactivated successfully. Jan 13 20:45:34.311287 systemd-logind[1526]: Session 4 logged out. Waiting for processes to exit. Jan 13 20:45:34.311476 systemd[1]: Started sshd@2-139.178.70.110:22-147.75.109.163:35226.service - OpenSSH per-connection server daemon (147.75.109.163:35226). Jan 13 20:45:34.312810 systemd-logind[1526]: Removed session 4. Jan 13 20:45:34.343737 sshd[1803]: Accepted publickey for core from 147.75.109.163 port 35226 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:45:34.344550 sshd-session[1803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:45:34.348792 systemd-logind[1526]: New session 5 of user core. Jan 13 20:45:34.352511 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 13 20:45:34.398159 sshd[1805]: Connection closed by 147.75.109.163 port 35226 Jan 13 20:45:34.398532 sshd-session[1803]: pam_unix(sshd:session): session closed for user core Jan 13 20:45:34.404016 systemd[1]: sshd@2-139.178.70.110:22-147.75.109.163:35226.service: Deactivated successfully. Jan 13 20:45:34.404946 systemd[1]: session-5.scope: Deactivated successfully. Jan 13 20:45:34.405874 systemd-logind[1526]: Session 5 logged out. Waiting for processes to exit. Jan 13 20:45:34.409661 systemd[1]: Started sshd@3-139.178.70.110:22-147.75.109.163:35228.service - OpenSSH per-connection server daemon (147.75.109.163:35228). Jan 13 20:45:34.411073 systemd-logind[1526]: Removed session 5. 
Jan 13 20:45:34.445009 sshd[1810]: Accepted publickey for core from 147.75.109.163 port 35228 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:45:34.445809 sshd-session[1810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:45:34.449038 systemd-logind[1526]: New session 6 of user core. Jan 13 20:45:34.458454 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 13 20:45:34.509228 sshd[1812]: Connection closed by 147.75.109.163 port 35228 Jan 13 20:45:34.509176 sshd-session[1810]: pam_unix(sshd:session): session closed for user core Jan 13 20:45:34.522267 systemd[1]: sshd@3-139.178.70.110:22-147.75.109.163:35228.service: Deactivated successfully. Jan 13 20:45:34.523230 systemd[1]: session-6.scope: Deactivated successfully. Jan 13 20:45:34.524181 systemd-logind[1526]: Session 6 logged out. Waiting for processes to exit. Jan 13 20:45:34.529618 systemd[1]: Started sshd@4-139.178.70.110:22-147.75.109.163:35232.service - OpenSSH per-connection server daemon (147.75.109.163:35232). Jan 13 20:45:34.530751 systemd-logind[1526]: Removed session 6. Jan 13 20:45:34.563692 sshd[1817]: Accepted publickey for core from 147.75.109.163 port 35232 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:45:34.564657 sshd-session[1817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:45:34.567317 systemd-logind[1526]: New session 7 of user core. Jan 13 20:45:34.578475 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 13 20:45:34.635585 sudo[1820]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 13 20:45:34.635800 sudo[1820]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:45:34.650201 sudo[1820]: pam_unix(sudo:session): session closed for user root Jan 13 20:45:34.651121 sshd[1819]: Connection closed by 147.75.109.163 port 35232 Jan 13 20:45:34.651540 sshd-session[1817]: pam_unix(sshd:session): session closed for user core Jan 13 20:45:34.661099 systemd[1]: sshd@4-139.178.70.110:22-147.75.109.163:35232.service: Deactivated successfully. Jan 13 20:45:34.662046 systemd[1]: session-7.scope: Deactivated successfully. Jan 13 20:45:34.662992 systemd-logind[1526]: Session 7 logged out. Waiting for processes to exit. Jan 13 20:45:34.663902 systemd[1]: Started sshd@5-139.178.70.110:22-147.75.109.163:35246.service - OpenSSH per-connection server daemon (147.75.109.163:35246). Jan 13 20:45:34.665545 systemd-logind[1526]: Removed session 7. Jan 13 20:45:34.706519 sshd[1825]: Accepted publickey for core from 147.75.109.163 port 35246 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:45:34.707281 sshd-session[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:45:34.711116 systemd-logind[1526]: New session 8 of user core. Jan 13 20:45:34.716464 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 13 20:45:34.764815 sudo[1829]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 13 20:45:34.765026 sudo[1829]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:45:34.767234 sudo[1829]: pam_unix(sudo:session): session closed for user root Jan 13 20:45:34.770879 sudo[1828]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 13 20:45:34.771237 sudo[1828]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:45:34.787651 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 20:45:34.806761 augenrules[1851]: No rules Jan 13 20:45:34.807060 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 20:45:34.807264 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 20:45:34.808029 sudo[1828]: pam_unix(sudo:session): session closed for user root Jan 13 20:45:34.808750 sshd[1827]: Connection closed by 147.75.109.163 port 35246 Jan 13 20:45:34.809149 sshd-session[1825]: pam_unix(sshd:session): session closed for user core Jan 13 20:45:34.821069 systemd[1]: sshd@5-139.178.70.110:22-147.75.109.163:35246.service: Deactivated successfully. Jan 13 20:45:34.821981 systemd[1]: session-8.scope: Deactivated successfully. Jan 13 20:45:34.822918 systemd-logind[1526]: Session 8 logged out. Waiting for processes to exit. Jan 13 20:45:34.823774 systemd[1]: Started sshd@6-139.178.70.110:22-147.75.109.163:35252.service - OpenSSH per-connection server daemon (147.75.109.163:35252). Jan 13 20:45:34.824623 systemd-logind[1526]: Removed session 8. 
Jan 13 20:45:34.869701 sshd[1859]: Accepted publickey for core from 147.75.109.163 port 35252 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:45:34.870524 sshd-session[1859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:45:34.873788 systemd-logind[1526]: New session 9 of user core. Jan 13 20:45:34.879434 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 13 20:45:34.927573 sudo[1862]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 13 20:45:34.927777 sudo[1862]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:45:35.205634 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 13 20:45:35.205752 (dockerd)[1879]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 13 20:45:35.456950 dockerd[1879]: time="2025-01-13T20:45:35.456877524Z" level=info msg="Starting up" Jan 13 20:45:35.515044 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2873507438-merged.mount: Deactivated successfully. Jan 13 20:45:35.535543 dockerd[1879]: time="2025-01-13T20:45:35.535522233Z" level=info msg="Loading containers: start." Jan 13 20:45:35.638357 kernel: Initializing XFRM netlink socket Jan 13 20:45:35.687238 systemd-networkd[1457]: docker0: Link UP Jan 13 20:45:35.714780 dockerd[1879]: time="2025-01-13T20:45:35.714371683Z" level=info msg="Loading containers: done." 
Jan 13 20:45:35.726018 dockerd[1879]: time="2025-01-13T20:45:35.725994564Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 13 20:45:35.726170 dockerd[1879]: time="2025-01-13T20:45:35.726156724Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1 Jan 13 20:45:35.726293 dockerd[1879]: time="2025-01-13T20:45:35.726281194Z" level=info msg="Daemon has completed initialization" Jan 13 20:45:35.743982 dockerd[1879]: time="2025-01-13T20:45:35.743949709Z" level=info msg="API listen on /run/docker.sock" Jan 13 20:45:35.744113 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 13 20:45:36.448490 containerd[1547]: time="2025-01-13T20:45:36.448464445Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\"" Jan 13 20:45:36.976430 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount430993867.mount: Deactivated successfully. 
Jan 13 20:45:38.089292 containerd[1547]: time="2025-01-13T20:45:38.089262031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:38.089833 containerd[1547]: time="2025-01-13T20:45:38.089813474Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.29.12: active requests=0, bytes read=35139254" Jan 13 20:45:38.089958 containerd[1547]: time="2025-01-13T20:45:38.089850694Z" level=info msg="ImageCreate event name:\"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:38.091567 containerd[1547]: time="2025-01-13T20:45:38.091553055Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:38.092131 containerd[1547]: time="2025-01-13T20:45:38.092116305Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.29.12\" with image id \"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\", repo tag \"registry.k8s.io/kube-apiserver:v1.29.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\", size \"35136054\" in 1.643629653s" Jan 13 20:45:38.092158 containerd[1547]: time="2025-01-13T20:45:38.092134402Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\" returns image reference \"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\"" Jan 13 20:45:38.103697 containerd[1547]: time="2025-01-13T20:45:38.103516798Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\"" Jan 13 20:45:38.774573 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Jan 13 20:45:38.781461 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:45:39.021038 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:45:39.029597 (kubelet)[2139]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:45:39.068873 kubelet[2139]: E0113 20:45:39.068839 2139 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:45:39.070222 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:45:39.070325 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:45:39.337537 update_engine[1529]: I20250113 20:45:39.337363 1529 update_attempter.cc:509] Updating boot flags... 
Jan 13 20:45:39.362428 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 44 scanned by (udev-worker) (2160) Jan 13 20:45:39.856325 containerd[1547]: time="2025-01-13T20:45:39.856282475Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:39.864184 containerd[1547]: time="2025-01-13T20:45:39.864055987Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.29.12: active requests=0, bytes read=32217732" Jan 13 20:45:39.871165 containerd[1547]: time="2025-01-13T20:45:39.871132340Z" level=info msg="ImageCreate event name:\"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:39.881776 containerd[1547]: time="2025-01-13T20:45:39.879425645Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:39.881776 containerd[1547]: time="2025-01-13T20:45:39.881301681Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.29.12\" with image id \"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.29.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\", size \"33662844\" in 1.777766184s" Jan 13 20:45:39.881776 containerd[1547]: time="2025-01-13T20:45:39.881319448Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\" returns image reference \"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\"" Jan 13 20:45:39.898171 containerd[1547]: time="2025-01-13T20:45:39.898151607Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\"" 
Jan 13 20:45:40.871147 containerd[1547]: time="2025-01-13T20:45:40.870557472Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:40.875877 containerd[1547]: time="2025-01-13T20:45:40.875726856Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.29.12: active requests=0, bytes read=17332822" Jan 13 20:45:40.878412 containerd[1547]: time="2025-01-13T20:45:40.878377447Z" level=info msg="ImageCreate event name:\"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:40.886010 containerd[1547]: time="2025-01-13T20:45:40.885981465Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:40.886783 containerd[1547]: time="2025-01-13T20:45:40.886658907Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.29.12\" with image id \"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\", repo tag \"registry.k8s.io/kube-scheduler:v1.29.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\", size \"18777952\" in 988.389188ms" Jan 13 20:45:40.886783 containerd[1547]: time="2025-01-13T20:45:40.886681004Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\" returns image reference \"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\"" Jan 13 20:45:40.902076 containerd[1547]: time="2025-01-13T20:45:40.902051174Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\"" Jan 13 20:45:41.873623 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4017078355.mount: Deactivated successfully. 
Jan 13 20:45:42.099067 containerd[1547]: time="2025-01-13T20:45:42.098873123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:42.099287 containerd[1547]: time="2025-01-13T20:45:42.099231075Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.29.12: active requests=0, bytes read=28619958" Jan 13 20:45:42.099590 containerd[1547]: time="2025-01-13T20:45:42.099450206Z" level=info msg="ImageCreate event name:\"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:42.100463 containerd[1547]: time="2025-01-13T20:45:42.100449741Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:42.101022 containerd[1547]: time="2025-01-13T20:45:42.100807744Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.29.12\" with image id \"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\", repo tag \"registry.k8s.io/kube-proxy:v1.29.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\", size \"28618977\" in 1.198733648s" Jan 13 20:45:42.101022 containerd[1547]: time="2025-01-13T20:45:42.100825050Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\" returns image reference \"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\"" Jan 13 20:45:42.113329 containerd[1547]: time="2025-01-13T20:45:42.113315273Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Jan 13 20:45:42.666537 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4236846342.mount: Deactivated successfully. 
Jan 13 20:45:43.530671 containerd[1547]: time="2025-01-13T20:45:43.530644339Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:43.531717 containerd[1547]: time="2025-01-13T20:45:43.531690855Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" Jan 13 20:45:43.532881 containerd[1547]: time="2025-01-13T20:45:43.532083498Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:43.533608 containerd[1547]: time="2025-01-13T20:45:43.533596134Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:43.534085 containerd[1547]: time="2025-01-13T20:45:43.534066861Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.420621532s" Jan 13 20:45:43.534109 containerd[1547]: time="2025-01-13T20:45:43.534086870Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Jan 13 20:45:43.545567 containerd[1547]: time="2025-01-13T20:45:43.545545480Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Jan 13 20:45:44.187540 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount282423892.mount: Deactivated successfully. 
Jan 13 20:45:44.189026 containerd[1547]: time="2025-01-13T20:45:44.189006602Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:44.189684 containerd[1547]: time="2025-01-13T20:45:44.189591711Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290" Jan 13 20:45:44.189950 containerd[1547]: time="2025-01-13T20:45:44.189935415Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:44.190960 containerd[1547]: time="2025-01-13T20:45:44.190937844Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:44.191600 containerd[1547]: time="2025-01-13T20:45:44.191376921Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 645.811043ms" Jan 13 20:45:44.191600 containerd[1547]: time="2025-01-13T20:45:44.191392277Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Jan 13 20:45:44.203507 containerd[1547]: time="2025-01-13T20:45:44.203488857Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\"" Jan 13 20:45:44.713813 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3755688484.mount: Deactivated successfully. 
Jan 13 20:45:47.649456 containerd[1547]: time="2025-01-13T20:45:47.649414658Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.10-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:47.650035 containerd[1547]: time="2025-01-13T20:45:47.650010143Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.10-0: active requests=0, bytes read=56651625" Jan 13 20:45:47.650506 containerd[1547]: time="2025-01-13T20:45:47.650493984Z" level=info msg="ImageCreate event name:\"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:47.651972 containerd[1547]: time="2025-01-13T20:45:47.651949748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:45:47.652669 containerd[1547]: time="2025-01-13T20:45:47.652566846Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.10-0\" with image id \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\", repo tag \"registry.k8s.io/etcd:3.5.10-0\", repo digest \"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\", size \"56649232\" in 3.448956911s" Jan 13 20:45:47.652669 containerd[1547]: time="2025-01-13T20:45:47.652589288Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\" returns image reference \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\"" Jan 13 20:45:49.207948 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 13 20:45:49.214461 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:45:49.533418 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:45:49.535006 (kubelet)[2359]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:45:49.593664 kubelet[2359]: E0113 20:45:49.592037 2359 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:45:49.593538 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:45:49.593616 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:45:49.892886 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:45:49.902533 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:45:49.920299 systemd[1]: Reloading requested from client PID 2374 ('systemctl') (unit session-9.scope)... Jan 13 20:45:49.920307 systemd[1]: Reloading... Jan 13 20:45:49.986370 zram_generator::config[2413]: No configuration found. Jan 13 20:45:50.041931 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:45:50.057578 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:45:50.101548 systemd[1]: Reloading finished in 180 ms. Jan 13 20:45:50.126909 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 13 20:45:50.126959 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 13 20:45:50.127106 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:45:50.131511 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:45:50.573155 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:45:50.579195 (kubelet)[2479]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 20:45:50.616040 kubelet[2479]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:45:50.616040 kubelet[2479]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 20:45:50.616040 kubelet[2479]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 13 20:45:50.616273 kubelet[2479]: I0113 20:45:50.616072 2479 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 20:45:50.855819 kubelet[2479]: I0113 20:45:50.855756 2479 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Jan 13 20:45:50.855819 kubelet[2479]: I0113 20:45:50.855777 2479 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 20:45:50.856258 kubelet[2479]: I0113 20:45:50.856238 2479 server.go:919] "Client rotation is on, will bootstrap in background" Jan 13 20:45:50.966373 kubelet[2479]: I0113 20:45:50.966228 2479 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:45:50.967115 kubelet[2479]: E0113 20:45:50.967044 2479 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.110:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:50.983249 kubelet[2479]: I0113 20:45:50.983058 2479 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 13 20:45:50.983249 kubelet[2479]: I0113 20:45:50.983235 2479 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 20:45:50.984325 kubelet[2479]: I0113 20:45:50.984306 2479 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 13 20:45:50.984493 kubelet[2479]: I0113 20:45:50.984331 2479 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 20:45:50.984493 kubelet[2479]: I0113 20:45:50.984352 2479 container_manager_linux.go:301] "Creating device plugin manager" Jan 13 20:45:50.984493 kubelet[2479]: I0113 
20:45:50.984432 2479 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:45:50.984570 kubelet[2479]: I0113 20:45:50.984507 2479 kubelet.go:396] "Attempting to sync node with API server" Jan 13 20:45:50.984570 kubelet[2479]: I0113 20:45:50.984523 2479 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 20:45:50.985014 kubelet[2479]: W0113 20:45:50.984965 2479 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://139.178.70.110:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:50.985014 kubelet[2479]: E0113 20:45:50.985001 2479 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.110:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:50.985762 kubelet[2479]: I0113 20:45:50.985744 2479 kubelet.go:312] "Adding apiserver pod source" Jan 13 20:45:50.985762 kubelet[2479]: I0113 20:45:50.985763 2479 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 20:45:50.986799 kubelet[2479]: W0113 20:45:50.986770 2479 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://139.178.70.110:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:50.986799 kubelet[2479]: E0113 20:45:50.986801 2479 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.110:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:50.988475 kubelet[2479]: I0113 20:45:50.987011 2479 kuberuntime_manager.go:258] 
"Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 20:45:50.990921 kubelet[2479]: I0113 20:45:50.990856 2479 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 20:45:50.990921 kubelet[2479]: W0113 20:45:50.990903 2479 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 13 20:45:50.991294 kubelet[2479]: I0113 20:45:50.991279 2479 server.go:1256] "Started kubelet" Jan 13 20:45:50.993358 kubelet[2479]: I0113 20:45:50.992945 2479 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 20:45:50.993556 kubelet[2479]: I0113 20:45:50.993541 2479 server.go:461] "Adding debug handlers to kubelet server" Jan 13 20:45:50.995603 kubelet[2479]: I0113 20:45:50.995575 2479 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 20:45:50.997835 kubelet[2479]: I0113 20:45:50.997529 2479 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 20:45:50.997835 kubelet[2479]: I0113 20:45:50.997630 2479 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 20:45:50.998794 kubelet[2479]: E0113 20:45:50.998785 2479 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.110:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.110:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181a5b71e252e15c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 20:45:50.991262044 +0000 UTC m=+0.409307147,LastTimestamp:2025-01-13 20:45:50.991262044 +0000 UTC 
m=+0.409307147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 20:45:51.001763 kubelet[2479]: E0113 20:45:51.001756 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:51.001841 kubelet[2479]: I0113 20:45:51.001835 2479 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 13 20:45:51.001919 kubelet[2479]: I0113 20:45:51.001913 2479 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Jan 13 20:45:51.001986 kubelet[2479]: I0113 20:45:51.001980 2479 reconciler_new.go:29] "Reconciler: start to sync state" Jan 13 20:45:51.002163 kubelet[2479]: W0113 20:45:51.002147 2479 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://139.178.70.110:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:51.002206 kubelet[2479]: E0113 20:45:51.002201 2479 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.110:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:51.002487 kubelet[2479]: E0113 20:45:51.002480 2479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.110:6443: connect: connection refused" interval="200ms" Jan 13 20:45:51.004092 kubelet[2479]: I0113 20:45:51.004080 2479 factory.go:221] Registration of the containerd container factory successfully Jan 13 20:45:51.004092 kubelet[2479]: I0113 20:45:51.004089 2479 factory.go:221] Registration of the systemd container factory 
successfully Jan 13 20:45:51.004144 kubelet[2479]: I0113 20:45:51.004125 2479 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 20:45:51.007370 kubelet[2479]: E0113 20:45:51.007244 2479 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 20:45:51.010732 kubelet[2479]: I0113 20:45:51.010682 2479 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 20:45:51.011251 kubelet[2479]: I0113 20:45:51.011244 2479 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 13 20:45:51.011292 kubelet[2479]: I0113 20:45:51.011288 2479 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 20:45:51.011335 kubelet[2479]: I0113 20:45:51.011329 2479 kubelet.go:2329] "Starting kubelet main sync loop" Jan 13 20:45:51.011483 kubelet[2479]: E0113 20:45:51.011476 2479 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 20:45:51.015219 kubelet[2479]: W0113 20:45:51.015201 2479 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://139.178.70.110:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:51.015269 kubelet[2479]: E0113 20:45:51.015264 2479 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.110:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:51.027028 kubelet[2479]: I0113 20:45:51.027019 2479 
cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 20:45:51.027121 kubelet[2479]: I0113 20:45:51.027115 2479 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 20:45:51.027174 kubelet[2479]: I0113 20:45:51.027163 2479 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:45:51.028180 kubelet[2479]: I0113 20:45:51.028143 2479 policy_none.go:49] "None policy: Start" Jan 13 20:45:51.028401 kubelet[2479]: I0113 20:45:51.028389 2479 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 20:45:51.028452 kubelet[2479]: I0113 20:45:51.028402 2479 state_mem.go:35] "Initializing new in-memory state store" Jan 13 20:45:51.032283 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 13 20:45:51.048554 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 13 20:45:51.053589 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 13 20:45:51.061260 kubelet[2479]: I0113 20:45:51.061237 2479 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 20:45:51.061440 kubelet[2479]: I0113 20:45:51.061427 2479 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 20:45:51.062920 kubelet[2479]: E0113 20:45:51.062900 2479 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jan 13 20:45:51.102700 kubelet[2479]: I0113 20:45:51.102681 2479 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:45:51.102922 kubelet[2479]: E0113 20:45:51.102907 2479 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.110:6443/api/v1/nodes\": dial tcp 139.178.70.110:6443: connect: connection refused" node="localhost" Jan 13 20:45:51.112172 kubelet[2479]: I0113 20:45:51.112131 2479 topology_manager.go:215] "Topology Admit Handler" podUID="4f8e0d694c07e04969646aa3c152c34a" podNamespace="kube-system" podName="kube-controller-manager-localhost" Jan 13 20:45:51.112875 kubelet[2479]: I0113 20:45:51.112842 2479 topology_manager.go:215] "Topology Admit Handler" podUID="c4144e8f85b2123a6afada0c1705bbba" podNamespace="kube-system" podName="kube-scheduler-localhost" Jan 13 20:45:51.114488 kubelet[2479]: I0113 20:45:51.114474 2479 topology_manager.go:215] "Topology Admit Handler" podUID="8e44ea3e7811a49fbd640050ff29f6c9" podNamespace="kube-system" podName="kube-apiserver-localhost" Jan 13 20:45:51.119529 systemd[1]: Created slice kubepods-burstable-pod4f8e0d694c07e04969646aa3c152c34a.slice - libcontainer container kubepods-burstable-pod4f8e0d694c07e04969646aa3c152c34a.slice. Jan 13 20:45:51.144790 systemd[1]: Created slice kubepods-burstable-podc4144e8f85b2123a6afada0c1705bbba.slice - libcontainer container kubepods-burstable-podc4144e8f85b2123a6afada0c1705bbba.slice. 
Jan 13 20:45:51.153293 systemd[1]: Created slice kubepods-burstable-pod8e44ea3e7811a49fbd640050ff29f6c9.slice - libcontainer container kubepods-burstable-pod8e44ea3e7811a49fbd640050ff29f6c9.slice. Jan 13 20:45:51.203500 kubelet[2479]: I0113 20:45:51.203441 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8e44ea3e7811a49fbd640050ff29f6c9-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"8e44ea3e7811a49fbd640050ff29f6c9\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:45:51.203698 kubelet[2479]: E0113 20:45:51.203687 2479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.110:6443: connect: connection refused" interval="400ms" Jan 13 20:45:51.303565 kubelet[2479]: I0113 20:45:51.303507 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8e44ea3e7811a49fbd640050ff29f6c9-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"8e44ea3e7811a49fbd640050ff29f6c9\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:45:51.303565 kubelet[2479]: I0113 20:45:51.303534 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8e44ea3e7811a49fbd640050ff29f6c9-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"8e44ea3e7811a49fbd640050ff29f6c9\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:45:51.303565 kubelet[2479]: I0113 20:45:51.303555 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-k8s-certs\") pod 
\"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:51.303709 kubelet[2479]: I0113 20:45:51.303589 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:51.303709 kubelet[2479]: I0113 20:45:51.303611 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:51.303709 kubelet[2479]: I0113 20:45:51.303640 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c4144e8f85b2123a6afada0c1705bbba-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"c4144e8f85b2123a6afada0c1705bbba\") " pod="kube-system/kube-scheduler-localhost" Jan 13 20:45:51.303709 kubelet[2479]: I0113 20:45:51.303668 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:51.303709 kubelet[2479]: I0113 20:45:51.303685 2479 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:51.304092 kubelet[2479]: I0113 20:45:51.304074 2479 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:45:51.304380 kubelet[2479]: E0113 20:45:51.304366 2479 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.110:6443/api/v1/nodes\": dial tcp 139.178.70.110:6443: connect: connection refused" node="localhost" Jan 13 20:45:51.443099 containerd[1547]: time="2025-01-13T20:45:51.442690460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:4f8e0d694c07e04969646aa3c152c34a,Namespace:kube-system,Attempt:0,}" Jan 13 20:45:51.448249 containerd[1547]: time="2025-01-13T20:45:51.448224896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:c4144e8f85b2123a6afada0c1705bbba,Namespace:kube-system,Attempt:0,}" Jan 13 20:45:51.455742 containerd[1547]: time="2025-01-13T20:45:51.455722299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:8e44ea3e7811a49fbd640050ff29f6c9,Namespace:kube-system,Attempt:0,}" Jan 13 20:45:51.605020 kubelet[2479]: E0113 20:45:51.604997 2479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.110:6443: connect: connection refused" interval="800ms" Jan 13 20:45:51.705415 kubelet[2479]: I0113 20:45:51.705312 2479 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:45:51.705763 kubelet[2479]: E0113 20:45:51.705530 2479 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.110:6443/api/v1/nodes\": dial 
tcp 139.178.70.110:6443: connect: connection refused" node="localhost" Jan 13 20:45:51.835651 kubelet[2479]: W0113 20:45:51.835605 2479 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://139.178.70.110:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:51.835651 kubelet[2479]: E0113 20:45:51.835654 2479 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.110:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:51.974388 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3800648773.mount: Deactivated successfully. Jan 13 20:45:51.976009 containerd[1547]: time="2025-01-13T20:45:51.975863471Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:45:51.976921 containerd[1547]: time="2025-01-13T20:45:51.976905072Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:45:51.977905 containerd[1547]: time="2025-01-13T20:45:51.977745022Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:45:51.977905 containerd[1547]: time="2025-01-13T20:45:51.977773244Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Jan 13 20:45:51.978148 containerd[1547]: time="2025-01-13T20:45:51.978131771Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} 
labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:45:51.978148 containerd[1547]: time="2025-01-13T20:45:51.978699815Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:45:51.978796 containerd[1547]: time="2025-01-13T20:45:51.978782516Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:45:51.979866 containerd[1547]: time="2025-01-13T20:45:51.979854382Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:45:51.981417 containerd[1547]: time="2025-01-13T20:45:51.981405837Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 533.121378ms" Jan 13 20:45:51.982354 containerd[1547]: time="2025-01-13T20:45:51.982332889Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 539.570802ms" Jan 13 20:45:51.984402 containerd[1547]: time="2025-01-13T20:45:51.982853774Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest 
\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 527.087162ms" Jan 13 20:45:52.101149 kubelet[2479]: W0113 20:45:52.101103 2479 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://139.178.70.110:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:52.101149 kubelet[2479]: E0113 20:45:52.101150 2479 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.110:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:52.164591 containerd[1547]: time="2025-01-13T20:45:52.160480169Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:45:52.164591 containerd[1547]: time="2025-01-13T20:45:52.162790162Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:45:52.164591 containerd[1547]: time="2025-01-13T20:45:52.162804783Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:45:52.164591 containerd[1547]: time="2025-01-13T20:45:52.162864328Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:45:52.166538 containerd[1547]: time="2025-01-13T20:45:52.166260457Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:45:52.166538 containerd[1547]: time="2025-01-13T20:45:52.166293489Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:45:52.166538 containerd[1547]: time="2025-01-13T20:45:52.166310076Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:45:52.166538 containerd[1547]: time="2025-01-13T20:45:52.166391906Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:45:52.169995 containerd[1547]: time="2025-01-13T20:45:52.167485873Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:45:52.169995 containerd[1547]: time="2025-01-13T20:45:52.167632716Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:45:52.169995 containerd[1547]: time="2025-01-13T20:45:52.167646440Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:45:52.169995 containerd[1547]: time="2025-01-13T20:45:52.167701840Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:45:52.174504 kubelet[2479]: W0113 20:45:52.174179 2479 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://139.178.70.110:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:52.174504 kubelet[2479]: E0113 20:45:52.174215 2479 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.110:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:52.185464 systemd[1]: Started cri-containerd-0d2b2717c19feadca4102ce04c318efe12aecaeb68a395540a3213c0f9caeaa4.scope - libcontainer container 0d2b2717c19feadca4102ce04c318efe12aecaeb68a395540a3213c0f9caeaa4. Jan 13 20:45:52.188498 systemd[1]: Started cri-containerd-157bb3f214cfd62ab9d281f1043239a4b3f60486d8a070fc8e502362e2bdce3f.scope - libcontainer container 157bb3f214cfd62ab9d281f1043239a4b3f60486d8a070fc8e502362e2bdce3f. Jan 13 20:45:52.189818 systemd[1]: Started cri-containerd-a7643b275eaa21dcc9606ed272ec7ae3ca47abf61177c288656c3c5ede499e82.scope - libcontainer container a7643b275eaa21dcc9606ed272ec7ae3ca47abf61177c288656c3c5ede499e82. 
Jan 13 20:45:52.227549 containerd[1547]: time="2025-01-13T20:45:52.227406747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:4f8e0d694c07e04969646aa3c152c34a,Namespace:kube-system,Attempt:0,} returns sandbox id \"0d2b2717c19feadca4102ce04c318efe12aecaeb68a395540a3213c0f9caeaa4\"" Jan 13 20:45:52.230424 containerd[1547]: time="2025-01-13T20:45:52.230364551Z" level=info msg="CreateContainer within sandbox \"0d2b2717c19feadca4102ce04c318efe12aecaeb68a395540a3213c0f9caeaa4\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 13 20:45:52.233523 containerd[1547]: time="2025-01-13T20:45:52.233483092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:8e44ea3e7811a49fbd640050ff29f6c9,Namespace:kube-system,Attempt:0,} returns sandbox id \"157bb3f214cfd62ab9d281f1043239a4b3f60486d8a070fc8e502362e2bdce3f\"" Jan 13 20:45:52.236416 containerd[1547]: time="2025-01-13T20:45:52.236265277Z" level=info msg="CreateContainer within sandbox \"157bb3f214cfd62ab9d281f1043239a4b3f60486d8a070fc8e502362e2bdce3f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 13 20:45:52.242786 containerd[1547]: time="2025-01-13T20:45:52.242717591Z" level=info msg="CreateContainer within sandbox \"0d2b2717c19feadca4102ce04c318efe12aecaeb68a395540a3213c0f9caeaa4\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"bb4e412c2332db129a64c59f1df1047ab610158d5d2339115eccb4370c9c4b79\"" Jan 13 20:45:52.243216 containerd[1547]: time="2025-01-13T20:45:52.243152995Z" level=info msg="StartContainer for \"bb4e412c2332db129a64c59f1df1047ab610158d5d2339115eccb4370c9c4b79\"" Jan 13 20:45:52.244112 containerd[1547]: time="2025-01-13T20:45:52.244055315Z" level=info msg="CreateContainer within sandbox \"157bb3f214cfd62ab9d281f1043239a4b3f60486d8a070fc8e502362e2bdce3f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id 
\"dbed451158bd1e88878f85c331b97eb8e78a88da3d33f3528de64aa7e321f1af\"" Jan 13 20:45:52.244353 containerd[1547]: time="2025-01-13T20:45:52.244284603Z" level=info msg="StartContainer for \"dbed451158bd1e88878f85c331b97eb8e78a88da3d33f3528de64aa7e321f1af\"" Jan 13 20:45:52.244955 containerd[1547]: time="2025-01-13T20:45:52.244911465Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:c4144e8f85b2123a6afada0c1705bbba,Namespace:kube-system,Attempt:0,} returns sandbox id \"a7643b275eaa21dcc9606ed272ec7ae3ca47abf61177c288656c3c5ede499e82\"" Jan 13 20:45:52.246127 containerd[1547]: time="2025-01-13T20:45:52.246087631Z" level=info msg="CreateContainer within sandbox \"a7643b275eaa21dcc9606ed272ec7ae3ca47abf61177c288656c3c5ede499e82\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 13 20:45:52.258013 containerd[1547]: time="2025-01-13T20:45:52.257944994Z" level=info msg="CreateContainer within sandbox \"a7643b275eaa21dcc9606ed272ec7ae3ca47abf61177c288656c3c5ede499e82\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c0fd55bb64cc7bee7a12f047cac649e7716119c67af7bdc6c31b1a48adc3c705\"" Jan 13 20:45:52.258354 containerd[1547]: time="2025-01-13T20:45:52.258223381Z" level=info msg="StartContainer for \"c0fd55bb64cc7bee7a12f047cac649e7716119c67af7bdc6c31b1a48adc3c705\"" Jan 13 20:45:52.266625 systemd[1]: Started cri-containerd-dbed451158bd1e88878f85c331b97eb8e78a88da3d33f3528de64aa7e321f1af.scope - libcontainer container dbed451158bd1e88878f85c331b97eb8e78a88da3d33f3528de64aa7e321f1af. Jan 13 20:45:52.275392 systemd[1]: Started cri-containerd-bb4e412c2332db129a64c59f1df1047ab610158d5d2339115eccb4370c9c4b79.scope - libcontainer container bb4e412c2332db129a64c59f1df1047ab610158d5d2339115eccb4370c9c4b79. 
Jan 13 20:45:52.278907 systemd[1]: Started cri-containerd-c0fd55bb64cc7bee7a12f047cac649e7716119c67af7bdc6c31b1a48adc3c705.scope - libcontainer container c0fd55bb64cc7bee7a12f047cac649e7716119c67af7bdc6c31b1a48adc3c705. Jan 13 20:45:52.309802 containerd[1547]: time="2025-01-13T20:45:52.309705006Z" level=info msg="StartContainer for \"dbed451158bd1e88878f85c331b97eb8e78a88da3d33f3528de64aa7e321f1af\" returns successfully" Jan 13 20:45:52.317901 containerd[1547]: time="2025-01-13T20:45:52.317753352Z" level=info msg="StartContainer for \"bb4e412c2332db129a64c59f1df1047ab610158d5d2339115eccb4370c9c4b79\" returns successfully" Jan 13 20:45:52.324077 containerd[1547]: time="2025-01-13T20:45:52.323875872Z" level=info msg="StartContainer for \"c0fd55bb64cc7bee7a12f047cac649e7716119c67af7bdc6c31b1a48adc3c705\" returns successfully" Jan 13 20:45:52.406355 kubelet[2479]: E0113 20:45:52.405668 2479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.110:6443: connect: connection refused" interval="1.6s" Jan 13 20:45:52.501497 kubelet[2479]: W0113 20:45:52.501408 2479 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://139.178.70.110:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:52.501497 kubelet[2479]: E0113 20:45:52.501445 2479 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.110:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.110:6443: connect: connection refused Jan 13 20:45:52.507195 kubelet[2479]: I0113 20:45:52.507186 2479 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 
20:45:52.507434 kubelet[2479]: E0113 20:45:52.507417 2479 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.110:6443/api/v1/nodes\": dial tcp 139.178.70.110:6443: connect: connection refused" node="localhost" Jan 13 20:45:53.819673 kubelet[2479]: E0113 20:45:53.819637 2479 csi_plugin.go:300] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Jan 13 20:45:54.009125 kubelet[2479]: E0113 20:45:54.009075 2479 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Jan 13 20:45:54.108312 kubelet[2479]: I0113 20:45:54.108234 2479 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:45:54.117955 kubelet[2479]: I0113 20:45:54.117927 2479 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Jan 13 20:45:54.123302 kubelet[2479]: E0113 20:45:54.123280 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:54.224277 kubelet[2479]: E0113 20:45:54.224253 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:54.324861 kubelet[2479]: E0113 20:45:54.324826 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:54.425454 kubelet[2479]: E0113 20:45:54.425380 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:54.526192 kubelet[2479]: E0113 20:45:54.526157 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:54.626705 kubelet[2479]: E0113 20:45:54.626643 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not 
found" Jan 13 20:45:54.727241 kubelet[2479]: E0113 20:45:54.727216 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:54.828067 kubelet[2479]: E0113 20:45:54.828044 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:54.929044 kubelet[2479]: E0113 20:45:54.929023 2479 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:45:54.988053 kubelet[2479]: I0113 20:45:54.987852 2479 apiserver.go:52] "Watching apiserver" Jan 13 20:45:55.002518 kubelet[2479]: I0113 20:45:55.002470 2479 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Jan 13 20:45:55.750065 systemd[1]: Reloading requested from client PID 2745 ('systemctl') (unit session-9.scope)... Jan 13 20:45:55.750075 systemd[1]: Reloading... Jan 13 20:45:55.814363 zram_generator::config[2786]: No configuration found. Jan 13 20:45:55.869929 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:45:55.885433 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:45:55.935855 systemd[1]: Reloading finished in 185 ms. Jan 13 20:45:55.959047 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:45:55.959902 kubelet[2479]: I0113 20:45:55.959822 2479 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:45:55.968593 systemd[1]: kubelet.service: Deactivated successfully. Jan 13 20:45:55.968776 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:45:55.974534 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:45:56.220216 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:45:56.227588 (kubelet)[2851]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 20:45:56.271748 kubelet[2851]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:45:56.271748 kubelet[2851]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 20:45:56.271748 kubelet[2851]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:45:56.271748 kubelet[2851]: I0113 20:45:56.271722 2851 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 20:45:56.278331 kubelet[2851]: I0113 20:45:56.278262 2851 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Jan 13 20:45:56.279153 kubelet[2851]: I0113 20:45:56.278394 2851 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 20:45:56.279153 kubelet[2851]: I0113 20:45:56.278551 2851 server.go:919] "Client rotation is on, will bootstrap in background" Jan 13 20:45:56.281071 kubelet[2851]: I0113 20:45:56.281059 2851 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 13 20:45:56.286535 kubelet[2851]: I0113 20:45:56.286467 2851 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:45:56.291557 kubelet[2851]: I0113 20:45:56.291498 2851 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 13 20:45:56.292220 kubelet[2851]: I0113 20:45:56.292210 2851 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 20:45:56.292314 kubelet[2851]: I0113 20:45:56.292302 2851 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":
null} Jan 13 20:45:56.292384 kubelet[2851]: I0113 20:45:56.292318 2851 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 20:45:56.292384 kubelet[2851]: I0113 20:45:56.292325 2851 container_manager_linux.go:301] "Creating device plugin manager" Jan 13 20:45:56.292384 kubelet[2851]: I0113 20:45:56.292352 2851 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:45:56.295425 kubelet[2851]: I0113 20:45:56.295420 2851 kubelet.go:396] "Attempting to sync node with API server" Jan 13 20:45:56.295453 kubelet[2851]: I0113 20:45:56.295430 2851 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 20:45:56.295453 kubelet[2851]: I0113 20:45:56.295444 2851 kubelet.go:312] "Adding apiserver pod source" Jan 13 20:45:56.295453 kubelet[2851]: I0113 20:45:56.295450 2851 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 20:45:56.301491 kubelet[2851]: I0113 20:45:56.301000 2851 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 20:45:56.301491 kubelet[2851]: I0113 20:45:56.301086 2851 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 20:45:56.301491 kubelet[2851]: I0113 20:45:56.301277 2851 server.go:1256] "Started kubelet" Jan 13 20:45:56.301491 kubelet[2851]: I0113 20:45:56.301427 2851 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 20:45:56.301596 kubelet[2851]: I0113 20:45:56.301589 2851 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 20:45:56.301750 kubelet[2851]: I0113 20:45:56.301739 2851 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 20:45:56.301950 kubelet[2851]: I0113 20:45:56.301940 2851 server.go:461] "Adding debug handlers to kubelet server" Jan 13 20:45:56.302684 kubelet[2851]: I0113 
20:45:56.302524 2851 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 20:45:56.311312 kubelet[2851]: E0113 20:45:56.310828 2851 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 20:45:56.311600 kubelet[2851]: I0113 20:45:56.311537 2851 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 13 20:45:56.311797 kubelet[2851]: I0113 20:45:56.311788 2851 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Jan 13 20:45:56.312623 kubelet[2851]: I0113 20:45:56.312549 2851 reconciler_new.go:29] "Reconciler: start to sync state" Jan 13 20:45:56.314531 kubelet[2851]: I0113 20:45:56.314522 2851 factory.go:221] Registration of the systemd container factory successfully Jan 13 20:45:56.315648 kubelet[2851]: I0113 20:45:56.314649 2851 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 20:45:56.316773 kubelet[2851]: I0113 20:45:56.316719 2851 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 20:45:56.318587 kubelet[2851]: I0113 20:45:56.318579 2851 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 13 20:45:56.318789 kubelet[2851]: I0113 20:45:56.318635 2851 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 20:45:56.318789 kubelet[2851]: I0113 20:45:56.318647 2851 kubelet.go:2329] "Starting kubelet main sync loop" Jan 13 20:45:56.318789 kubelet[2851]: E0113 20:45:56.318676 2851 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 20:45:56.319694 kubelet[2851]: I0113 20:45:56.319685 2851 factory.go:221] Registration of the containerd container factory successfully Jan 13 20:45:56.361942 kubelet[2851]: I0113 20:45:56.361924 2851 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 20:45:56.362033 kubelet[2851]: I0113 20:45:56.361993 2851 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 20:45:56.362033 kubelet[2851]: I0113 20:45:56.362004 2851 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:45:56.362229 kubelet[2851]: I0113 20:45:56.362207 2851 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 13 20:45:56.362229 kubelet[2851]: I0113 20:45:56.362227 2851 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 13 20:45:56.362285 kubelet[2851]: I0113 20:45:56.362231 2851 policy_none.go:49] "None policy: Start" Jan 13 20:45:56.362967 kubelet[2851]: I0113 20:45:56.362954 2851 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 20:45:56.362967 kubelet[2851]: I0113 20:45:56.362968 2851 state_mem.go:35] "Initializing new in-memory state store" Jan 13 20:45:56.363054 kubelet[2851]: I0113 20:45:56.363043 2851 state_mem.go:75] "Updated machine memory state" Jan 13 20:45:56.366071 kubelet[2851]: I0113 20:45:56.366034 2851 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 20:45:56.366326 kubelet[2851]: I0113 20:45:56.366165 2851 plugin_manager.go:118] 
"Starting Kubelet Plugin Manager" Jan 13 20:45:56.412697 kubelet[2851]: I0113 20:45:56.412676 2851 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:45:56.416989 kubelet[2851]: I0113 20:45:56.416868 2851 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Jan 13 20:45:56.416989 kubelet[2851]: I0113 20:45:56.416925 2851 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Jan 13 20:45:56.419763 kubelet[2851]: I0113 20:45:56.419179 2851 topology_manager.go:215] "Topology Admit Handler" podUID="8e44ea3e7811a49fbd640050ff29f6c9" podNamespace="kube-system" podName="kube-apiserver-localhost" Jan 13 20:45:56.419763 kubelet[2851]: I0113 20:45:56.419220 2851 topology_manager.go:215] "Topology Admit Handler" podUID="4f8e0d694c07e04969646aa3c152c34a" podNamespace="kube-system" podName="kube-controller-manager-localhost" Jan 13 20:45:56.419763 kubelet[2851]: I0113 20:45:56.419250 2851 topology_manager.go:215] "Topology Admit Handler" podUID="c4144e8f85b2123a6afada0c1705bbba" podNamespace="kube-system" podName="kube-scheduler-localhost" Jan 13 20:45:56.425040 kubelet[2851]: E0113 20:45:56.425027 2851 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Jan 13 20:45:56.614457 kubelet[2851]: I0113 20:45:56.614405 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8e44ea3e7811a49fbd640050ff29f6c9-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"8e44ea3e7811a49fbd640050ff29f6c9\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:45:56.614626 kubelet[2851]: I0113 20:45:56.614618 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/8e44ea3e7811a49fbd640050ff29f6c9-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"8e44ea3e7811a49fbd640050ff29f6c9\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:45:56.614699 kubelet[2851]: I0113 20:45:56.614693 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:56.614744 kubelet[2851]: I0113 20:45:56.614739 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:56.614789 kubelet[2851]: I0113 20:45:56.614785 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:56.614860 kubelet[2851]: I0113 20:45:56.614850 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c4144e8f85b2123a6afada0c1705bbba-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"c4144e8f85b2123a6afada0c1705bbba\") " pod="kube-system/kube-scheduler-localhost" Jan 13 20:45:56.614967 kubelet[2851]: I0113 20:45:56.614928 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8e44ea3e7811a49fbd640050ff29f6c9-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"8e44ea3e7811a49fbd640050ff29f6c9\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:45:56.614967 kubelet[2851]: I0113 20:45:56.614942 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:56.614967 kubelet[2851]: I0113 20:45:56.614953 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:57.296189 kubelet[2851]: I0113 20:45:57.296032 2851 apiserver.go:52] "Watching apiserver" Jan 13 20:45:57.313333 kubelet[2851]: I0113 20:45:57.313303 2851 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Jan 13 20:45:57.363184 kubelet[2851]: E0113 20:45:57.363140 2851 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Jan 13 20:45:57.364193 kubelet[2851]: E0113 20:45:57.363772 2851 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Jan 13 20:45:57.394634 kubelet[2851]: I0113 20:45:57.394605 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.394572957 podStartE2EDuration="1.394572957s" 
podCreationTimestamp="2025-01-13 20:45:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:45:57.382416466 +0000 UTC m=+1.151353651" watchObservedRunningTime="2025-01-13 20:45:57.394572957 +0000 UTC m=+1.163510136" Jan 13 20:45:57.403260 kubelet[2851]: I0113 20:45:57.403237 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.403213562 podStartE2EDuration="2.403213562s" podCreationTimestamp="2025-01-13 20:45:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:45:57.394965676 +0000 UTC m=+1.163902867" watchObservedRunningTime="2025-01-13 20:45:57.403213562 +0000 UTC m=+1.172150744" Jan 13 20:45:57.411863 kubelet[2851]: I0113 20:45:57.411828 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.4118069659999999 podStartE2EDuration="1.411806966s" podCreationTimestamp="2025-01-13 20:45:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:45:57.403454796 +0000 UTC m=+1.172391986" watchObservedRunningTime="2025-01-13 20:45:57.411806966 +0000 UTC m=+1.180744149" Jan 13 20:45:59.880525 sudo[1862]: pam_unix(sudo:session): session closed for user root Jan 13 20:45:59.881289 sshd[1861]: Connection closed by 147.75.109.163 port 35252 Jan 13 20:45:59.881744 sshd-session[1859]: pam_unix(sshd:session): session closed for user core Jan 13 20:45:59.883675 systemd[1]: sshd@6-139.178.70.110:22-147.75.109.163:35252.service: Deactivated successfully. Jan 13 20:45:59.884910 systemd[1]: session-9.scope: Deactivated successfully. 
Jan 13 20:45:59.885020 systemd[1]: session-9.scope: Consumed 3.213s CPU time, 184.2M memory peak, 0B memory swap peak. Jan 13 20:45:59.885535 systemd-logind[1526]: Session 9 logged out. Waiting for processes to exit. Jan 13 20:45:59.886102 systemd-logind[1526]: Removed session 9. Jan 13 20:46:10.570637 kubelet[2851]: I0113 20:46:10.570620 2851 kuberuntime_manager.go:1529] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 13 20:46:10.571254 containerd[1547]: time="2025-01-13T20:46:10.571220039Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 13 20:46:10.571950 kubelet[2851]: I0113 20:46:10.571331 2851 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 13 20:46:11.629353 kubelet[2851]: I0113 20:46:11.629313 2851 topology_manager.go:215] "Topology Admit Handler" podUID="75c86ebd-8939-4ef2-b562-702533428ff8" podNamespace="kube-system" podName="kube-proxy-whb4p" Jan 13 20:46:11.640671 systemd[1]: Created slice kubepods-besteffort-pod75c86ebd_8939_4ef2_b562_702533428ff8.slice - libcontainer container kubepods-besteffort-pod75c86ebd_8939_4ef2_b562_702533428ff8.slice. 
Jan 13 20:46:11.705519 kubelet[2851]: I0113 20:46:11.705494 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg9f5\" (UniqueName: \"kubernetes.io/projected/75c86ebd-8939-4ef2-b562-702533428ff8-kube-api-access-vg9f5\") pod \"kube-proxy-whb4p\" (UID: \"75c86ebd-8939-4ef2-b562-702533428ff8\") " pod="kube-system/kube-proxy-whb4p" Jan 13 20:46:11.705659 kubelet[2851]: I0113 20:46:11.705650 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/75c86ebd-8939-4ef2-b562-702533428ff8-kube-proxy\") pod \"kube-proxy-whb4p\" (UID: \"75c86ebd-8939-4ef2-b562-702533428ff8\") " pod="kube-system/kube-proxy-whb4p" Jan 13 20:46:11.705742 kubelet[2851]: I0113 20:46:11.705716 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/75c86ebd-8939-4ef2-b562-702533428ff8-xtables-lock\") pod \"kube-proxy-whb4p\" (UID: \"75c86ebd-8939-4ef2-b562-702533428ff8\") " pod="kube-system/kube-proxy-whb4p" Jan 13 20:46:11.705774 kubelet[2851]: I0113 20:46:11.705748 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75c86ebd-8939-4ef2-b562-702533428ff8-lib-modules\") pod \"kube-proxy-whb4p\" (UID: \"75c86ebd-8939-4ef2-b562-702533428ff8\") " pod="kube-system/kube-proxy-whb4p" Jan 13 20:46:11.741628 kubelet[2851]: I0113 20:46:11.740602 2851 topology_manager.go:215] "Topology Admit Handler" podUID="54dd7a56-3c49-42da-80d3-4670ed48b514" podNamespace="tigera-operator" podName="tigera-operator-c7ccbd65-8mzfp" Jan 13 20:46:11.747958 systemd[1]: Created slice kubepods-besteffort-pod54dd7a56_3c49_42da_80d3_4670ed48b514.slice - libcontainer container kubepods-besteffort-pod54dd7a56_3c49_42da_80d3_4670ed48b514.slice. 
Jan 13 20:46:11.805883 kubelet[2851]: I0113 20:46:11.805863 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/54dd7a56-3c49-42da-80d3-4670ed48b514-var-lib-calico\") pod \"tigera-operator-c7ccbd65-8mzfp\" (UID: \"54dd7a56-3c49-42da-80d3-4670ed48b514\") " pod="tigera-operator/tigera-operator-c7ccbd65-8mzfp" Jan 13 20:46:11.806188 kubelet[2851]: I0113 20:46:11.806171 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sxg4\" (UniqueName: \"kubernetes.io/projected/54dd7a56-3c49-42da-80d3-4670ed48b514-kube-api-access-4sxg4\") pod \"tigera-operator-c7ccbd65-8mzfp\" (UID: \"54dd7a56-3c49-42da-80d3-4670ed48b514\") " pod="tigera-operator/tigera-operator-c7ccbd65-8mzfp" Jan 13 20:46:11.948054 containerd[1547]: time="2025-01-13T20:46:11.947975408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-whb4p,Uid:75c86ebd-8939-4ef2-b562-702533428ff8,Namespace:kube-system,Attempt:0,}" Jan 13 20:46:11.960322 containerd[1547]: time="2025-01-13T20:46:11.960248854Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:46:11.960322 containerd[1547]: time="2025-01-13T20:46:11.960290703Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:46:11.960322 containerd[1547]: time="2025-01-13T20:46:11.960298682Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:11.960556 containerd[1547]: time="2025-01-13T20:46:11.960372003Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:11.976444 systemd[1]: Started cri-containerd-d89c4f9bf191412dff3bbf1f3fc302eaa1bce4be46fcccea67abc16d8b4a9576.scope - libcontainer container d89c4f9bf191412dff3bbf1f3fc302eaa1bce4be46fcccea67abc16d8b4a9576. Jan 13 20:46:11.990559 containerd[1547]: time="2025-01-13T20:46:11.990535520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-whb4p,Uid:75c86ebd-8939-4ef2-b562-702533428ff8,Namespace:kube-system,Attempt:0,} returns sandbox id \"d89c4f9bf191412dff3bbf1f3fc302eaa1bce4be46fcccea67abc16d8b4a9576\"" Jan 13 20:46:11.993963 containerd[1547]: time="2025-01-13T20:46:11.993912248Z" level=info msg="CreateContainer within sandbox \"d89c4f9bf191412dff3bbf1f3fc302eaa1bce4be46fcccea67abc16d8b4a9576\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 13 20:46:12.005259 containerd[1547]: time="2025-01-13T20:46:12.005238197Z" level=info msg="CreateContainer within sandbox \"d89c4f9bf191412dff3bbf1f3fc302eaa1bce4be46fcccea67abc16d8b4a9576\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"3789da3f698723a04ea5b0828f0810a1fdcb591fdfed5d38d0ae905ac6177a07\"" Jan 13 20:46:12.005881 containerd[1547]: time="2025-01-13T20:46:12.005512776Z" level=info msg="StartContainer for \"3789da3f698723a04ea5b0828f0810a1fdcb591fdfed5d38d0ae905ac6177a07\"" Jan 13 20:46:12.022561 systemd[1]: Started cri-containerd-3789da3f698723a04ea5b0828f0810a1fdcb591fdfed5d38d0ae905ac6177a07.scope - libcontainer container 3789da3f698723a04ea5b0828f0810a1fdcb591fdfed5d38d0ae905ac6177a07. 
Jan 13 20:46:12.038785 containerd[1547]: time="2025-01-13T20:46:12.038761279Z" level=info msg="StartContainer for \"3789da3f698723a04ea5b0828f0810a1fdcb591fdfed5d38d0ae905ac6177a07\" returns successfully" Jan 13 20:46:12.050243 containerd[1547]: time="2025-01-13T20:46:12.050223525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-8mzfp,Uid:54dd7a56-3c49-42da-80d3-4670ed48b514,Namespace:tigera-operator,Attempt:0,}" Jan 13 20:46:12.065056 containerd[1547]: time="2025-01-13T20:46:12.064966775Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:46:12.065056 containerd[1547]: time="2025-01-13T20:46:12.065027470Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:46:12.065056 containerd[1547]: time="2025-01-13T20:46:12.065035098Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:12.065231 containerd[1547]: time="2025-01-13T20:46:12.065083267Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:12.076464 systemd[1]: Started cri-containerd-101d1a7c5c897090f848910ce2ac77549e5a92a304385c32e6b1d7613bf3b62c.scope - libcontainer container 101d1a7c5c897090f848910ce2ac77549e5a92a304385c32e6b1d7613bf3b62c. 
Jan 13 20:46:12.103178 containerd[1547]: time="2025-01-13T20:46:12.103154586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-8mzfp,Uid:54dd7a56-3c49-42da-80d3-4670ed48b514,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"101d1a7c5c897090f848910ce2ac77549e5a92a304385c32e6b1d7613bf3b62c\""
Jan 13 20:46:12.104151 containerd[1547]: time="2025-01-13T20:46:12.104130904Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\""
Jan 13 20:46:12.816010 systemd[1]: run-containerd-runc-k8s.io-d89c4f9bf191412dff3bbf1f3fc302eaa1bce4be46fcccea67abc16d8b4a9576-runc.ZmkePw.mount: Deactivated successfully.
Jan 13 20:46:13.353956 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3847808571.mount: Deactivated successfully.
Jan 13 20:46:13.628902 containerd[1547]: time="2025-01-13T20:46:13.628843785Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:13.629313 containerd[1547]: time="2025-01-13T20:46:13.629289737Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21764313"
Jan 13 20:46:13.629667 containerd[1547]: time="2025-01-13T20:46:13.629655875Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:13.631085 containerd[1547]: time="2025-01-13T20:46:13.631063812Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:13.631436 containerd[1547]: time="2025-01-13T20:46:13.631423701Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 1.527269157s"
Jan 13 20:46:13.631524 containerd[1547]: time="2025-01-13T20:46:13.631471987Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\""
Jan 13 20:46:13.651350 containerd[1547]: time="2025-01-13T20:46:13.651318198Z" level=info msg="CreateContainer within sandbox \"101d1a7c5c897090f848910ce2ac77549e5a92a304385c32e6b1d7613bf3b62c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Jan 13 20:46:13.657035 containerd[1547]: time="2025-01-13T20:46:13.657012863Z" level=info msg="CreateContainer within sandbox \"101d1a7c5c897090f848910ce2ac77549e5a92a304385c32e6b1d7613bf3b62c\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"604731157be9c09cd16febf2bb047a1dbc880bdc9539d2b31199485428b80382\""
Jan 13 20:46:13.657513 containerd[1547]: time="2025-01-13T20:46:13.657383378Z" level=info msg="StartContainer for \"604731157be9c09cd16febf2bb047a1dbc880bdc9539d2b31199485428b80382\""
Jan 13 20:46:13.657438 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3376275998.mount: Deactivated successfully.
Jan 13 20:46:13.673566 systemd[1]: Started cri-containerd-604731157be9c09cd16febf2bb047a1dbc880bdc9539d2b31199485428b80382.scope - libcontainer container 604731157be9c09cd16febf2bb047a1dbc880bdc9539d2b31199485428b80382.
Jan 13 20:46:13.690677 containerd[1547]: time="2025-01-13T20:46:13.689641171Z" level=info msg="StartContainer for \"604731157be9c09cd16febf2bb047a1dbc880bdc9539d2b31199485428b80382\" returns successfully"
Jan 13 20:46:14.415226 kubelet[2851]: I0113 20:46:14.415201 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-whb4p" podStartSLOduration=3.415162168 podStartE2EDuration="3.415162168s" podCreationTimestamp="2025-01-13 20:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:46:12.395404665 +0000 UTC m=+16.164341856" watchObservedRunningTime="2025-01-13 20:46:14.415162168 +0000 UTC m=+18.184099360"
Jan 13 20:46:14.415666 kubelet[2851]: I0113 20:46:14.415263 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-c7ccbd65-8mzfp" podStartSLOduration=1.8856312320000002 podStartE2EDuration="3.415247948s" podCreationTimestamp="2025-01-13 20:46:11 +0000 UTC" firstStartedPulling="2025-01-13 20:46:12.10383094 +0000 UTC m=+15.872768122" lastFinishedPulling="2025-01-13 20:46:13.633447655 +0000 UTC m=+17.402384838" observedRunningTime="2025-01-13 20:46:14.41392037 +0000 UTC m=+18.182857571" watchObservedRunningTime="2025-01-13 20:46:14.415247948 +0000 UTC m=+18.184185142"
Jan 13 20:46:16.394592 kubelet[2851]: I0113 20:46:16.393130 2851 topology_manager.go:215] "Topology Admit Handler" podUID="a51177ce-9c37-49f6-abc5-c724df388345" podNamespace="calico-system" podName="calico-typha-7566698459-vf98w"
Jan 13 20:46:16.403848 systemd[1]: Created slice kubepods-besteffort-poda51177ce_9c37_49f6_abc5_c724df388345.slice - libcontainer container kubepods-besteffort-poda51177ce_9c37_49f6_abc5_c724df388345.slice.
Jan 13 20:46:16.534639 kubelet[2851]: I0113 20:46:16.534621 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897xz\" (UniqueName: \"kubernetes.io/projected/a51177ce-9c37-49f6-abc5-c724df388345-kube-api-access-897xz\") pod \"calico-typha-7566698459-vf98w\" (UID: \"a51177ce-9c37-49f6-abc5-c724df388345\") " pod="calico-system/calico-typha-7566698459-vf98w"
Jan 13 20:46:16.538730 kubelet[2851]: I0113 20:46:16.534793 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a51177ce-9c37-49f6-abc5-c724df388345-tigera-ca-bundle\") pod \"calico-typha-7566698459-vf98w\" (UID: \"a51177ce-9c37-49f6-abc5-c724df388345\") " pod="calico-system/calico-typha-7566698459-vf98w"
Jan 13 20:46:16.538730 kubelet[2851]: I0113 20:46:16.534829 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a51177ce-9c37-49f6-abc5-c724df388345-typha-certs\") pod \"calico-typha-7566698459-vf98w\" (UID: \"a51177ce-9c37-49f6-abc5-c724df388345\") " pod="calico-system/calico-typha-7566698459-vf98w"
Jan 13 20:46:16.562094 kubelet[2851]: I0113 20:46:16.561626 2851 topology_manager.go:215] "Topology Admit Handler" podUID="4b98f3dc-ef49-4366-86fa-63e7279e3894" podNamespace="calico-system" podName="calico-node-pvrpb"
Jan 13 20:46:16.575983 systemd[1]: Created slice kubepods-besteffort-pod4b98f3dc_ef49_4366_86fa_63e7279e3894.slice - libcontainer container kubepods-besteffort-pod4b98f3dc_ef49_4366_86fa_63e7279e3894.slice.
Jan 13 20:46:16.681117 kubelet[2851]: I0113 20:46:16.681061 2851 topology_manager.go:215] "Topology Admit Handler" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" podNamespace="calico-system" podName="csi-node-driver-cj9xm"
Jan 13 20:46:16.701589 kubelet[2851]: E0113 20:46:16.681978 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54"
Jan 13 20:46:16.707072 containerd[1547]: time="2025-01-13T20:46:16.707048350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7566698459-vf98w,Uid:a51177ce-9c37-49f6-abc5-c724df388345,Namespace:calico-system,Attempt:0,}"
Jan 13 20:46:16.736726 kubelet[2851]: I0113 20:46:16.736639 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4b98f3dc-ef49-4366-86fa-63e7279e3894-node-certs\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.736726 kubelet[2851]: I0113 20:46:16.736667 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-var-lib-calico\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.736726 kubelet[2851]: I0113 20:46:16.736690 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr969\" (UniqueName: \"kubernetes.io/projected/4b98f3dc-ef49-4366-86fa-63e7279e3894-kube-api-access-hr969\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.736726 kubelet[2851]: I0113 20:46:16.736704 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-cni-log-dir\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.737443 kubelet[2851]: I0113 20:46:16.736747 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-policysync\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.737443 kubelet[2851]: I0113 20:46:16.736775 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b98f3dc-ef49-4366-86fa-63e7279e3894-tigera-ca-bundle\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.737443 kubelet[2851]: I0113 20:46:16.736788 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-var-run-calico\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.737443 kubelet[2851]: I0113 20:46:16.736804 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-cni-bin-dir\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.737443 kubelet[2851]: I0113 20:46:16.736816 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-cni-net-dir\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.737547 kubelet[2851]: I0113 20:46:16.736848 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-flexvol-driver-host\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.737547 kubelet[2851]: I0113 20:46:16.736862 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-xtables-lock\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.737547 kubelet[2851]: I0113 20:46:16.736887 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b98f3dc-ef49-4366-86fa-63e7279e3894-lib-modules\") pod \"calico-node-pvrpb\" (UID: \"4b98f3dc-ef49-4366-86fa-63e7279e3894\") " pod="calico-system/calico-node-pvrpb"
Jan 13 20:46:16.776642 containerd[1547]: time="2025-01-13T20:46:16.776577875Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:46:16.777971 containerd[1547]: time="2025-01-13T20:46:16.777898722Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:46:16.777971 containerd[1547]: time="2025-01-13T20:46:16.777933493Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:46:16.778142 containerd[1547]: time="2025-01-13T20:46:16.778020711Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:46:16.788043 systemd[1]: run-containerd-runc-k8s.io-8afce0ed7de35d2ace28c89deba01040efcd6dfb98812a610ab1a773a03d593d-runc.qUd5Bx.mount: Deactivated successfully.
Jan 13 20:46:16.795465 systemd[1]: Started cri-containerd-8afce0ed7de35d2ace28c89deba01040efcd6dfb98812a610ab1a773a03d593d.scope - libcontainer container 8afce0ed7de35d2ace28c89deba01040efcd6dfb98812a610ab1a773a03d593d.
Jan 13 20:46:16.839738 kubelet[2851]: I0113 20:46:16.837810 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7af6a31b-5e31-40c1-b6a8-196414f83e54-socket-dir\") pod \"csi-node-driver-cj9xm\" (UID: \"7af6a31b-5e31-40c1-b6a8-196414f83e54\") " pod="calico-system/csi-node-driver-cj9xm"
Jan 13 20:46:16.839738 kubelet[2851]: I0113 20:46:16.837842 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7af6a31b-5e31-40c1-b6a8-196414f83e54-registration-dir\") pod \"csi-node-driver-cj9xm\" (UID: \"7af6a31b-5e31-40c1-b6a8-196414f83e54\") " pod="calico-system/csi-node-driver-cj9xm"
Jan 13 20:46:16.839738 kubelet[2851]: I0113 20:46:16.837902 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7af6a31b-5e31-40c1-b6a8-196414f83e54-varrun\") pod \"csi-node-driver-cj9xm\" (UID: \"7af6a31b-5e31-40c1-b6a8-196414f83e54\") " pod="calico-system/csi-node-driver-cj9xm"
Jan 13 20:46:16.839738 kubelet[2851]: I0113 20:46:16.837913 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7af6a31b-5e31-40c1-b6a8-196414f83e54-kubelet-dir\") pod \"csi-node-driver-cj9xm\" (UID: \"7af6a31b-5e31-40c1-b6a8-196414f83e54\") " pod="calico-system/csi-node-driver-cj9xm"
Jan 13 20:46:16.839738 kubelet[2851]: I0113 20:46:16.837925 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq6g2\" (UniqueName: \"kubernetes.io/projected/7af6a31b-5e31-40c1-b6a8-196414f83e54-kube-api-access-tq6g2\") pod \"csi-node-driver-cj9xm\" (UID: \"7af6a31b-5e31-40c1-b6a8-196414f83e54\") " pod="calico-system/csi-node-driver-cj9xm"
Jan 13 20:46:16.840989 kubelet[2851]: E0113 20:46:16.840671 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.840989 kubelet[2851]: W0113 20:46:16.840680 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.840989 kubelet[2851]: E0113 20:46:16.840696 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.841041 containerd[1547]: time="2025-01-13T20:46:16.840584048Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7566698459-vf98w,Uid:a51177ce-9c37-49f6-abc5-c724df388345,Namespace:calico-system,Attempt:0,} returns sandbox id \"8afce0ed7de35d2ace28c89deba01040efcd6dfb98812a610ab1a773a03d593d\""
Jan 13 20:46:16.841960 containerd[1547]: time="2025-01-13T20:46:16.841780851Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\""
Jan 13 20:46:16.848084 kubelet[2851]: E0113 20:46:16.848043 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.848084 kubelet[2851]: W0113 20:46:16.848054 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.848084 kubelet[2851]: E0113 20:46:16.848066 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.879570 containerd[1547]: time="2025-01-13T20:46:16.879543754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pvrpb,Uid:4b98f3dc-ef49-4366-86fa-63e7279e3894,Namespace:calico-system,Attempt:0,}"
Jan 13 20:46:16.895031 containerd[1547]: time="2025-01-13T20:46:16.894978225Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:46:16.895183 containerd[1547]: time="2025-01-13T20:46:16.895035456Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:46:16.895183 containerd[1547]: time="2025-01-13T20:46:16.895055393Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:46:16.895227 containerd[1547]: time="2025-01-13T20:46:16.895133305Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:46:16.907462 systemd[1]: Started cri-containerd-524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d.scope - libcontainer container 524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d.
Jan 13 20:46:16.922363 containerd[1547]: time="2025-01-13T20:46:16.922321884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pvrpb,Uid:4b98f3dc-ef49-4366-86fa-63e7279e3894,Namespace:calico-system,Attempt:0,} returns sandbox id \"524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d\""
Jan 13 20:46:16.938634 kubelet[2851]: E0113 20:46:16.938554 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.938634 kubelet[2851]: W0113 20:46:16.938566 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.938634 kubelet[2851]: E0113 20:46:16.938577 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.939499 kubelet[2851]: E0113 20:46:16.939428 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.939499 kubelet[2851]: W0113 20:46:16.939435 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.939499 kubelet[2851]: E0113 20:46:16.939445 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940098 kubelet[2851]: E0113 20:46:16.939615 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940098 kubelet[2851]: W0113 20:46:16.939619 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940098 kubelet[2851]: E0113 20:46:16.939626 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940098 kubelet[2851]: E0113 20:46:16.939845 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940098 kubelet[2851]: W0113 20:46:16.939852 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940098 kubelet[2851]: E0113 20:46:16.939867 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940098 kubelet[2851]: E0113 20:46:16.939987 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940098 kubelet[2851]: W0113 20:46:16.939992 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940098 kubelet[2851]: E0113 20:46:16.940004 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940238 kubelet[2851]: E0113 20:46:16.940123 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940238 kubelet[2851]: W0113 20:46:16.940128 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940238 kubelet[2851]: E0113 20:46:16.940167 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940238 kubelet[2851]: E0113 20:46:16.940223 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940238 kubelet[2851]: W0113 20:46:16.940228 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940238 kubelet[2851]: E0113 20:46:16.940236 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940329 kubelet[2851]: E0113 20:46:16.940312 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940329 kubelet[2851]: W0113 20:46:16.940317 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940388 kubelet[2851]: E0113 20:46:16.940377 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940458 kubelet[2851]: E0113 20:46:16.940433 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940458 kubelet[2851]: W0113 20:46:16.940456 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940519 kubelet[2851]: E0113 20:46:16.940468 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940567 kubelet[2851]: E0113 20:46:16.940553 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940567 kubelet[2851]: W0113 20:46:16.940562 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940624 kubelet[2851]: E0113 20:46:16.940576 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.940711 kubelet[2851]: E0113 20:46:16.940661 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.940711 kubelet[2851]: W0113 20:46:16.940665 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.940711 kubelet[2851]: E0113 20:46:16.940673 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.941901 kubelet[2851]: E0113 20:46:16.941889 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.941901 kubelet[2851]: W0113 20:46:16.941897 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.941960 kubelet[2851]: E0113 20:46:16.941907 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.942007 kubelet[2851]: E0113 20:46:16.941999 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.942007 kubelet[2851]: W0113 20:46:16.942005 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.942084 kubelet[2851]: E0113 20:46:16.942018 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.942121 kubelet[2851]: E0113 20:46:16.942112 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.942121 kubelet[2851]: W0113 20:46:16.942118 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.942185 kubelet[2851]: E0113 20:46:16.942137 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.942212 kubelet[2851]: E0113 20:46:16.942205 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.942212 kubelet[2851]: W0113 20:46:16.942209 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.942291 kubelet[2851]: E0113 20:46:16.942279 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.942355 kubelet[2851]: E0113 20:46:16.942317 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.942355 kubelet[2851]: W0113 20:46:16.942321 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.942355 kubelet[2851]: E0113 20:46:16.942336 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.942618 kubelet[2851]: E0113 20:46:16.942411 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.942618 kubelet[2851]: W0113 20:46:16.942415 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.942618 kubelet[2851]: E0113 20:46:16.942496 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.942618 kubelet[2851]: E0113 20:46:16.942504 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.942618 kubelet[2851]: W0113 20:46:16.942508 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.942618 kubelet[2851]: E0113 20:46:16.942529 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.942859 kubelet[2851]: E0113 20:46:16.942661 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.942859 kubelet[2851]: W0113 20:46:16.942666 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.942859 kubelet[2851]: E0113 20:46:16.942676 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.942980 kubelet[2851]: E0113 20:46:16.942920 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.942980 kubelet[2851]: W0113 20:46:16.942926 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.942980 kubelet[2851]: E0113 20:46:16.942936 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.943132 kubelet[2851]: E0113 20:46:16.943034 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.943132 kubelet[2851]: W0113 20:46:16.943039 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.943132 kubelet[2851]: E0113 20:46:16.943048 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.943132 kubelet[2851]: E0113 20:46:16.943127 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.943132 kubelet[2851]: W0113 20:46:16.943132 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.943219 kubelet[2851]: E0113 20:46:16.943138 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:16.943219 kubelet[2851]: E0113 20:46:16.943215 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:16.943250 kubelet[2851]: W0113 20:46:16.943220 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:16.943250 kubelet[2851]: E0113 20:46:16.943225 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jan 13 20:46:16.943401 kubelet[2851]: E0113 20:46:16.943311 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:16.943401 kubelet[2851]: W0113 20:46:16.943317 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:16.943401 kubelet[2851]: E0113 20:46:16.943323 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:16.943755 kubelet[2851]: E0113 20:46:16.943562 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:16.943755 kubelet[2851]: W0113 20:46:16.943568 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:16.943755 kubelet[2851]: E0113 20:46:16.943575 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:16.947830 kubelet[2851]: E0113 20:46:16.947816 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:16.947870 kubelet[2851]: W0113 20:46:16.947839 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:16.947870 kubelet[2851]: E0113 20:46:16.947848 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:18.124590 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3875162192.mount: Deactivated successfully. Jan 13 20:46:18.319889 kubelet[2851]: E0113 20:46:18.319616 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" Jan 13 20:46:18.847743 containerd[1547]: time="2025-01-13T20:46:18.847665502Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:18.848378 containerd[1547]: time="2025-01-13T20:46:18.848201815Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Jan 13 20:46:18.848861 containerd[1547]: time="2025-01-13T20:46:18.848668442Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:18.850751 containerd[1547]: time="2025-01-13T20:46:18.850719398Z" level=info msg="ImageCreate 
event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:18.851661 containerd[1547]: time="2025-01-13T20:46:18.851633007Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.00983222s" Jan 13 20:46:18.851709 containerd[1547]: time="2025-01-13T20:46:18.851662348Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Jan 13 20:46:18.853044 containerd[1547]: time="2025-01-13T20:46:18.852261232Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 13 20:46:18.859492 containerd[1547]: time="2025-01-13T20:46:18.859466034Z" level=info msg="CreateContainer within sandbox \"8afce0ed7de35d2ace28c89deba01040efcd6dfb98812a610ab1a773a03d593d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 13 20:46:18.865138 containerd[1547]: time="2025-01-13T20:46:18.865119511Z" level=info msg="CreateContainer within sandbox \"8afce0ed7de35d2ace28c89deba01040efcd6dfb98812a610ab1a773a03d593d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"cafc6ef28dd8315edd1046d569fc0c8721dd65988a6e12d9eb95f275e1c1fb1c\"" Jan 13 20:46:18.865319 containerd[1547]: time="2025-01-13T20:46:18.865309300Z" level=info msg="StartContainer for \"cafc6ef28dd8315edd1046d569fc0c8721dd65988a6e12d9eb95f275e1c1fb1c\"" Jan 13 20:46:18.904521 systemd[1]: Started cri-containerd-cafc6ef28dd8315edd1046d569fc0c8721dd65988a6e12d9eb95f275e1c1fb1c.scope - libcontainer container 
cafc6ef28dd8315edd1046d569fc0c8721dd65988a6e12d9eb95f275e1c1fb1c. Jan 13 20:46:18.933157 containerd[1547]: time="2025-01-13T20:46:18.933071869Z" level=info msg="StartContainer for \"cafc6ef28dd8315edd1046d569fc0c8721dd65988a6e12d9eb95f275e1c1fb1c\" returns successfully" Jan 13 20:46:19.411909 kubelet[2851]: I0113 20:46:19.411446 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-7566698459-vf98w" podStartSLOduration=1.401053232 podStartE2EDuration="3.41141898s" podCreationTimestamp="2025-01-13 20:46:16 +0000 UTC" firstStartedPulling="2025-01-13 20:46:16.841566812 +0000 UTC m=+20.610503994" lastFinishedPulling="2025-01-13 20:46:18.851932555 +0000 UTC m=+22.620869742" observedRunningTime="2025-01-13 20:46:19.411251154 +0000 UTC m=+23.180188353" watchObservedRunningTime="2025-01-13 20:46:19.41141898 +0000 UTC m=+23.180356173" Jan 13 20:46:19.453970 kubelet[2851]: E0113 20:46:19.453907 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.453970 kubelet[2851]: W0113 20:46:19.453923 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.453970 kubelet[2851]: E0113 20:46:19.453938 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.454148 kubelet[2851]: E0113 20:46:19.454083 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.454148 kubelet[2851]: W0113 20:46:19.454090 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.454148 kubelet[2851]: E0113 20:46:19.454100 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.454243 kubelet[2851]: E0113 20:46:19.454214 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.454243 kubelet[2851]: W0113 20:46:19.454222 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.454243 kubelet[2851]: E0113 20:46:19.454230 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.454371 kubelet[2851]: E0113 20:46:19.454353 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.454371 kubelet[2851]: W0113 20:46:19.454361 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.454371 kubelet[2851]: E0113 20:46:19.454369 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.454516 kubelet[2851]: E0113 20:46:19.454501 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.454516 kubelet[2851]: W0113 20:46:19.454511 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.454583 kubelet[2851]: E0113 20:46:19.454520 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.454641 kubelet[2851]: E0113 20:46:19.454630 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.454641 kubelet[2851]: W0113 20:46:19.454639 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.454707 kubelet[2851]: E0113 20:46:19.454649 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.454776 kubelet[2851]: E0113 20:46:19.454756 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.454776 kubelet[2851]: W0113 20:46:19.454766 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.454776 kubelet[2851]: E0113 20:46:19.454774 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.455146 kubelet[2851]: E0113 20:46:19.454888 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.455146 kubelet[2851]: W0113 20:46:19.454895 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.455146 kubelet[2851]: E0113 20:46:19.454906 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.455146 kubelet[2851]: E0113 20:46:19.455053 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.455146 kubelet[2851]: W0113 20:46:19.455059 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.455146 kubelet[2851]: E0113 20:46:19.455067 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.455392 kubelet[2851]: E0113 20:46:19.455178 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.455392 kubelet[2851]: W0113 20:46:19.455184 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.455392 kubelet[2851]: E0113 20:46:19.455192 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.455392 kubelet[2851]: E0113 20:46:19.455298 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.455392 kubelet[2851]: W0113 20:46:19.455303 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.455392 kubelet[2851]: E0113 20:46:19.455311 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.455580 kubelet[2851]: E0113 20:46:19.455433 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.455580 kubelet[2851]: W0113 20:46:19.455439 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.455580 kubelet[2851]: E0113 20:46:19.455446 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.456038 kubelet[2851]: E0113 20:46:19.455589 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.456038 kubelet[2851]: W0113 20:46:19.455595 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.456038 kubelet[2851]: E0113 20:46:19.455603 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.456038 kubelet[2851]: E0113 20:46:19.455715 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.456038 kubelet[2851]: W0113 20:46:19.455721 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.456038 kubelet[2851]: E0113 20:46:19.455730 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.456038 kubelet[2851]: E0113 20:46:19.455853 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.456038 kubelet[2851]: W0113 20:46:19.455860 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.456038 kubelet[2851]: E0113 20:46:19.455871 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.456038 kubelet[2851]: E0113 20:46:19.456016 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.456523 kubelet[2851]: W0113 20:46:19.456022 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.456523 kubelet[2851]: E0113 20:46:19.456030 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.456523 kubelet[2851]: E0113 20:46:19.456181 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.456523 kubelet[2851]: W0113 20:46:19.456187 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.456523 kubelet[2851]: E0113 20:46:19.456199 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.456958 kubelet[2851]: E0113 20:46:19.456699 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.456958 kubelet[2851]: W0113 20:46:19.456710 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.456958 kubelet[2851]: E0113 20:46:19.456728 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.456958 kubelet[2851]: E0113 20:46:19.456872 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.456958 kubelet[2851]: W0113 20:46:19.456880 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.456958 kubelet[2851]: E0113 20:46:19.456900 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.457268 kubelet[2851]: E0113 20:46:19.457035 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.457268 kubelet[2851]: W0113 20:46:19.457042 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.457268 kubelet[2851]: E0113 20:46:19.457056 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.457552 kubelet[2851]: E0113 20:46:19.457434 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.457552 kubelet[2851]: W0113 20:46:19.457442 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.457552 kubelet[2851]: E0113 20:46:19.457456 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.457758 kubelet[2851]: E0113 20:46:19.457567 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.457758 kubelet[2851]: W0113 20:46:19.457574 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.457758 kubelet[2851]: E0113 20:46:19.457583 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.457758 kubelet[2851]: E0113 20:46:19.457699 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.457758 kubelet[2851]: W0113 20:46:19.457705 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.457758 kubelet[2851]: E0113 20:46:19.457713 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.457918 kubelet[2851]: E0113 20:46:19.457817 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.457918 kubelet[2851]: W0113 20:46:19.457822 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.457918 kubelet[2851]: E0113 20:46:19.457839 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.457982 kubelet[2851]: E0113 20:46:19.457947 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.457982 kubelet[2851]: W0113 20:46:19.457953 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.457982 kubelet[2851]: E0113 20:46:19.457963 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.458119 kubelet[2851]: E0113 20:46:19.458102 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.458119 kubelet[2851]: W0113 20:46:19.458113 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.458119 kubelet[2851]: E0113 20:46:19.458125 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.458369 kubelet[2851]: E0113 20:46:19.458356 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.458369 kubelet[2851]: W0113 20:46:19.458364 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.458438 kubelet[2851]: E0113 20:46:19.458375 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.458583 kubelet[2851]: E0113 20:46:19.458570 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.458583 kubelet[2851]: W0113 20:46:19.458580 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.458651 kubelet[2851]: E0113 20:46:19.458597 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.458738 kubelet[2851]: E0113 20:46:19.458723 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.458738 kubelet[2851]: W0113 20:46:19.458732 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.458824 kubelet[2851]: E0113 20:46:19.458741 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:46:19.459140 kubelet[2851]: E0113 20:46:19.458844 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.459140 kubelet[2851]: W0113 20:46:19.458852 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.459140 kubelet[2851]: E0113 20:46:19.458859 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:46:19.459140 kubelet[2851]: E0113 20:46:19.458964 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:46:19.459140 kubelet[2851]: W0113 20:46:19.458970 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:46:19.459140 kubelet[2851]: E0113 20:46:19.458978 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 13 20:46:19.459140 kubelet[2851]: E0113 20:46:19.459128 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:19.459140 kubelet[2851]: W0113 20:46:19.459134 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:19.459140 kubelet[2851]: E0113 20:46:19.459142 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:19.459496 kubelet[2851]: E0113 20:46:19.459483 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:19.459496 kubelet[2851]: W0113 20:46:19.459493 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:19.459559 kubelet[2851]: E0113 20:46:19.459502 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.319415 kubelet[2851]: E0113 20:46:20.319391 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54"
Jan 13 20:46:20.336319 containerd[1547]: time="2025-01-13T20:46:20.335886948Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:20.345175 containerd[1547]: time="2025-01-13T20:46:20.345148506Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121"
Jan 13 20:46:20.350284 containerd[1547]: time="2025-01-13T20:46:20.350266584Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:20.355385 containerd[1547]: time="2025-01-13T20:46:20.355360213Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:20.355937 containerd[1547]: time="2025-01-13T20:46:20.355653822Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.503360201s"
Jan 13 20:46:20.355937 containerd[1547]: time="2025-01-13T20:46:20.355671770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\""
Jan 13 20:46:20.356669 containerd[1547]: time="2025-01-13T20:46:20.356651395Z" level=info msg="CreateContainer within sandbox \"524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 13 20:46:20.405903 kubelet[2851]: I0113 20:46:20.405521 2851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 20:46:20.412005 containerd[1547]: time="2025-01-13T20:46:20.411982331Z" level=info msg="CreateContainer within sandbox \"524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa\""
Jan 13 20:46:20.412501 containerd[1547]: time="2025-01-13T20:46:20.412484076Z" level=info msg="StartContainer for \"b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa\""
Jan 13 20:46:20.434463 systemd[1]: Started cri-containerd-b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa.scope - libcontainer container b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa.
Jan 13 20:46:20.462364 kubelet[2851]: E0113 20:46:20.462257 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.462364 kubelet[2851]: W0113 20:46:20.462273 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.462364 kubelet[2851]: E0113 20:46:20.462295 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.466885 kubelet[2851]: E0113 20:46:20.462430 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.466885 kubelet[2851]: W0113 20:46:20.462435 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.466885 kubelet[2851]: E0113 20:46:20.462446 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.466885 kubelet[2851]: E0113 20:46:20.462817 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.466885 kubelet[2851]: W0113 20:46:20.462825 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.466885 kubelet[2851]: E0113 20:46:20.462834 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.466885 kubelet[2851]: E0113 20:46:20.462951 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.466885 kubelet[2851]: W0113 20:46:20.462956 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.466885 kubelet[2851]: E0113 20:46:20.462986 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.466885 kubelet[2851]: E0113 20:46:20.463090 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.467082 containerd[1547]: time="2025-01-13T20:46:20.465580901Z" level=info msg="StartContainer for \"b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa\" returns successfully"
Jan 13 20:46:20.467117 kubelet[2851]: W0113 20:46:20.463095 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.467117 kubelet[2851]: E0113 20:46:20.463112 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.467117 kubelet[2851]: E0113 20:46:20.463219 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.467117 kubelet[2851]: W0113 20:46:20.463225 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.467117 kubelet[2851]: E0113 20:46:20.463235 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.467117 kubelet[2851]: E0113 20:46:20.463319 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.467117 kubelet[2851]: W0113 20:46:20.463337 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.467117 kubelet[2851]: E0113 20:46:20.463370 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.467117 kubelet[2851]: E0113 20:46:20.463459 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.467117 kubelet[2851]: W0113 20:46:20.463467 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.467329 kubelet[2851]: E0113 20:46:20.463479 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.467329 kubelet[2851]: E0113 20:46:20.463594 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.467329 kubelet[2851]: W0113 20:46:20.463599 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.467329 kubelet[2851]: E0113 20:46:20.463606 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.467329 kubelet[2851]: E0113 20:46:20.463689 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.467329 kubelet[2851]: W0113 20:46:20.463693 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.467329 kubelet[2851]: E0113 20:46:20.463699 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.467329 kubelet[2851]: E0113 20:46:20.463783 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.467329 kubelet[2851]: W0113 20:46:20.463787 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.467329 kubelet[2851]: E0113 20:46:20.463796 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.474200 kubelet[2851]: E0113 20:46:20.463912 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.474200 kubelet[2851]: W0113 20:46:20.463918 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.474200 kubelet[2851]: E0113 20:46:20.463928 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.474200 kubelet[2851]: E0113 20:46:20.464024 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.474200 kubelet[2851]: W0113 20:46:20.464032 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.474200 kubelet[2851]: E0113 20:46:20.464049 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.474200 kubelet[2851]: E0113 20:46:20.464182 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.474200 kubelet[2851]: W0113 20:46:20.464189 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.474200 kubelet[2851]: E0113 20:46:20.464214 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.474200 kubelet[2851]: E0113 20:46:20.464308 2851 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:46:20.472076 systemd[1]: cri-containerd-b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa.scope: Deactivated successfully.
Jan 13 20:46:20.474699 kubelet[2851]: W0113 20:46:20.464312 2851 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:46:20.474699 kubelet[2851]: E0113 20:46:20.464319 2851 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:46:20.488892 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa-rootfs.mount: Deactivated successfully.
Jan 13 20:46:20.633872 containerd[1547]: time="2025-01-13T20:46:20.626808170Z" level=info msg="shim disconnected" id=b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa namespace=k8s.io
Jan 13 20:46:20.633872 containerd[1547]: time="2025-01-13T20:46:20.633831186Z" level=warning msg="cleaning up after shim disconnected" id=b5a6424294a08a84539a923837a69f87ee2b454e0ea1be8cd929321bf802e2fa namespace=k8s.io
Jan 13 20:46:20.633872 containerd[1547]: time="2025-01-13T20:46:20.633849160Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:46:21.408162 containerd[1547]: time="2025-01-13T20:46:21.408094650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\""
Jan 13 20:46:22.320747 kubelet[2851]: E0113 20:46:22.320726 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54"
Jan 13 20:46:24.293896 containerd[1547]: time="2025-01-13T20:46:24.293866234Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:24.296272 containerd[1547]: time="2025-01-13T20:46:24.296253290Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:24.297176 containerd[1547]: time="2025-01-13T20:46:24.296622328Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154"
Jan 13 20:46:24.298051 containerd[1547]: time="2025-01-13T20:46:24.298023131Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:46:24.298608 containerd[1547]: time="2025-01-13T20:46:24.298281402Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 2.890125418s"
Jan 13 20:46:24.298608 containerd[1547]: time="2025-01-13T20:46:24.298299744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\""
Jan 13 20:46:24.299588 containerd[1547]: time="2025-01-13T20:46:24.299568455Z" level=info msg="CreateContainer within sandbox \"524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Jan 13 20:46:24.319203 kubelet[2851]: E0113 20:46:24.319185 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54"
Jan 13 20:46:24.334487 containerd[1547]: time="2025-01-13T20:46:24.334456074Z" level=info msg="CreateContainer within sandbox \"524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2\""
Jan 13 20:46:24.335487 containerd[1547]: time="2025-01-13T20:46:24.335469870Z" level=info msg="StartContainer for \"0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2\""
Jan 13 20:46:24.388443 systemd[1]: Started cri-containerd-0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2.scope - libcontainer container 0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2.
Jan 13 20:46:24.405978 containerd[1547]: time="2025-01-13T20:46:24.405954390Z" level=info msg="StartContainer for \"0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2\" returns successfully"
Jan 13 20:46:25.567229 systemd[1]: cri-containerd-0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2.scope: Deactivated successfully.
Jan 13 20:46:25.587127 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2-rootfs.mount: Deactivated successfully.
Jan 13 20:46:25.587466 containerd[1547]: time="2025-01-13T20:46:25.587424372Z" level=info msg="shim disconnected" id=0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2 namespace=k8s.io
Jan 13 20:46:25.587466 containerd[1547]: time="2025-01-13T20:46:25.587463685Z" level=warning msg="cleaning up after shim disconnected" id=0cb95b28e717ba535998b6f974b44f04c7a67795dc6a8ef26c518cb9b1d740f2 namespace=k8s.io
Jan 13 20:46:25.587651 containerd[1547]: time="2025-01-13T20:46:25.587469324Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:46:25.599110 kubelet[2851]: I0113 20:46:25.599085 2851 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Jan 13 20:46:25.628924 kubelet[2851]: I0113 20:46:25.628683 2851 topology_manager.go:215] "Topology Admit Handler" podUID="6ab74c68-2020-4618-bcbe-672227cc6fc9" podNamespace="kube-system" podName="coredns-76f75df574-nx8p5"
Jan 13 20:46:25.628924 kubelet[2851]: I0113 20:46:25.629281 2851 topology_manager.go:215] "Topology Admit Handler" podUID="662f5a4d-917d-45ec-97a1-d70b9c8e2f05" podNamespace="calico-apiserver" podName="calico-apiserver-5d7876745f-sk2dr"
Jan 13 20:46:25.640676 kubelet[2851]: I0113 20:46:25.640658 2851 topology_manager.go:215] "Topology Admit Handler" podUID="0016e978-2a6d-4be0-ad22-af8c555426bb" podNamespace="calico-system" podName="calico-kube-controllers-66f4bb6d79-zhjcz"
Jan 13 20:46:25.641361 kubelet[2851]: I0113 20:46:25.640908 2851 topology_manager.go:215] "Topology Admit Handler" podUID="6953b35c-8169-44aa-91cb-7dd3f8f9aade" podNamespace="calico-apiserver" podName="calico-apiserver-5d7876745f-vvftf"
Jan 13 20:46:25.643963 kubelet[2851]: I0113 20:46:25.643538 2851 topology_manager.go:215] "Topology Admit Handler" podUID="45d20c01-0698-463e-b647-27ec83c8d824" podNamespace="kube-system" podName="coredns-76f75df574-75b2t"
Jan 13 20:46:25.649903 systemd[1]: Created slice kubepods-burstable-pod6ab74c68_2020_4618_bcbe_672227cc6fc9.slice - libcontainer container kubepods-burstable-pod6ab74c68_2020_4618_bcbe_672227cc6fc9.slice.
Jan 13 20:46:25.656118 systemd[1]: Created slice kubepods-besteffort-pod6953b35c_8169_44aa_91cb_7dd3f8f9aade.slice - libcontainer container kubepods-besteffort-pod6953b35c_8169_44aa_91cb_7dd3f8f9aade.slice.
Jan 13 20:46:25.660779 systemd[1]: Created slice kubepods-besteffort-pod0016e978_2a6d_4be0_ad22_af8c555426bb.slice - libcontainer container kubepods-besteffort-pod0016e978_2a6d_4be0_ad22_af8c555426bb.slice.
Jan 13 20:46:25.665127 systemd[1]: Created slice kubepods-besteffort-pod662f5a4d_917d_45ec_97a1_d70b9c8e2f05.slice - libcontainer container kubepods-besteffort-pod662f5a4d_917d_45ec_97a1_d70b9c8e2f05.slice.
Jan 13 20:46:25.669382 systemd[1]: Created slice kubepods-burstable-pod45d20c01_0698_463e_b647_27ec83c8d824.slice - libcontainer container kubepods-burstable-pod45d20c01_0698_463e_b647_27ec83c8d824.slice.
Jan 13 20:46:25.761203 kubelet[2851]: I0113 20:46:25.761167 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwv9c\" (UniqueName: \"kubernetes.io/projected/662f5a4d-917d-45ec-97a1-d70b9c8e2f05-kube-api-access-bwv9c\") pod \"calico-apiserver-5d7876745f-sk2dr\" (UID: \"662f5a4d-917d-45ec-97a1-d70b9c8e2f05\") " pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr"
Jan 13 20:46:25.761322 kubelet[2851]: I0113 20:46:25.761215 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6953b35c-8169-44aa-91cb-7dd3f8f9aade-calico-apiserver-certs\") pod \"calico-apiserver-5d7876745f-vvftf\" (UID: \"6953b35c-8169-44aa-91cb-7dd3f8f9aade\") " pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf"
Jan 13 20:46:25.761322 kubelet[2851]: I0113 20:46:25.761238 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hpjx\" (UniqueName: \"kubernetes.io/projected/0016e978-2a6d-4be0-ad22-af8c555426bb-kube-api-access-7hpjx\") pod \"calico-kube-controllers-66f4bb6d79-zhjcz\" (UID: \"0016e978-2a6d-4be0-ad22-af8c555426bb\") " pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz"
Jan 13 20:46:25.761322 kubelet[2851]: I0113 20:46:25.761255 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxfz9\" (UniqueName: \"kubernetes.io/projected/45d20c01-0698-463e-b647-27ec83c8d824-kube-api-access-hxfz9\") pod \"coredns-76f75df574-75b2t\" (UID: \"45d20c01-0698-463e-b647-27ec83c8d824\") " pod="kube-system/coredns-76f75df574-75b2t"
Jan 13 20:46:25.761322 kubelet[2851]: I0113 20:46:25.761271 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtt89\" (UniqueName: \"kubernetes.io/projected/6953b35c-8169-44aa-91cb-7dd3f8f9aade-kube-api-access-rtt89\") pod \"calico-apiserver-5d7876745f-vvftf\" (UID: \"6953b35c-8169-44aa-91cb-7dd3f8f9aade\") " pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf"
Jan 13 20:46:25.761322 kubelet[2851]: I0113 20:46:25.761288 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ab74c68-2020-4618-bcbe-672227cc6fc9-config-volume\") pod \"coredns-76f75df574-nx8p5\" (UID: \"6ab74c68-2020-4618-bcbe-672227cc6fc9\") " pod="kube-system/coredns-76f75df574-nx8p5"
Jan 13 20:46:25.761490 kubelet[2851]: I0113 20:46:25.761306 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drq5j\" (UniqueName: \"kubernetes.io/projected/6ab74c68-2020-4618-bcbe-672227cc6fc9-kube-api-access-drq5j\") pod \"coredns-76f75df574-nx8p5\" (UID: \"6ab74c68-2020-4618-bcbe-672227cc6fc9\") " pod="kube-system/coredns-76f75df574-nx8p5"
Jan 13 20:46:25.761490 kubelet[2851]: I0113 20:46:25.761325 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/662f5a4d-917d-45ec-97a1-d70b9c8e2f05-calico-apiserver-certs\") pod \"calico-apiserver-5d7876745f-sk2dr\" (UID: \"662f5a4d-917d-45ec-97a1-d70b9c8e2f05\") " pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr"
Jan 13 20:46:25.761490 kubelet[2851]: I0113 20:46:25.761365 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45d20c01-0698-463e-b647-27ec83c8d824-config-volume\") pod \"coredns-76f75df574-75b2t\" (UID: \"45d20c01-0698-463e-b647-27ec83c8d824\") " pod="kube-system/coredns-76f75df574-75b2t"
Jan 13 20:46:25.761490 kubelet[2851]: I0113 20:46:25.761388 2851 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0016e978-2a6d-4be0-ad22-af8c555426bb-tigera-ca-bundle\") pod \"calico-kube-controllers-66f4bb6d79-zhjcz\" (UID: \"0016e978-2a6d-4be0-ad22-af8c555426bb\") " pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz"
Jan 13 20:46:25.954833 containerd[1547]: time="2025-01-13T20:46:25.954762957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:0,}"
Jan 13 20:46:25.959120 containerd[1547]: time="2025-01-13T20:46:25.959105985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:0,}"
Jan 13 20:46:25.980238 containerd[1547]: time="2025-01-13T20:46:25.980007863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:0,}"
Jan 13 20:46:25.984117 containerd[1547]: time="2025-01-13T20:46:25.984083710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:0,}"
Jan 13 20:46:25.985725 containerd[1547]: time="2025-01-13T20:46:25.985618413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:0,}"
Jan 13 20:46:25.992712 kubelet[2851]: I0113 20:46:25.992367 2851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 20:46:26.216461 containerd[1547]: time="2025-01-13T20:46:26.216417425Z" level=error msg="Failed to destroy network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.221221 containerd[1547]: time="2025-01-13T20:46:26.221200892Z" level=error msg="encountered an error cleaning up failed sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.221289 containerd[1547]: time="2025-01-13T20:46:26.221245023Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.222470 containerd[1547]: time="2025-01-13T20:46:26.222449264Z" level=error msg="Failed to destroy network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.224337 containerd[1547]: time="2025-01-13T20:46:26.223843628Z" level=error msg="Failed to destroy network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.224337 containerd[1547]: time="2025-01-13T20:46:26.224153044Z" level=error msg="encountered an error cleaning up failed sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.224337 containerd[1547]: time="2025-01-13T20:46:26.224186225Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.227032 containerd[1547]: time="2025-01-13T20:46:26.227014190Z" level=error msg="encountered an error cleaning up failed sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.227120 containerd[1547]: time="2025-01-13T20:46:26.227107686Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.229278 containerd[1547]: time="2025-01-13T20:46:26.228990595Z" level=error msg="Failed to destroy network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.229278 containerd[1547]: time="2025-01-13T20:46:26.229200541Z" level=error msg="encountered an error cleaning up failed sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.229278 containerd[1547]: time="2025-01-13T20:46:26.229226065Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.229814 containerd[1547]: time="2025-01-13T20:46:26.229777616Z" level=error msg="Failed to destroy network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.230020 containerd[1547]: time="2025-01-13T20:46:26.229963645Z" level=error msg="encountered an error cleaning up failed sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.230020 containerd[1547]: time="2025-01-13T20:46:26.229987726Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.230731 kubelet[2851]: E0113 20:46:26.230609 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.230731 kubelet[2851]: E0113 20:46:26.230609 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.230731 kubelet[2851]: E0113 20:46:26.230643 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:46:26.230731 kubelet[2851]: E0113 20:46:26.230651 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5"
Jan 13 20:46:26.231372 kubelet[2851]: E0113 20:46:26.230659 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf"
Jan 13 20:46:26.231372 kubelet[2851]: E0113 20:46:26.230667 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5"
Jan 13 20:46:26.231372 kubelet[2851]: E0113 20:46:26.230673 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf"
Jan 13 20:46:26.231693 kubelet[2851]: E0113 20:46:26.230701 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" podUID="6953b35c-8169-44aa-91cb-7dd3f8f9aade"
Jan 13 20:46:26.231693 kubelet[2851]: E0113 20:46:26.230701 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-nx8p5" podUID="6ab74c68-2020-4618-bcbe-672227cc6fc9"
Jan 13 20:46:26.231693 kubelet[2851]: E0113 20:46:26.230754 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\": plugin
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.231795 kubelet[2851]: E0113 20:46:26.230768 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:26.231795 kubelet[2851]: E0113 20:46:26.230778 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:26.231795 kubelet[2851]: E0113 20:46:26.230799 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" 
podUID="0016e978-2a6d-4be0-ad22-af8c555426bb" Jan 13 20:46:26.231865 kubelet[2851]: E0113 20:46:26.230814 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.231865 kubelet[2851]: E0113 20:46:26.230825 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:26.231865 kubelet[2851]: E0113 20:46:26.230834 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:26.231920 kubelet[2851]: E0113 20:46:26.230851 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" podUID="662f5a4d-917d-45ec-97a1-d70b9c8e2f05" Jan 13 20:46:26.231920 kubelet[2851]: E0113 20:46:26.230870 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:26.231920 kubelet[2851]: E0113 20:46:26.230881 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:26.231988 kubelet[2851]: E0113 20:46:26.230903 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-75b2t" podUID="45d20c01-0698-463e-b647-27ec83c8d824" Jan 13 20:46:26.326330 systemd[1]: Created slice kubepods-besteffort-pod7af6a31b_5e31_40c1_b6a8_196414f83e54.slice - libcontainer container kubepods-besteffort-pod7af6a31b_5e31_40c1_b6a8_196414f83e54.slice. Jan 13 20:46:26.327749 containerd[1547]: time="2025-01-13T20:46:26.327726789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:0,}" Jan 13 20:46:26.365532 containerd[1547]: time="2025-01-13T20:46:26.365500261Z" level=error msg="Failed to destroy network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.365710 containerd[1547]: time="2025-01-13T20:46:26.365688454Z" level=error msg="encountered an error cleaning up failed sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.365741 containerd[1547]: time="2025-01-13T20:46:26.365725442Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.365902 
kubelet[2851]: E0113 20:46:26.365885 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.365947 kubelet[2851]: E0113 20:46:26.365921 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:26.365947 kubelet[2851]: E0113 20:46:26.365934 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:26.365993 kubelet[2851]: E0113 20:46:26.365979 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" Jan 13 20:46:26.416229 kubelet[2851]: I0113 20:46:26.415998 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2" Jan 13 20:46:26.423518 kubelet[2851]: I0113 20:46:26.423496 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd" Jan 13 20:46:26.425599 kubelet[2851]: I0113 20:46:26.425490 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87" Jan 13 20:46:26.426866 kubelet[2851]: I0113 20:46:26.426846 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492" Jan 13 20:46:26.445354 containerd[1547]: time="2025-01-13T20:46:26.445210686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 13 20:46:26.446117 kubelet[2851]: I0113 20:46:26.445874 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2" Jan 13 20:46:26.447609 kubelet[2851]: I0113 20:46:26.447601 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a" Jan 13 20:46:26.451906 containerd[1547]: time="2025-01-13T20:46:26.451649678Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:26.451906 containerd[1547]: time="2025-01-13T20:46:26.451820120Z" level=info msg="Ensure that sandbox 
67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a in task-service has been cleanup successfully" Jan 13 20:46:26.452146 containerd[1547]: time="2025-01-13T20:46:26.452133713Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:26.452368 containerd[1547]: time="2025-01-13T20:46:26.452276707Z" level=info msg="Ensure that sandbox 2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2 in task-service has been cleanup successfully" Jan 13 20:46:26.452832 containerd[1547]: time="2025-01-13T20:46:26.452442246Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:26.452832 containerd[1547]: time="2025-01-13T20:46:26.452452615Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:26.452832 containerd[1547]: time="2025-01-13T20:46:26.452500323Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:26.452832 containerd[1547]: time="2025-01-13T20:46:26.452576274Z" level=info msg="Ensure that sandbox 83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd in task-service has been cleanup successfully" Jan 13 20:46:26.453614 containerd[1547]: time="2025-01-13T20:46:26.453137394Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:26.453614 containerd[1547]: time="2025-01-13T20:46:26.453226208Z" level=info msg="Ensure that sandbox 4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2 in task-service has been cleanup successfully" Jan 13 20:46:26.453614 containerd[1547]: time="2025-01-13T20:46:26.453535926Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:26.453614 
containerd[1547]: time="2025-01-13T20:46:26.453553256Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:26.454026 containerd[1547]: time="2025-01-13T20:46:26.453884012Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:26.454026 containerd[1547]: time="2025-01-13T20:46:26.453976082Z" level=info msg="Ensure that sandbox 48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492 in task-service has been cleanup successfully" Jan 13 20:46:26.454111 containerd[1547]: time="2025-01-13T20:46:26.454102258Z" level=info msg="TearDown network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:26.454146 containerd[1547]: time="2025-01-13T20:46:26.454138969Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:26.454431 containerd[1547]: time="2025-01-13T20:46:26.454195699Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:26.454431 containerd[1547]: time="2025-01-13T20:46:26.454292890Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:26.454431 containerd[1547]: time="2025-01-13T20:46:26.454204048Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:26.454431 containerd[1547]: time="2025-01-13T20:46:26.454385102Z" level=info msg="Ensure that sandbox 183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87 in task-service has been cleanup successfully" Jan 13 20:46:26.454523 containerd[1547]: time="2025-01-13T20:46:26.454251567Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:1,}" Jan 13 20:46:26.454697 containerd[1547]: time="2025-01-13T20:46:26.454518118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:1,}" Jan 13 20:46:26.456581 containerd[1547]: time="2025-01-13T20:46:26.456504842Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:26.456581 containerd[1547]: time="2025-01-13T20:46:26.456518110Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:26.456581 containerd[1547]: time="2025-01-13T20:46:26.456554653Z" level=info msg="TearDown network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" successfully" Jan 13 20:46:26.456581 containerd[1547]: time="2025-01-13T20:46:26.456560726Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:26.457757 containerd[1547]: time="2025-01-13T20:46:26.457723051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:1,}" Jan 13 20:46:26.458313 containerd[1547]: time="2025-01-13T20:46:26.458185599Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:1,}" Jan 13 20:46:26.458738 containerd[1547]: time="2025-01-13T20:46:26.458635657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:1,}" Jan 13 20:46:26.460609 containerd[1547]: 
time="2025-01-13T20:46:26.460285739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:1,}" Jan 13 20:46:26.521453 containerd[1547]: time="2025-01-13T20:46:26.521336084Z" level=error msg="Failed to destroy network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.523555 containerd[1547]: time="2025-01-13T20:46:26.523473096Z" level=error msg="encountered an error cleaning up failed sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.523555 containerd[1547]: time="2025-01-13T20:46:26.523528718Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.525187 kubelet[2851]: E0113 20:46:26.524835 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" Jan 13 20:46:26.525187 kubelet[2851]: E0113 20:46:26.524979 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:26.525187 kubelet[2851]: E0113 20:46:26.524997 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:26.525268 kubelet[2851]: E0113 20:46:26.525142 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-75b2t" podUID="45d20c01-0698-463e-b647-27ec83c8d824" Jan 13 20:46:26.535461 containerd[1547]: time="2025-01-13T20:46:26.535412983Z" level=error msg="Failed to destroy network for sandbox 
\"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.535640 containerd[1547]: time="2025-01-13T20:46:26.535616301Z" level=error msg="encountered an error cleaning up failed sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.535665 containerd[1547]: time="2025-01-13T20:46:26.535648668Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.536487 kubelet[2851]: E0113 20:46:26.535805 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.536487 kubelet[2851]: E0113 20:46:26.535836 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:26.536487 kubelet[2851]: E0113 20:46:26.535853 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:26.536582 kubelet[2851]: E0113 20:46:26.535885 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" podUID="662f5a4d-917d-45ec-97a1-d70b9c8e2f05" Jan 13 20:46:26.581976 containerd[1547]: time="2025-01-13T20:46:26.581941474Z" level=error msg="Failed to destroy network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.582373 
containerd[1547]: time="2025-01-13T20:46:26.582147329Z" level=error msg="encountered an error cleaning up failed sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.582373 containerd[1547]: time="2025-01-13T20:46:26.582185157Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.583071 kubelet[2851]: E0113 20:46:26.582523 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.583071 kubelet[2851]: E0113 20:46:26.582560 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:26.583071 kubelet[2851]: E0113 20:46:26.582579 2851 kuberuntime_manager.go:1172] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:26.583187 kubelet[2851]: E0113 20:46:26.582609 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" Jan 13 20:46:26.592401 systemd[1]: run-netns-cni\x2d47aa5f98\x2d93d5\x2dc6a7\x2d9631\x2d6a432a754d35.mount: Deactivated successfully. Jan 13 20:46:26.592457 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492-shm.mount: Deactivated successfully. 
Jan 13 20:46:26.602674 containerd[1547]: time="2025-01-13T20:46:26.602444207Z" level=error msg="Failed to destroy network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.603002 containerd[1547]: time="2025-01-13T20:46:26.602845019Z" level=error msg="Failed to destroy network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.605371 containerd[1547]: time="2025-01-13T20:46:26.603519131Z" level=error msg="encountered an error cleaning up failed sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.605371 containerd[1547]: time="2025-01-13T20:46:26.603557542Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.605454 kubelet[2851]: E0113 20:46:26.604520 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.605454 kubelet[2851]: E0113 20:46:26.604569 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:26.605454 kubelet[2851]: E0113 20:46:26.604582 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:26.605644 kubelet[2851]: E0113 20:46:26.604614 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" podUID="0016e978-2a6d-4be0-ad22-af8c555426bb" Jan 13 20:46:26.606125 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3-shm.mount: Deactivated successfully. Jan 13 20:46:26.607326 containerd[1547]: time="2025-01-13T20:46:26.606452850Z" level=error msg="encountered an error cleaning up failed sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.607326 containerd[1547]: time="2025-01-13T20:46:26.606502371Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.606184 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9-shm.mount: Deactivated successfully. 
Jan 13 20:46:26.607999 kubelet[2851]: E0113 20:46:26.607788 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.607999 kubelet[2851]: E0113 20:46:26.607839 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:26.607999 kubelet[2851]: E0113 20:46:26.607851 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:26.608521 kubelet[2851]: E0113 20:46:26.607894 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\\\": plugin type=\\\"calico\\\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-nx8p5" podUID="6ab74c68-2020-4618-bcbe-672227cc6fc9" Jan 13 20:46:26.610676 containerd[1547]: time="2025-01-13T20:46:26.610653024Z" level=error msg="Failed to destroy network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.610867 containerd[1547]: time="2025-01-13T20:46:26.610844094Z" level=error msg="encountered an error cleaning up failed sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.610899 containerd[1547]: time="2025-01-13T20:46:26.610882391Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.611058 kubelet[2851]: E0113 20:46:26.611024 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:26.611172 kubelet[2851]: E0113 20:46:26.611105 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:26.611172 kubelet[2851]: E0113 20:46:26.611122 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:26.611172 kubelet[2851]: E0113 20:46:26.611150 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" podUID="6953b35c-8169-44aa-91cb-7dd3f8f9aade" Jan 13 20:46:26.612984 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7-shm.mount: Deactivated successfully. Jan 13 20:46:27.451221 kubelet[2851]: I0113 20:46:27.451195 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9" Jan 13 20:46:27.452011 containerd[1547]: time="2025-01-13T20:46:27.451907857Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:27.452160 containerd[1547]: time="2025-01-13T20:46:27.452079193Z" level=info msg="Ensure that sandbox 18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9 in task-service has been cleanup successfully" Jan 13 20:46:27.453822 containerd[1547]: time="2025-01-13T20:46:27.452301025Z" level=info msg="TearDown network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" successfully" Jan 13 20:46:27.453822 containerd[1547]: time="2025-01-13T20:46:27.452313017Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" returns successfully" Jan 13 20:46:27.453939 systemd[1]: run-netns-cni\x2d2e774cb6\x2dff56\x2dc057\x2d453b\x2d03cbf5b4cc4c.mount: Deactivated successfully. 
Jan 13 20:46:27.455007 kubelet[2851]: I0113 20:46:27.454277 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3" Jan 13 20:46:27.455039 containerd[1547]: time="2025-01-13T20:46:27.454020663Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:27.455039 containerd[1547]: time="2025-01-13T20:46:27.455012331Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:27.455039 containerd[1547]: time="2025-01-13T20:46:27.455019056Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:27.455198 containerd[1547]: time="2025-01-13T20:46:27.455086701Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:27.455198 containerd[1547]: time="2025-01-13T20:46:27.455180843Z" level=info msg="Ensure that sandbox a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3 in task-service has been cleanup successfully" Jan 13 20:46:27.456211 containerd[1547]: time="2025-01-13T20:46:27.455388515Z" level=info msg="TearDown network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" successfully" Jan 13 20:46:27.456211 containerd[1547]: time="2025-01-13T20:46:27.455403561Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" returns successfully" Jan 13 20:46:27.456211 containerd[1547]: time="2025-01-13T20:46:27.455473275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:2,}" Jan 13 20:46:27.456876 containerd[1547]: time="2025-01-13T20:46:27.456718465Z" level=info msg="StopPodSandbox for 
\"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:27.456876 containerd[1547]: time="2025-01-13T20:46:27.456819762Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:27.456876 containerd[1547]: time="2025-01-13T20:46:27.456827509Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:27.457238 systemd[1]: run-netns-cni\x2da224fdd5\x2d4afe\x2d11c9\x2d0502\x2d51c3e9f1be0b.mount: Deactivated successfully. Jan 13 20:46:27.457491 containerd[1547]: time="2025-01-13T20:46:27.457290495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:2,}" Jan 13 20:46:27.458709 kubelet[2851]: I0113 20:46:27.458698 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3" Jan 13 20:46:27.460337 kubelet[2851]: I0113 20:46:27.460301 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3" Jan 13 20:46:27.462313 containerd[1547]: time="2025-01-13T20:46:27.462203522Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:27.463222 containerd[1547]: time="2025-01-13T20:46:27.463147700Z" level=info msg="Ensure that sandbox 648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3 in task-service has been cleanup successfully" Jan 13 20:46:27.463397 containerd[1547]: time="2025-01-13T20:46:27.463385658Z" level=info msg="TearDown network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" successfully" Jan 13 20:46:27.463633 containerd[1547]: time="2025-01-13T20:46:27.463440240Z" 
level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" returns successfully" Jan 13 20:46:27.464076 containerd[1547]: time="2025-01-13T20:46:27.463740335Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:27.464076 containerd[1547]: time="2025-01-13T20:46:27.463829851Z" level=info msg="Ensure that sandbox 44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3 in task-service has been cleanup successfully" Jan 13 20:46:27.464378 containerd[1547]: time="2025-01-13T20:46:27.464208788Z" level=info msg="TearDown network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" successfully" Jan 13 20:46:27.464450 containerd[1547]: time="2025-01-13T20:46:27.464417482Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" returns successfully" Jan 13 20:46:27.465056 kubelet[2851]: I0113 20:46:27.464605 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7" Jan 13 20:46:27.466038 containerd[1547]: time="2025-01-13T20:46:27.465874584Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:27.466038 containerd[1547]: time="2025-01-13T20:46:27.465968347Z" level=info msg="Ensure that sandbox 3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7 in task-service has been cleanup successfully" Jan 13 20:46:27.466144 containerd[1547]: time="2025-01-13T20:46:27.466133973Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:27.466306 containerd[1547]: time="2025-01-13T20:46:27.466199513Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:27.466591 containerd[1547]: 
time="2025-01-13T20:46:27.466576713Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:27.466591 containerd[1547]: time="2025-01-13T20:46:27.466588034Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:27.466730 containerd[1547]: time="2025-01-13T20:46:27.466666190Z" level=info msg="TearDown network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:27.466730 containerd[1547]: time="2025-01-13T20:46:27.466674704Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:27.466730 containerd[1547]: time="2025-01-13T20:46:27.466437599Z" level=info msg="TearDown network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" successfully" Jan 13 20:46:27.466730 containerd[1547]: time="2025-01-13T20:46:27.466700788Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" returns successfully" Jan 13 20:46:27.467284 containerd[1547]: time="2025-01-13T20:46:27.467019744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:2,}" Jan 13 20:46:27.467284 containerd[1547]: time="2025-01-13T20:46:27.467149214Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:27.467284 containerd[1547]: time="2025-01-13T20:46:27.467187163Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:27.467284 containerd[1547]: time="2025-01-13T20:46:27.467192796Z" level=info msg="StopPodSandbox for 
\"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:27.467942 containerd[1547]: time="2025-01-13T20:46:27.467931081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:2,}" Jan 13 20:46:27.468632 kubelet[2851]: I0113 20:46:27.468618 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920" Jan 13 20:46:27.470806 containerd[1547]: time="2025-01-13T20:46:27.470791886Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:27.471356 containerd[1547]: time="2025-01-13T20:46:27.471330273Z" level=info msg="Ensure that sandbox a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920 in task-service has been cleanup successfully" Jan 13 20:46:27.471876 containerd[1547]: time="2025-01-13T20:46:27.471140169Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:46:27.471946 containerd[1547]: time="2025-01-13T20:46:27.471937087Z" level=info msg="TearDown network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" successfully" Jan 13 20:46:27.472009 containerd[1547]: time="2025-01-13T20:46:27.471972819Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" returns successfully" Jan 13 20:46:27.478360 containerd[1547]: time="2025-01-13T20:46:27.478316060Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:27.478464 containerd[1547]: time="2025-01-13T20:46:27.478424313Z" level=info msg="TearDown network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" 
successfully" Jan 13 20:46:27.478464 containerd[1547]: time="2025-01-13T20:46:27.478434396Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:27.479041 containerd[1547]: time="2025-01-13T20:46:27.479027118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:46:27.555572 containerd[1547]: time="2025-01-13T20:46:27.555546089Z" level=error msg="Failed to destroy network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.555995 containerd[1547]: time="2025-01-13T20:46:27.555818360Z" level=error msg="encountered an error cleaning up failed sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.555995 containerd[1547]: time="2025-01-13T20:46:27.555854710Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.562904 kubelet[2851]: E0113 20:46:27.562888 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.562970 kubelet[2851]: E0113 20:46:27.562921 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:27.562970 kubelet[2851]: E0113 20:46:27.562935 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:27.563009 kubelet[2851]: E0113 20:46:27.562984 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-76f75df574-nx8p5" podUID="6ab74c68-2020-4618-bcbe-672227cc6fc9" Jan 13 20:46:27.564301 containerd[1547]: time="2025-01-13T20:46:27.564156947Z" level=error msg="Failed to destroy network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.566518 containerd[1547]: time="2025-01-13T20:46:27.566466934Z" level=error msg="encountered an error cleaning up failed sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.566613 containerd[1547]: time="2025-01-13T20:46:27.566549928Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.566846 kubelet[2851]: E0113 20:46:27.566779 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.567089 kubelet[2851]: E0113 20:46:27.566917 
2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:27.567089 kubelet[2851]: E0113 20:46:27.566940 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:27.567089 kubelet[2851]: E0113 20:46:27.566976 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" podUID="0016e978-2a6d-4be0-ad22-af8c555426bb" Jan 13 20:46:27.583900 containerd[1547]: time="2025-01-13T20:46:27.583860866Z" level=error msg="Failed to destroy network for sandbox 
\"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.584309 containerd[1547]: time="2025-01-13T20:46:27.584281339Z" level=error msg="encountered an error cleaning up failed sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.585330 containerd[1547]: time="2025-01-13T20:46:27.584380149Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.588103 kubelet[2851]: E0113 20:46:27.587207 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.588103 kubelet[2851]: E0113 20:46:27.587240 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:27.588103 kubelet[2851]: E0113 20:46:27.587255 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:27.588187 kubelet[2851]: E0113 20:46:27.587293 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-75b2t" podUID="45d20c01-0698-463e-b647-27ec83c8d824" Jan 13 20:46:27.591463 systemd[1]: run-netns-cni\x2dd0ea8ed4\x2d25d5\x2d4015\x2db780\x2d4c2e02ce76f3.mount: Deactivated successfully. Jan 13 20:46:27.591689 systemd[1]: run-netns-cni\x2d21ff42b8\x2de36d\x2dee2b\x2d271b\x2dd50da1d94db7.mount: Deactivated successfully. Jan 13 20:46:27.591825 systemd[1]: run-netns-cni\x2d2097d804\x2da8a9\x2d74fe\x2df19d\x2ded3a22308138.mount: Deactivated successfully. 
Jan 13 20:46:27.591950 systemd[1]: run-netns-cni\x2daf27c172\x2dd36b\x2d99da\x2d2648\x2d622a60e11e2f.mount: Deactivated successfully. Jan 13 20:46:27.607351 containerd[1547]: time="2025-01-13T20:46:27.604987226Z" level=error msg="Failed to destroy network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.610164 containerd[1547]: time="2025-01-13T20:46:27.608587006Z" level=error msg="encountered an error cleaning up failed sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.610164 containerd[1547]: time="2025-01-13T20:46:27.608668472Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.610238 kubelet[2851]: E0113 20:46:27.608844 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 
20:46:27.610238 kubelet[2851]: E0113 20:46:27.609022 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:27.610238 kubelet[2851]: E0113 20:46:27.609035 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:27.608648 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5-shm.mount: Deactivated successfully. 
Jan 13 20:46:27.610560 kubelet[2851]: E0113 20:46:27.609072 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" podUID="662f5a4d-917d-45ec-97a1-d70b9c8e2f05" Jan 13 20:46:27.614710 containerd[1547]: time="2025-01-13T20:46:27.612780510Z" level=error msg="Failed to destroy network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.614710 containerd[1547]: time="2025-01-13T20:46:27.613282557Z" level=error msg="encountered an error cleaning up failed sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.614710 containerd[1547]: time="2025-01-13T20:46:27.613316453Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox 
\"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.614796 kubelet[2851]: E0113 20:46:27.614300 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.614796 kubelet[2851]: E0113 20:46:27.614329 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:27.614796 kubelet[2851]: E0113 20:46:27.614358 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:27.614249 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d-shm.mount: Deactivated successfully. 
Jan 13 20:46:27.614890 kubelet[2851]: E0113 20:46:27.614395 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" Jan 13 20:46:27.623441 containerd[1547]: time="2025-01-13T20:46:27.623410709Z" level=error msg="Failed to destroy network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.623629 containerd[1547]: time="2025-01-13T20:46:27.623614247Z" level=error msg="encountered an error cleaning up failed sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.623665 containerd[1547]: time="2025-01-13T20:46:27.623651683Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox 
\"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.624740 kubelet[2851]: E0113 20:46:27.623780 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:27.624740 kubelet[2851]: E0113 20:46:27.623815 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:27.624740 kubelet[2851]: E0113 20:46:27.623832 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:27.624849 kubelet[2851]: E0113 20:46:27.623869 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" podUID="6953b35c-8169-44aa-91cb-7dd3f8f9aade" Jan 13 20:46:27.626044 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444-shm.mount: Deactivated successfully. Jan 13 20:46:28.491753 kubelet[2851]: I0113 20:46:28.491674 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d" Jan 13 20:46:28.492730 containerd[1547]: time="2025-01-13T20:46:28.492691926Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" Jan 13 20:46:28.492860 containerd[1547]: time="2025-01-13T20:46:28.492844270Z" level=info msg="Ensure that sandbox 5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d in task-service has been cleanup successfully" Jan 13 20:46:28.495018 containerd[1547]: time="2025-01-13T20:46:28.494690576Z" level=info msg="TearDown network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" successfully" Jan 13 20:46:28.495018 containerd[1547]: time="2025-01-13T20:46:28.494703066Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" returns successfully" Jan 13 20:46:28.494840 systemd[1]: run-netns-cni\x2d0b22e21b\x2d8b17\x2d40ff\x2dc371\x2dc80c7626ce10.mount: Deactivated successfully. 
Jan 13 20:46:28.495694 containerd[1547]: time="2025-01-13T20:46:28.495357901Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:28.495694 containerd[1547]: time="2025-01-13T20:46:28.495416455Z" level=info msg="TearDown network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" successfully" Jan 13 20:46:28.495694 containerd[1547]: time="2025-01-13T20:46:28.495426603Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" returns successfully" Jan 13 20:46:28.502308 kubelet[2851]: I0113 20:46:28.495454 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444" Jan 13 20:46:28.502308 kubelet[2851]: I0113 20:46:28.497187 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5" Jan 13 20:46:28.502308 kubelet[2851]: I0113 20:46:28.498518 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2" Jan 13 20:46:28.502308 kubelet[2851]: I0113 20:46:28.499883 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3" Jan 13 20:46:28.502308 kubelet[2851]: I0113 20:46:28.501511 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.495869626Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.495924889Z" level=info msg="TearDown network for sandbox 
\"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.495933363Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.495966340Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.496076779Z" level=info msg="Ensure that sandbox 3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444 in task-service has been cleanup successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.496321485Z" level=info msg="TearDown network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.496333949Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" returns successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.496439614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:3,}" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.496574882Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.496714935Z" level=info msg="TearDown network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.496725454Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" returns successfully" Jan 13 20:46:28.502442 containerd[1547]: 
time="2025-01-13T20:46:28.497083694Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497149390Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497157697Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497403238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497447129Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497553349Z" level=info msg="Ensure that sandbox 43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5 in task-service has been cleanup successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497725261Z" level=info msg="TearDown network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497735984Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" returns successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497905744Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497948462Z" level=info msg="TearDown network for sandbox 
\"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.497953963Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" returns successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.498081555Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.498113516Z" level=info msg="TearDown network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.498118991Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.498336567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.498711295Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.498873383Z" level=info msg="Ensure that sandbox 07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2 in task-service has been cleanup successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.498980338Z" level=info msg="TearDown network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.498988122Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" returns successfully" Jan 13 20:46:28.502442 
containerd[1547]: time="2025-01-13T20:46:28.499231463Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.499280815Z" level=info msg="TearDown network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.499287840Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" returns successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.499552137Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.499585805Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:28.502442 containerd[1547]: time="2025-01-13T20:46:28.499591228Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.499918466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:3,}" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.500162343Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.500254654Z" level=info msg="Ensure that sandbox f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3 in task-service has been cleanup successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.500874580Z" level=info msg="TearDown network for sandbox 
\"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.500882315Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" returns successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.500981992Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.501200904Z" level=info msg="TearDown network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.501208053Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" returns successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.501380811Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.501504533Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.501511989Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.501715408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:3,}" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.501841570Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.501931260Z" 
level=info msg="Ensure that sandbox 981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33 in task-service has been cleanup successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.502014625Z" level=info msg="TearDown network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.502022164Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" returns successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.502121947Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.502159399Z" level=info msg="TearDown network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.502165133Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" returns successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.502761955Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.502813811Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.502820173Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:28.503549 containerd[1547]: time="2025-01-13T20:46:28.503074105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:3,}" Jan 13 
20:46:28.592191 systemd[1]: run-netns-cni\x2db3b4d96d\x2d4eae\x2dcbd4\x2d6d5b\x2defb5a72ff25d.mount: Deactivated successfully. Jan 13 20:46:28.592246 systemd[1]: run-netns-cni\x2dae76de84\x2db853\x2d6065\x2dd417\x2d165b9744f14e.mount: Deactivated successfully. Jan 13 20:46:28.592278 systemd[1]: run-netns-cni\x2dc9c10306\x2dda7f\x2d7735\x2d7c38\x2db9fe47a9b03b.mount: Deactivated successfully. Jan 13 20:46:28.592309 systemd[1]: run-netns-cni\x2dbdeadd6d\x2deac9\x2d6e3f\x2d183b\x2d174b58da7379.mount: Deactivated successfully. Jan 13 20:46:28.592345 systemd[1]: run-netns-cni\x2d5b98113b\x2de915\x2de5c3\x2db6ab\x2dacc8cc073069.mount: Deactivated successfully. Jan 13 20:46:28.665277 containerd[1547]: time="2025-01-13T20:46:28.662820214Z" level=error msg="Failed to destroy network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.664736 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96-shm.mount: Deactivated successfully. 
Jan 13 20:46:28.665923 containerd[1547]: time="2025-01-13T20:46:28.665793814Z" level=error msg="encountered an error cleaning up failed sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.665923 containerd[1547]: time="2025-01-13T20:46:28.665829732Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.666125 kubelet[2851]: E0113 20:46:28.666108 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.666251 kubelet[2851]: E0113 20:46:28.666146 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:28.666251 kubelet[2851]: E0113 20:46:28.666162 2851 
kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:28.666251 kubelet[2851]: E0113 20:46:28.666195 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" Jan 13 20:46:28.680169 containerd[1547]: time="2025-01-13T20:46:28.680142948Z" level=error msg="Failed to destroy network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.680674 containerd[1547]: time="2025-01-13T20:46:28.680563229Z" level=error msg="encountered an error cleaning up failed sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.680778 containerd[1547]: time="2025-01-13T20:46:28.680766220Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.681527 kubelet[2851]: E0113 20:46:28.681512 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.681576 kubelet[2851]: E0113 20:46:28.681542 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:28.681576 kubelet[2851]: E0113 20:46:28.681555 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:28.681619 kubelet[2851]: E0113 20:46:28.681587 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-nx8p5" podUID="6ab74c68-2020-4618-bcbe-672227cc6fc9" Jan 13 20:46:28.682593 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a-shm.mount: Deactivated successfully. 
Jan 13 20:46:28.683361 containerd[1547]: time="2025-01-13T20:46:28.683020815Z" level=error msg="Failed to destroy network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.685365 containerd[1547]: time="2025-01-13T20:46:28.685083608Z" level=error msg="encountered an error cleaning up failed sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.685365 containerd[1547]: time="2025-01-13T20:46:28.685123445Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.685884 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8-shm.mount: Deactivated successfully. 
Jan 13 20:46:28.686053 kubelet[2851]: E0113 20:46:28.686040 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.686089 kubelet[2851]: E0113 20:46:28.686068 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:28.686089 kubelet[2851]: E0113 20:46:28.686082 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:28.686140 kubelet[2851]: E0113 20:46:28.686108 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" podUID="6953b35c-8169-44aa-91cb-7dd3f8f9aade" Jan 13 20:46:28.690297 containerd[1547]: time="2025-01-13T20:46:28.690246244Z" level=error msg="Failed to destroy network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.690808 containerd[1547]: time="2025-01-13T20:46:28.690465558Z" level=error msg="encountered an error cleaning up failed sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.690808 containerd[1547]: time="2025-01-13T20:46:28.690499288Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.692433 kubelet[2851]: E0113 20:46:28.691495 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.692433 kubelet[2851]: E0113 20:46:28.691521 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:28.692433 kubelet[2851]: E0113 20:46:28.691533 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:28.692122 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117-shm.mount: Deactivated successfully. 
Jan 13 20:46:28.692588 kubelet[2851]: E0113 20:46:28.691559 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" podUID="662f5a4d-917d-45ec-97a1-d70b9c8e2f05" Jan 13 20:46:28.704091 containerd[1547]: time="2025-01-13T20:46:28.704031822Z" level=error msg="Failed to destroy network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.704950 containerd[1547]: time="2025-01-13T20:46:28.704859797Z" level=error msg="encountered an error cleaning up failed sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.704950 containerd[1547]: time="2025-01-13T20:46:28.704901960Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox 
\"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.705306 kubelet[2851]: E0113 20:46:28.705129 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.705306 kubelet[2851]: E0113 20:46:28.705158 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:28.705306 kubelet[2851]: E0113 20:46:28.705171 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:28.705389 kubelet[2851]: E0113 20:46:28.705202 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" podUID="0016e978-2a6d-4be0-ad22-af8c555426bb" Jan 13 20:46:28.707358 containerd[1547]: time="2025-01-13T20:46:28.707083969Z" level=error msg="Failed to destroy network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.707358 containerd[1547]: time="2025-01-13T20:46:28.707229174Z" level=error msg="encountered an error cleaning up failed sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.707358 containerd[1547]: time="2025-01-13T20:46:28.707254679Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.707806 kubelet[2851]: E0113 
20:46:28.707525 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:28.707806 kubelet[2851]: E0113 20:46:28.707545 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:28.707806 kubelet[2851]: E0113 20:46:28.707556 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:28.707873 kubelet[2851]: E0113 20:46:28.707599 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-75b2t" podUID="45d20c01-0698-463e-b647-27ec83c8d824" Jan 13 20:46:29.513002 kubelet[2851]: I0113 20:46:29.512987 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8" Jan 13 20:46:29.515014 containerd[1547]: time="2025-01-13T20:46:29.514849436Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\"" Jan 13 20:46:29.515014 containerd[1547]: time="2025-01-13T20:46:29.514962828Z" level=info msg="Ensure that sandbox 989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8 in task-service has been cleanup successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.515083978Z" level=info msg="TearDown network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.515092615Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" returns successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.515375635Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.515408023Z" level=info msg="TearDown network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.515413287Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" returns successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.515897199Z" level=info msg="StopPodSandbox for 
\"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.515933554Z" level=info msg="TearDown network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.515939341Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" returns successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.516658840Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\"" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.516742448Z" level=info msg="Ensure that sandbox 46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a in task-service has been cleanup successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.517227548Z" level=info msg="TearDown network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.517236306Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" returns successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.517265562Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.517299421Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.517304305Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.518221663Z" level=info 
msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.518253840Z" level=info msg="TearDown network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.518259344Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" returns successfully" Jan 13 20:46:29.519503 containerd[1547]: time="2025-01-13T20:46:29.518297918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:46:29.526793 kubelet[2851]: I0113 20:46:29.516027 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a" Jan 13 20:46:29.528699 containerd[1547]: time="2025-01-13T20:46:29.528460201Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:29.528699 containerd[1547]: time="2025-01-13T20:46:29.528547894Z" level=info msg="TearDown network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" successfully" Jan 13 20:46:29.528699 containerd[1547]: time="2025-01-13T20:46:29.528555868Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" returns successfully" Jan 13 20:46:29.535577 containerd[1547]: time="2025-01-13T20:46:29.529946694Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:29.535577 containerd[1547]: time="2025-01-13T20:46:29.529981712Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:29.535577 
containerd[1547]: time="2025-01-13T20:46:29.529987288Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:29.535577 containerd[1547]: time="2025-01-13T20:46:29.530253222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:4,}" Jan 13 20:46:29.535647 kubelet[2851]: I0113 20:46:29.531827 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117" Jan 13 20:46:29.535647 kubelet[2851]: I0113 20:46:29.533904 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610" Jan 13 20:46:29.536699 kubelet[2851]: I0113 20:46:29.536283 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51" Jan 13 20:46:29.544786 kubelet[2851]: I0113 20:46:29.544661 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96" Jan 13 20:46:29.552104 containerd[1547]: time="2025-01-13T20:46:29.552003792Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\"" Jan 13 20:46:29.573316 containerd[1547]: time="2025-01-13T20:46:29.573289384Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\"" Jan 13 20:46:29.590033 containerd[1547]: time="2025-01-13T20:46:29.588135229Z" level=info msg="Ensure that sandbox ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117 in task-service has been cleanup successfully" Jan 13 20:46:29.588161 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610-shm.mount: Deactivated successfully. Jan 13 20:46:29.588242 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51-shm.mount: Deactivated successfully. Jan 13 20:46:29.588296 systemd[1]: run-netns-cni\x2d9b35e516\x2d2c09\x2d4c14\x2d111f\x2dd2b6b103988f.mount: Deactivated successfully. Jan 13 20:46:29.588364 systemd[1]: run-netns-cni\x2d58d34230\x2dfb1e\x2d8cce\x2d2415\x2deb0439b77b97.mount: Deactivated successfully. Jan 13 20:46:29.591302 containerd[1547]: time="2025-01-13T20:46:29.591183290Z" level=info msg="TearDown network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" successfully" Jan 13 20:46:29.591302 containerd[1547]: time="2025-01-13T20:46:29.591200912Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" returns successfully" Jan 13 20:46:29.591673 systemd[1]: run-netns-cni\x2d3909c957\x2dc8cc\x2dbf6a\x2de434\x2d15dfb45f57f8.mount: Deactivated successfully. 
Jan 13 20:46:29.592603 containerd[1547]: time="2025-01-13T20:46:29.592580358Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\"" Jan 13 20:46:29.599761 containerd[1547]: time="2025-01-13T20:46:29.599740370Z" level=info msg="Ensure that sandbox baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51 in task-service has been cleanup successfully" Jan 13 20:46:29.601573 containerd[1547]: time="2025-01-13T20:46:29.601425683Z" level=info msg="TearDown network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" successfully" Jan 13 20:46:29.601573 containerd[1547]: time="2025-01-13T20:46:29.601440740Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" returns successfully" Jan 13 20:46:29.601573 containerd[1547]: time="2025-01-13T20:46:29.601503053Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\"" Jan 13 20:46:29.601593 systemd[1]: run-netns-cni\x2d1b74fee4\x2d369d\x2d09d9\x2dc209\x2d5f07e27841e2.mount: Deactivated successfully. Jan 13 20:46:29.612676 containerd[1547]: time="2025-01-13T20:46:29.611047018Z" level=info msg="Ensure that sandbox 94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96 in task-service has been cleanup successfully" Jan 13 20:46:29.612736 systemd[1]: run-netns-cni\x2d6291d0e2\x2d2643\x2d57df\x2d75d8\x2d051f43e24a5f.mount: Deactivated successfully. 
Jan 13 20:46:29.613240 containerd[1547]: time="2025-01-13T20:46:29.612831504Z" level=info msg="TearDown network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" successfully" Jan 13 20:46:29.613240 containerd[1547]: time="2025-01-13T20:46:29.612845888Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" returns successfully" Jan 13 20:46:29.613240 containerd[1547]: time="2025-01-13T20:46:29.613162618Z" level=info msg="Ensure that sandbox 16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610 in task-service has been cleanup successfully" Jan 13 20:46:29.614259 containerd[1547]: time="2025-01-13T20:46:29.614245137Z" level=info msg="TearDown network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" successfully" Jan 13 20:46:29.614329 containerd[1547]: time="2025-01-13T20:46:29.614315690Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" returns successfully" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614544006Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614598080Z" level=info msg="TearDown network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" successfully" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614605555Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" returns successfully" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614645856Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614688238Z" level=info msg="TearDown network for sandbox 
\"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" successfully" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614694925Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" returns successfully" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614750050Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614790677Z" level=info msg="TearDown network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" successfully" Jan 13 20:46:29.614842 containerd[1547]: time="2025-01-13T20:46:29.614797854Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" returns successfully" Jan 13 20:46:29.615938 systemd[1]: run-netns-cni\x2d3cd30ca3\x2d5fdc\x2da5d0\x2d4e0c\x2d8a8a1f65d28e.mount: Deactivated successfully. 
Jan 13 20:46:29.617706 containerd[1547]: time="2025-01-13T20:46:29.617654842Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:29.617771 containerd[1547]: time="2025-01-13T20:46:29.617712271Z" level=info msg="TearDown network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" successfully" Jan 13 20:46:29.617771 containerd[1547]: time="2025-01-13T20:46:29.617720766Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" returns successfully" Jan 13 20:46:29.617771 containerd[1547]: time="2025-01-13T20:46:29.617762898Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" Jan 13 20:46:29.617867 containerd[1547]: time="2025-01-13T20:46:29.617805262Z" level=info msg="TearDown network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" successfully" Jan 13 20:46:29.617867 containerd[1547]: time="2025-01-13T20:46:29.617812223Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" returns successfully" Jan 13 20:46:29.617867 containerd[1547]: time="2025-01-13T20:46:29.617839141Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.617877105Z" level=info msg="TearDown network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" successfully" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.634843646Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" returns successfully" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.633790652Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:29.635023 
containerd[1547]: time="2025-01-13T20:46:29.634906232Z" level=info msg="TearDown network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" successfully" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.634912155Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" returns successfully" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.633987595Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.634951299Z" level=info msg="TearDown network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" successfully" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.634956186Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.634459183Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.634993269Z" level=info msg="TearDown network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" successfully" Jan 13 20:46:29.635023 containerd[1547]: time="2025-01-13T20:46:29.634997748Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" returns successfully" Jan 13 20:46:29.635501 containerd[1547]: time="2025-01-13T20:46:29.635489010Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:29.635538 containerd[1547]: time="2025-01-13T20:46:29.635528663Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:29.635538 
containerd[1547]: time="2025-01-13T20:46:29.635536320Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:29.635569 containerd[1547]: time="2025-01-13T20:46:29.635559751Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:29.635597 containerd[1547]: time="2025-01-13T20:46:29.635588249Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:29.635597 containerd[1547]: time="2025-01-13T20:46:29.635594691Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:29.635634 containerd[1547]: time="2025-01-13T20:46:29.635613963Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:29.635653 containerd[1547]: time="2025-01-13T20:46:29.635641885Z" level=info msg="TearDown network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:29.635653 containerd[1547]: time="2025-01-13T20:46:29.635646741Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:29.635778 containerd[1547]: time="2025-01-13T20:46:29.635706323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:46:29.636035 containerd[1547]: time="2025-01-13T20:46:29.635985505Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:4,}" Jan 13 20:46:29.636659 containerd[1547]: time="2025-01-13T20:46:29.636117342Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:4,}" Jan 13 20:46:29.636816 containerd[1547]: time="2025-01-13T20:46:29.636129630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:4,}" Jan 13 20:46:30.160137 containerd[1547]: time="2025-01-13T20:46:30.160020738Z" level=error msg="Failed to destroy network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.160407 containerd[1547]: time="2025-01-13T20:46:30.160357626Z" level=error msg="encountered an error cleaning up failed sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.160407 containerd[1547]: time="2025-01-13T20:46:30.160394961Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.160775 kubelet[2851]: E0113 20:46:30.160584 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.160775 kubelet[2851]: E0113 20:46:30.160624 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:30.160775 kubelet[2851]: E0113 20:46:30.160637 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:30.160971 kubelet[2851]: E0113 20:46:30.160676 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" podUID="0016e978-2a6d-4be0-ad22-af8c555426bb" Jan 13 20:46:30.163205 containerd[1547]: time="2025-01-13T20:46:30.163089599Z" level=error msg="Failed to destroy network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.163329 containerd[1547]: time="2025-01-13T20:46:30.163278788Z" level=error msg="Failed to destroy network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.163528 containerd[1547]: time="2025-01-13T20:46:30.163515329Z" level=error msg="encountered an error cleaning up failed sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.163826 containerd[1547]: time="2025-01-13T20:46:30.163584308Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.163826 containerd[1547]: time="2025-01-13T20:46:30.163687462Z" level=error 
msg="encountered an error cleaning up failed sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.163826 containerd[1547]: time="2025-01-13T20:46:30.163714531Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.163894 kubelet[2851]: E0113 20:46:30.163676 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.163894 kubelet[2851]: E0113 20:46:30.163699 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:30.163894 kubelet[2851]: E0113 20:46:30.163713 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:30.163954 kubelet[2851]: E0113 20:46:30.163743 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-nx8p5" podUID="6ab74c68-2020-4618-bcbe-672227cc6fc9" Jan 13 20:46:30.164380 kubelet[2851]: E0113 20:46:30.164128 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.164380 kubelet[2851]: E0113 20:46:30.164150 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:30.164380 kubelet[2851]: E0113 20:46:30.164161 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:30.164867 kubelet[2851]: E0113 20:46:30.164189 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-75b2t" podUID="45d20c01-0698-463e-b647-27ec83c8d824" Jan 13 20:46:30.166065 containerd[1547]: time="2025-01-13T20:46:30.165486715Z" level=error msg="Failed to destroy network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.166563 containerd[1547]: time="2025-01-13T20:46:30.166475553Z" level=error msg="encountered an error cleaning up failed sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.166563 containerd[1547]: time="2025-01-13T20:46:30.166523747Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.166837 kubelet[2851]: E0113 20:46:30.166754 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.166837 kubelet[2851]: E0113 20:46:30.166776 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:30.166837 kubelet[2851]: E0113 20:46:30.166788 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:30.166903 kubelet[2851]: E0113 20:46:30.166813 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" podUID="6953b35c-8169-44aa-91cb-7dd3f8f9aade" Jan 13 20:46:30.170253 containerd[1547]: time="2025-01-13T20:46:30.169545246Z" level=error msg="Failed to destroy network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.170253 containerd[1547]: time="2025-01-13T20:46:30.169726050Z" level=error msg="encountered an error cleaning up failed sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.170253 containerd[1547]: time="2025-01-13T20:46:30.169754579Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.170651 kubelet[2851]: E0113 20:46:30.170419 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.170651 kubelet[2851]: E0113 20:46:30.170441 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:30.170651 kubelet[2851]: E0113 20:46:30.170456 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:30.170737 kubelet[2851]: E0113 20:46:30.170483 2851 
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" podUID="662f5a4d-917d-45ec-97a1-d70b9c8e2f05" Jan 13 20:46:30.173864 containerd[1547]: time="2025-01-13T20:46:30.173838618Z" level=error msg="Failed to destroy network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.174404 containerd[1547]: time="2025-01-13T20:46:30.174387456Z" level=error msg="encountered an error cleaning up failed sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.174529 containerd[1547]: time="2025-01-13T20:46:30.174512324Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.174813 kubelet[2851]: E0113 20:46:30.174617 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.174813 kubelet[2851]: E0113 20:46:30.174651 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:30.174813 kubelet[2851]: E0113 20:46:30.174666 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:30.174885 kubelet[2851]: E0113 20:46:30.174696 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\\\": rpc error: code = Unknown desc 
= failed to setup network for sandbox \\\"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" Jan 13 20:46:30.548304 kubelet[2851]: I0113 20:46:30.548287 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45" Jan 13 20:46:30.549904 containerd[1547]: time="2025-01-13T20:46:30.549216590Z" level=info msg="StopPodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\"" Jan 13 20:46:30.549904 containerd[1547]: time="2025-01-13T20:46:30.549762824Z" level=info msg="Ensure that sandbox 80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45 in task-service has been cleanup successfully" Jan 13 20:46:30.550641 containerd[1547]: time="2025-01-13T20:46:30.550619826Z" level=info msg="TearDown network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" successfully" Jan 13 20:46:30.550641 containerd[1547]: time="2025-01-13T20:46:30.550629959Z" level=info msg="StopPodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" returns successfully" Jan 13 20:46:30.551306 containerd[1547]: time="2025-01-13T20:46:30.551269857Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\"" Jan 13 20:46:30.551693 containerd[1547]: time="2025-01-13T20:46:30.551314845Z" level=info msg="TearDown network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" successfully" Jan 13 20:46:30.551693 containerd[1547]: time="2025-01-13T20:46:30.551322728Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" 
returns successfully" Jan 13 20:46:30.552596 containerd[1547]: time="2025-01-13T20:46:30.552286668Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" Jan 13 20:46:30.552789 containerd[1547]: time="2025-01-13T20:46:30.552762554Z" level=info msg="TearDown network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" successfully" Jan 13 20:46:30.552789 containerd[1547]: time="2025-01-13T20:46:30.552772165Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" returns successfully" Jan 13 20:46:30.553345 containerd[1547]: time="2025-01-13T20:46:30.553250112Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:30.553401 containerd[1547]: time="2025-01-13T20:46:30.553384216Z" level=info msg="TearDown network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" successfully" Jan 13 20:46:30.553401 containerd[1547]: time="2025-01-13T20:46:30.553395665Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" returns successfully" Jan 13 20:46:30.554616 containerd[1547]: time="2025-01-13T20:46:30.554451389Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:30.554616 containerd[1547]: time="2025-01-13T20:46:30.554493179Z" level=info msg="TearDown network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" successfully" Jan 13 20:46:30.554616 containerd[1547]: time="2025-01-13T20:46:30.554500101Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:30.554774 kubelet[2851]: I0113 20:46:30.554751 2851 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef" Jan 13 20:46:30.555797 containerd[1547]: time="2025-01-13T20:46:30.555750967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:46:30.556234 containerd[1547]: time="2025-01-13T20:46:30.556214157Z" level=info msg="StopPodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\"" Jan 13 20:46:30.556333 containerd[1547]: time="2025-01-13T20:46:30.556319402Z" level=info msg="Ensure that sandbox 71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef in task-service has been cleanup successfully" Jan 13 20:46:30.556854 containerd[1547]: time="2025-01-13T20:46:30.556575650Z" level=info msg="TearDown network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" successfully" Jan 13 20:46:30.556854 containerd[1547]: time="2025-01-13T20:46:30.556585481Z" level=info msg="StopPodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" returns successfully" Jan 13 20:46:30.557578 containerd[1547]: time="2025-01-13T20:46:30.557536210Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\"" Jan 13 20:46:30.558389 containerd[1547]: time="2025-01-13T20:46:30.557606168Z" level=info msg="TearDown network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" successfully" Jan 13 20:46:30.558389 containerd[1547]: time="2025-01-13T20:46:30.557612566Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" returns successfully" Jan 13 20:46:30.558389 containerd[1547]: time="2025-01-13T20:46:30.557886614Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" Jan 13 20:46:30.558389 containerd[1547]: 
time="2025-01-13T20:46:30.557948491Z" level=info msg="TearDown network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" successfully" Jan 13 20:46:30.558389 containerd[1547]: time="2025-01-13T20:46:30.557954410Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" returns successfully" Jan 13 20:46:30.558514 containerd[1547]: time="2025-01-13T20:46:30.558503117Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:30.558565 containerd[1547]: time="2025-01-13T20:46:30.558538349Z" level=info msg="TearDown network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" successfully" Jan 13 20:46:30.558565 containerd[1547]: time="2025-01-13T20:46:30.558562645Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" returns successfully" Jan 13 20:46:30.560015 containerd[1547]: time="2025-01-13T20:46:30.560000981Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:30.560047 containerd[1547]: time="2025-01-13T20:46:30.560041952Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:30.560067 containerd[1547]: time="2025-01-13T20:46:30.560047980Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:30.560716 containerd[1547]: time="2025-01-13T20:46:30.560695739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:5,}" Jan 13 20:46:30.562309 kubelet[2851]: I0113 20:46:30.562227 2851 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d" Jan 13 20:46:30.563515 containerd[1547]: time="2025-01-13T20:46:30.563324569Z" level=info msg="StopPodSandbox for \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\"" Jan 13 20:46:30.565968 containerd[1547]: time="2025-01-13T20:46:30.565955915Z" level=info msg="Ensure that sandbox 62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d in task-service has been cleanup successfully" Jan 13 20:46:30.566104 containerd[1547]: time="2025-01-13T20:46:30.566094533Z" level=info msg="TearDown network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" successfully" Jan 13 20:46:30.566143 containerd[1547]: time="2025-01-13T20:46:30.566136108Z" level=info msg="StopPodSandbox for \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" returns successfully" Jan 13 20:46:30.567586 containerd[1547]: time="2025-01-13T20:46:30.567546801Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\"" Jan 13 20:46:30.567624 containerd[1547]: time="2025-01-13T20:46:30.567591432Z" level=info msg="TearDown network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" successfully" Jan 13 20:46:30.567624 containerd[1547]: time="2025-01-13T20:46:30.567598796Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" returns successfully" Jan 13 20:46:30.578238 containerd[1547]: time="2025-01-13T20:46:30.578215828Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" Jan 13 20:46:30.578656 containerd[1547]: time="2025-01-13T20:46:30.578349521Z" level=info msg="TearDown network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" successfully" Jan 13 20:46:30.578691 containerd[1547]: time="2025-01-13T20:46:30.578653943Z" level=info msg="StopPodSandbox 
for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" returns successfully" Jan 13 20:46:30.582257 containerd[1547]: time="2025-01-13T20:46:30.582241768Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:30.582297 containerd[1547]: time="2025-01-13T20:46:30.582287166Z" level=info msg="TearDown network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" successfully" Jan 13 20:46:30.582297 containerd[1547]: time="2025-01-13T20:46:30.582294839Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" returns successfully" Jan 13 20:46:30.582942 containerd[1547]: time="2025-01-13T20:46:30.582927661Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:30.582972 containerd[1547]: time="2025-01-13T20:46:30.582967308Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:30.582992 containerd[1547]: time="2025-01-13T20:46:30.582973318Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:30.585289 containerd[1547]: time="2025-01-13T20:46:30.585018056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:5,}" Jan 13 20:46:30.588181 kubelet[2851]: I0113 20:46:30.587844 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd" Jan 13 20:46:30.588460 containerd[1547]: time="2025-01-13T20:46:30.588287512Z" level=info msg="StopPodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\"" Jan 13 20:46:30.588460 containerd[1547]: 
time="2025-01-13T20:46:30.588402035Z" level=info msg="Ensure that sandbox 0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd in task-service has been cleanup successfully" Jan 13 20:46:30.591084 containerd[1547]: time="2025-01-13T20:46:30.591069912Z" level=info msg="TearDown network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" successfully" Jan 13 20:46:30.591145 containerd[1547]: time="2025-01-13T20:46:30.591137659Z" level=info msg="StopPodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" returns successfully" Jan 13 20:46:30.591537 containerd[1547]: time="2025-01-13T20:46:30.591526982Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\"" Jan 13 20:46:30.591755 containerd[1547]: time="2025-01-13T20:46:30.591669796Z" level=info msg="TearDown network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" successfully" Jan 13 20:46:30.591755 containerd[1547]: time="2025-01-13T20:46:30.591678720Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" returns successfully" Jan 13 20:46:30.594215 containerd[1547]: time="2025-01-13T20:46:30.592102197Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" Jan 13 20:46:30.594215 containerd[1547]: time="2025-01-13T20:46:30.592141918Z" level=info msg="TearDown network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" successfully" Jan 13 20:46:30.594215 containerd[1547]: time="2025-01-13T20:46:30.592148278Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" returns successfully" Jan 13 20:46:30.593695 systemd[1]: run-netns-cni\x2da6900c2f\x2d2e74\x2db6b3\x2dff97\x2d3b87c9e59f47.mount: Deactivated successfully. 
Jan 13 20:46:30.595975 containerd[1547]: time="2025-01-13T20:46:30.595951425Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:30.596108 containerd[1547]: time="2025-01-13T20:46:30.596095913Z" level=info msg="TearDown network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" successfully" Jan 13 20:46:30.596108 containerd[1547]: time="2025-01-13T20:46:30.596105430Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" returns successfully" Jan 13 20:46:30.597531 containerd[1547]: time="2025-01-13T20:46:30.597420025Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:30.597531 containerd[1547]: time="2025-01-13T20:46:30.597461755Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:30.597531 containerd[1547]: time="2025-01-13T20:46:30.597468391Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:30.599491 kubelet[2851]: I0113 20:46:30.599480 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f" Jan 13 20:46:30.600307 containerd[1547]: time="2025-01-13T20:46:30.599533798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:5,}" Jan 13 20:46:30.601738 containerd[1547]: time="2025-01-13T20:46:30.601719958Z" level=info msg="StopPodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\"" Jan 13 20:46:30.601855 containerd[1547]: time="2025-01-13T20:46:30.601841934Z" level=info msg="Ensure that sandbox 
5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f in task-service has been cleanup successfully" Jan 13 20:46:30.603210 systemd[1]: run-netns-cni\x2d03a377b0\x2d79f6\x2d4f21\x2d0e49\x2dc5f115e2c60d.mount: Deactivated successfully. Jan 13 20:46:30.605038 containerd[1547]: time="2025-01-13T20:46:30.605020007Z" level=info msg="TearDown network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" successfully" Jan 13 20:46:30.605038 containerd[1547]: time="2025-01-13T20:46:30.605032249Z" level=info msg="StopPodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" returns successfully" Jan 13 20:46:30.605256 containerd[1547]: time="2025-01-13T20:46:30.605241656Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\"" Jan 13 20:46:30.610246 containerd[1547]: time="2025-01-13T20:46:30.610233259Z" level=info msg="TearDown network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" successfully" Jan 13 20:46:30.618946 containerd[1547]: time="2025-01-13T20:46:30.618917435Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" returns successfully" Jan 13 20:46:30.620980 containerd[1547]: time="2025-01-13T20:46:30.620622295Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" Jan 13 20:46:30.620980 containerd[1547]: time="2025-01-13T20:46:30.620689820Z" level=info msg="TearDown network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" successfully" Jan 13 20:46:30.620980 containerd[1547]: time="2025-01-13T20:46:30.620698476Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" returns successfully" Jan 13 20:46:30.626836 containerd[1547]: time="2025-01-13T20:46:30.626821322Z" level=info msg="StopPodSandbox for 
\"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:30.627061 containerd[1547]: time="2025-01-13T20:46:30.626938923Z" level=info msg="TearDown network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" successfully" Jan 13 20:46:30.627061 containerd[1547]: time="2025-01-13T20:46:30.626951260Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" returns successfully" Jan 13 20:46:30.627173 containerd[1547]: time="2025-01-13T20:46:30.627163164Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:30.627240 containerd[1547]: time="2025-01-13T20:46:30.627232912Z" level=info msg="TearDown network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:30.627285 containerd[1547]: time="2025-01-13T20:46:30.627273241Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:30.627573 containerd[1547]: time="2025-01-13T20:46:30.627563016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:5,}" Jan 13 20:46:30.628734 kubelet[2851]: I0113 20:46:30.628388 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb" Jan 13 20:46:30.639308 containerd[1547]: time="2025-01-13T20:46:30.639272313Z" level=info msg="StopPodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\"" Jan 13 20:46:30.642030 containerd[1547]: time="2025-01-13T20:46:30.642012732Z" level=info msg="Ensure that sandbox 64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb in task-service has been cleanup successfully" Jan 13 20:46:30.642243 containerd[1547]: 
time="2025-01-13T20:46:30.642227359Z" level=info msg="TearDown network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" successfully" Jan 13 20:46:30.642243 containerd[1547]: time="2025-01-13T20:46:30.642239208Z" level=info msg="StopPodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" returns successfully" Jan 13 20:46:30.658073 containerd[1547]: time="2025-01-13T20:46:30.658045653Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\"" Jan 13 20:46:30.658730 containerd[1547]: time="2025-01-13T20:46:30.658695382Z" level=info msg="TearDown network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" successfully" Jan 13 20:46:30.658730 containerd[1547]: time="2025-01-13T20:46:30.658727648Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" returns successfully" Jan 13 20:46:30.661694 containerd[1547]: time="2025-01-13T20:46:30.661578975Z" level=error msg="Failed to destroy network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.662471 containerd[1547]: time="2025-01-13T20:46:30.662458714Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" Jan 13 20:46:30.662556 containerd[1547]: time="2025-01-13T20:46:30.662533367Z" level=info msg="TearDown network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" successfully" Jan 13 20:46:30.662556 containerd[1547]: time="2025-01-13T20:46:30.662554851Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" returns successfully" Jan 13 20:46:30.662777 containerd[1547]: 
time="2025-01-13T20:46:30.662760617Z" level=error msg="encountered an error cleaning up failed sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.662808 containerd[1547]: time="2025-01-13T20:46:30.662793468Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.664496 kubelet[2851]: E0113 20:46:30.663864 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.664496 kubelet[2851]: E0113 20:46:30.663890 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:30.664496 kubelet[2851]: E0113 20:46:30.663904 2851 
kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:30.664610 kubelet[2851]: E0113 20:46:30.663938 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" podUID="662f5a4d-917d-45ec-97a1-d70b9c8e2f05" Jan 13 20:46:30.664798 containerd[1547]: time="2025-01-13T20:46:30.664787419Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:30.664899 containerd[1547]: time="2025-01-13T20:46:30.664890103Z" level=info msg="TearDown network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" successfully" Jan 13 20:46:30.664937 containerd[1547]: time="2025-01-13T20:46:30.664929982Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" returns successfully" Jan 13 20:46:30.665337 containerd[1547]: time="2025-01-13T20:46:30.665326220Z" level=info msg="StopPodSandbox for 
\"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:30.666977 containerd[1547]: time="2025-01-13T20:46:30.666965554Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:30.667078 containerd[1547]: time="2025-01-13T20:46:30.667068737Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:30.667553 containerd[1547]: time="2025-01-13T20:46:30.667539588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:46:30.703235 containerd[1547]: time="2025-01-13T20:46:30.703156956Z" level=error msg="Failed to destroy network for sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.703903 containerd[1547]: time="2025-01-13T20:46:30.703888642Z" level=error msg="encountered an error cleaning up failed sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.703998 containerd[1547]: time="2025-01-13T20:46:30.703986280Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.704219 kubelet[2851]: E0113 20:46:30.704208 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.704285 kubelet[2851]: E0113 20:46:30.704280 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:30.704334 kubelet[2851]: E0113 20:46:30.704329 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:30.704419 kubelet[2851]: E0113 20:46:30.704412 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-nx8p5" podUID="6ab74c68-2020-4618-bcbe-672227cc6fc9" Jan 13 20:46:30.724674 containerd[1547]: time="2025-01-13T20:46:30.724644197Z" level=error msg="Failed to destroy network for sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.725098 containerd[1547]: time="2025-01-13T20:46:30.725080154Z" level=error msg="encountered an error cleaning up failed sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.725137 containerd[1547]: time="2025-01-13T20:46:30.725117270Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.725271 kubelet[2851]: E0113 20:46:30.725245 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.725324 kubelet[2851]: E0113 20:46:30.725283 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:30.725442 kubelet[2851]: E0113 20:46:30.725325 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:30.725570 kubelet[2851]: E0113 20:46:30.725473 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" podUID="0016e978-2a6d-4be0-ad22-af8c555426bb" Jan 13 20:46:30.746669 containerd[1547]: time="2025-01-13T20:46:30.746598154Z" level=error msg="Failed to destroy network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.746965 containerd[1547]: time="2025-01-13T20:46:30.746800012Z" level=error msg="encountered an error cleaning up failed sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.746965 containerd[1547]: time="2025-01-13T20:46:30.746844122Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.747022 kubelet[2851]: E0113 20:46:30.746989 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.747219 kubelet[2851]: 
E0113 20:46:30.747024 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:30.747219 kubelet[2851]: E0113 20:46:30.747038 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:30.747219 kubelet[2851]: E0113 20:46:30.747174 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-75b2t" podUID="45d20c01-0698-463e-b647-27ec83c8d824" Jan 13 20:46:30.750755 containerd[1547]: time="2025-01-13T20:46:30.750737167Z" level=error msg="Failed to destroy network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.751069 containerd[1547]: time="2025-01-13T20:46:30.750925847Z" level=error msg="encountered an error cleaning up failed sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.751069 containerd[1547]: time="2025-01-13T20:46:30.750954096Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.751243 kubelet[2851]: E0113 20:46:30.751058 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.751243 kubelet[2851]: E0113 20:46:30.751089 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:30.751243 kubelet[2851]: E0113 20:46:30.751104 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:30.751310 kubelet[2851]: E0113 20:46:30.751138 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" Jan 13 20:46:30.807904 containerd[1547]: time="2025-01-13T20:46:30.807835585Z" level=error msg="Failed to destroy network for sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.808077 containerd[1547]: time="2025-01-13T20:46:30.808060521Z" level=error msg="encountered an error cleaning up failed sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\", marking sandbox state as SANDBOX_UNKNOWN" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.808203 containerd[1547]: time="2025-01-13T20:46:30.808109728Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.808599 kubelet[2851]: E0113 20:46:30.808405 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:30.808599 kubelet[2851]: E0113 20:46:30.808437 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:30.808599 kubelet[2851]: E0113 20:46:30.808452 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:30.808680 kubelet[2851]: E0113 20:46:30.808490 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" podUID="6953b35c-8169-44aa-91cb-7dd3f8f9aade" Jan 13 20:46:31.331088 containerd[1547]: time="2025-01-13T20:46:31.324566466Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 13 20:46:31.333679 containerd[1547]: time="2025-01-13T20:46:31.333623137Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 4.886744799s" Jan 13 20:46:31.333679 containerd[1547]: time="2025-01-13T20:46:31.333641340Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 13 20:46:31.343263 containerd[1547]: 
time="2025-01-13T20:46:31.343232030Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:31.349626 containerd[1547]: time="2025-01-13T20:46:31.349606099Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:31.349951 containerd[1547]: time="2025-01-13T20:46:31.349937123Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:31.389144 containerd[1547]: time="2025-01-13T20:46:31.389121568Z" level=info msg="CreateContainer within sandbox \"524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 20:46:31.477099 containerd[1547]: time="2025-01-13T20:46:31.477050319Z" level=info msg="CreateContainer within sandbox \"524a4ebd94f9ddd3038e45498d154ac9823c25164cb31932d4f2a5587dcdfc9d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5889317b66f78c10cf1f9280d374bee8b8e84dc0fa0d5fc336a6c748bbcf0bd1\"" Jan 13 20:46:31.482091 containerd[1547]: time="2025-01-13T20:46:31.482024982Z" level=info msg="StartContainer for \"5889317b66f78c10cf1f9280d374bee8b8e84dc0fa0d5fc336a6c748bbcf0bd1\"" Jan 13 20:46:31.586381 systemd[1]: Started cri-containerd-5889317b66f78c10cf1f9280d374bee8b8e84dc0fa0d5fc336a6c748bbcf0bd1.scope - libcontainer container 5889317b66f78c10cf1f9280d374bee8b8e84dc0fa0d5fc336a6c748bbcf0bd1. Jan 13 20:46:31.592317 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b-shm.mount: Deactivated successfully. 
Jan 13 20:46:31.592502 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb-shm.mount: Deactivated successfully. Jan 13 20:46:31.592549 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6-shm.mount: Deactivated successfully. Jan 13 20:46:31.592603 systemd[1]: run-netns-cni\x2d5b4de2cb\x2dc046\x2dd24e\x2d9725\x2dcdbe5f88303d.mount: Deactivated successfully. Jan 13 20:46:31.592692 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount725627086.mount: Deactivated successfully. Jan 13 20:46:31.608473 containerd[1547]: time="2025-01-13T20:46:31.608448726Z" level=info msg="StartContainer for \"5889317b66f78c10cf1f9280d374bee8b8e84dc0fa0d5fc336a6c748bbcf0bd1\" returns successfully" Jan 13 20:46:31.633606 kubelet[2851]: I0113 20:46:31.633593 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb" Jan 13 20:46:31.637878 containerd[1547]: time="2025-01-13T20:46:31.637782872Z" level=info msg="StopPodSandbox for \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\"" Jan 13 20:46:31.639064 kubelet[2851]: I0113 20:46:31.639049 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b" Jan 13 20:46:31.639227 containerd[1547]: time="2025-01-13T20:46:31.639127657Z" level=info msg="Ensure that sandbox 5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb in task-service has been cleanup successfully" Jan 13 20:46:31.639353 containerd[1547]: time="2025-01-13T20:46:31.639270795Z" level=info msg="StopPodSandbox for \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\"" Jan 13 20:46:31.639353 containerd[1547]: time="2025-01-13T20:46:31.639308787Z" level=info msg="TearDown network for sandbox 
\"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\" successfully" Jan 13 20:46:31.639353 containerd[1547]: time="2025-01-13T20:46:31.639317454Z" level=info msg="StopPodSandbox for \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\" returns successfully" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641133108Z" level=info msg="Ensure that sandbox 10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b in task-service has been cleanup successfully" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641241497Z" level=info msg="TearDown network for sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\" successfully" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641249451Z" level=info msg="StopPodSandbox for \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\" returns successfully" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641635323Z" level=info msg="StopPodSandbox for \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\"" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641678160Z" level=info msg="TearDown network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" successfully" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641684598Z" level=info msg="StopPodSandbox for \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" returns successfully" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641718170Z" level=info msg="StopPodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\"" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641747790Z" level=info msg="TearDown network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" successfully" Jan 13 20:46:31.641781 containerd[1547]: time="2025-01-13T20:46:31.641752944Z" 
level=info msg="StopPodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" returns successfully" Jan 13 20:46:31.642101 containerd[1547]: time="2025-01-13T20:46:31.642082849Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\"" Jan 13 20:46:31.642181 containerd[1547]: time="2025-01-13T20:46:31.642170203Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\"" Jan 13 20:46:31.642220 containerd[1547]: time="2025-01-13T20:46:31.642213665Z" level=info msg="TearDown network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" successfully" Jan 13 20:46:31.642251 containerd[1547]: time="2025-01-13T20:46:31.642220086Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" returns successfully" Jan 13 20:46:31.642292 systemd[1]: run-netns-cni\x2d84f4e271\x2dda90\x2df62f\x2d18a3\x2d458dd486c5ae.mount: Deactivated successfully. 
Jan 13 20:46:31.643333 containerd[1547]: time="2025-01-13T20:46:31.642386427Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" Jan 13 20:46:31.643333 containerd[1547]: time="2025-01-13T20:46:31.642426924Z" level=info msg="TearDown network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" successfully" Jan 13 20:46:31.643333 containerd[1547]: time="2025-01-13T20:46:31.642434469Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" returns successfully" Jan 13 20:46:31.643333 containerd[1547]: time="2025-01-13T20:46:31.642582160Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:31.643333 containerd[1547]: time="2025-01-13T20:46:31.642615563Z" level=info msg="TearDown network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" successfully" Jan 13 20:46:31.643333 containerd[1547]: time="2025-01-13T20:46:31.642621104Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" returns successfully" Jan 13 20:46:31.644051 containerd[1547]: time="2025-01-13T20:46:31.643381655Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:31.644051 containerd[1547]: time="2025-01-13T20:46:31.643418717Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:31.644051 containerd[1547]: time="2025-01-13T20:46:31.643424462Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:31.644051 containerd[1547]: time="2025-01-13T20:46:31.643458509Z" level=info msg="TearDown network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" successfully" Jan 
13 20:46:31.644051 containerd[1547]: time="2025-01-13T20:46:31.643477203Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" returns successfully" Jan 13 20:46:31.644884 containerd[1547]: time="2025-01-13T20:46:31.644662735Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" Jan 13 20:46:31.644884 containerd[1547]: time="2025-01-13T20:46:31.644703899Z" level=info msg="TearDown network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" successfully" Jan 13 20:46:31.644884 containerd[1547]: time="2025-01-13T20:46:31.644710158Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" returns successfully" Jan 13 20:46:31.645368 containerd[1547]: time="2025-01-13T20:46:31.644898086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:6,}" Jan 13 20:46:31.645368 containerd[1547]: time="2025-01-13T20:46:31.645254225Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:31.645368 containerd[1547]: time="2025-01-13T20:46:31.645293357Z" level=info msg="TearDown network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" successfully" Jan 13 20:46:31.645368 containerd[1547]: time="2025-01-13T20:46:31.645299296Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" returns successfully" Jan 13 20:46:31.645500 kubelet[2851]: I0113 20:46:31.645426 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d" Jan 13 20:46:31.646374 systemd[1]: run-netns-cni\x2d90834870\x2d85da\x2dcb98\x2d3cb8\x2d5d1733be2351.mount: Deactivated successfully. 
Jan 13 20:46:31.648595 containerd[1547]: time="2025-01-13T20:46:31.648470825Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:31.648595 containerd[1547]: time="2025-01-13T20:46:31.648570802Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:31.648595 containerd[1547]: time="2025-01-13T20:46:31.648578005Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:31.648848 containerd[1547]: time="2025-01-13T20:46:31.648780507Z" level=info msg="StopPodSandbox for \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\"" Jan 13 20:46:31.649192 containerd[1547]: time="2025-01-13T20:46:31.649141520Z" level=info msg="Ensure that sandbox ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d in task-service has been cleanup successfully" Jan 13 20:46:31.651299 systemd[1]: run-netns-cni\x2d45afa602\x2dffbd\x2d966d\x2d3e04\x2dfc5e5ea800d6.mount: Deactivated successfully. 
Jan 13 20:46:31.653432 containerd[1547]: time="2025-01-13T20:46:31.652682070Z" level=info msg="TearDown network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\" successfully" Jan 13 20:46:31.653432 containerd[1547]: time="2025-01-13T20:46:31.652695072Z" level=info msg="StopPodSandbox for \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\" returns successfully" Jan 13 20:46:31.653432 containerd[1547]: time="2025-01-13T20:46:31.652930688Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:6,}" Jan 13 20:46:31.653432 containerd[1547]: time="2025-01-13T20:46:31.653237520Z" level=info msg="StopPodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\"" Jan 13 20:46:31.654203 containerd[1547]: time="2025-01-13T20:46:31.654181995Z" level=info msg="TearDown network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" successfully" Jan 13 20:46:31.654203 containerd[1547]: time="2025-01-13T20:46:31.654199512Z" level=info msg="StopPodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" returns successfully" Jan 13 20:46:31.654565 containerd[1547]: time="2025-01-13T20:46:31.654407037Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\"" Jan 13 20:46:31.654565 containerd[1547]: time="2025-01-13T20:46:31.654451009Z" level=info msg="TearDown network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" successfully" Jan 13 20:46:31.654565 containerd[1547]: time="2025-01-13T20:46:31.654457670Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" returns successfully" Jan 13 20:46:31.655361 containerd[1547]: time="2025-01-13T20:46:31.654954903Z" level=info msg="StopPodSandbox for 
\"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" Jan 13 20:46:31.655361 containerd[1547]: time="2025-01-13T20:46:31.655004685Z" level=info msg="TearDown network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" successfully" Jan 13 20:46:31.655361 containerd[1547]: time="2025-01-13T20:46:31.655010854Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" returns successfully" Jan 13 20:46:31.655439 containerd[1547]: time="2025-01-13T20:46:31.655419244Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:31.655485 containerd[1547]: time="2025-01-13T20:46:31.655472035Z" level=info msg="TearDown network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" successfully" Jan 13 20:46:31.655485 containerd[1547]: time="2025-01-13T20:46:31.655479778Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" returns successfully" Jan 13 20:46:31.671546 containerd[1547]: time="2025-01-13T20:46:31.671518071Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:31.671619 containerd[1547]: time="2025-01-13T20:46:31.671609534Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:31.671619 containerd[1547]: time="2025-01-13T20:46:31.671617064Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:31.673251 containerd[1547]: time="2025-01-13T20:46:31.672937475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:6,}" Jan 13 20:46:31.673735 kubelet[2851]: I0113 20:46:31.673664 2851 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42" Jan 13 20:46:31.675006 containerd[1547]: time="2025-01-13T20:46:31.674164949Z" level=info msg="StopPodSandbox for \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\"" Jan 13 20:46:31.675006 containerd[1547]: time="2025-01-13T20:46:31.674327690Z" level=info msg="Ensure that sandbox 0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42 in task-service has been cleanup successfully" Jan 13 20:46:31.675006 containerd[1547]: time="2025-01-13T20:46:31.674464567Z" level=info msg="TearDown network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\" successfully" Jan 13 20:46:31.675006 containerd[1547]: time="2025-01-13T20:46:31.674477161Z" level=info msg="StopPodSandbox for \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\" returns successfully" Jan 13 20:46:31.675758 containerd[1547]: time="2025-01-13T20:46:31.675563641Z" level=info msg="StopPodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\"" Jan 13 20:46:31.675758 containerd[1547]: time="2025-01-13T20:46:31.675650685Z" level=info msg="TearDown network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" successfully" Jan 13 20:46:31.675758 containerd[1547]: time="2025-01-13T20:46:31.675659479Z" level=info msg="StopPodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" returns successfully" Jan 13 20:46:31.676524 containerd[1547]: time="2025-01-13T20:46:31.676465507Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\"" Jan 13 20:46:31.676715 containerd[1547]: time="2025-01-13T20:46:31.676652658Z" level=info msg="TearDown network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" successfully" Jan 13 20:46:31.676715 
containerd[1547]: time="2025-01-13T20:46:31.676662847Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" returns successfully" Jan 13 20:46:31.678198 containerd[1547]: time="2025-01-13T20:46:31.677940845Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" Jan 13 20:46:31.678198 containerd[1547]: time="2025-01-13T20:46:31.677988273Z" level=info msg="TearDown network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" successfully" Jan 13 20:46:31.678198 containerd[1547]: time="2025-01-13T20:46:31.677994773Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" returns successfully" Jan 13 20:46:31.678440 containerd[1547]: time="2025-01-13T20:46:31.678426084Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:31.678486 containerd[1547]: time="2025-01-13T20:46:31.678475269Z" level=info msg="TearDown network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" successfully" Jan 13 20:46:31.678511 containerd[1547]: time="2025-01-13T20:46:31.678484479Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" returns successfully" Jan 13 20:46:31.678711 containerd[1547]: time="2025-01-13T20:46:31.678697833Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:31.678753 containerd[1547]: time="2025-01-13T20:46:31.678742932Z" level=info msg="TearDown network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:31.678782 containerd[1547]: time="2025-01-13T20:46:31.678753681Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:31.679114 
containerd[1547]: time="2025-01-13T20:46:31.679094450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:6,}" Jan 13 20:46:31.680177 kubelet[2851]: I0113 20:46:31.680162 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f" Jan 13 20:46:31.682933 containerd[1547]: time="2025-01-13T20:46:31.682336676Z" level=info msg="StopPodSandbox for \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\"" Jan 13 20:46:31.685430 kubelet[2851]: I0113 20:46:31.685411 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6" Jan 13 20:46:31.685934 containerd[1547]: time="2025-01-13T20:46:31.685834006Z" level=info msg="StopPodSandbox for \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\"" Jan 13 20:46:31.686049 containerd[1547]: time="2025-01-13T20:46:31.686033187Z" level=info msg="Ensure that sandbox 4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f in task-service has been cleanup successfully" Jan 13 20:46:31.687065 containerd[1547]: time="2025-01-13T20:46:31.686141728Z" level=info msg="Ensure that sandbox 933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6 in task-service has been cleanup successfully" Jan 13 20:46:31.687990 containerd[1547]: time="2025-01-13T20:46:31.687972434Z" level=info msg="TearDown network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\" successfully" Jan 13 20:46:31.688034 containerd[1547]: time="2025-01-13T20:46:31.688026727Z" level=info msg="StopPodSandbox for \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\" returns successfully" Jan 13 20:46:31.689067 containerd[1547]: time="2025-01-13T20:46:31.688217333Z" level=info msg="TearDown network for 
sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\" successfully" Jan 13 20:46:31.689067 containerd[1547]: time="2025-01-13T20:46:31.688226014Z" level=info msg="StopPodSandbox for \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\" returns successfully" Jan 13 20:46:31.689599 containerd[1547]: time="2025-01-13T20:46:31.689587212Z" level=info msg="StopPodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\"" Jan 13 20:46:31.690943 containerd[1547]: time="2025-01-13T20:46:31.690495434Z" level=info msg="TearDown network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" successfully" Jan 13 20:46:31.690943 containerd[1547]: time="2025-01-13T20:46:31.690535935Z" level=info msg="StopPodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" returns successfully" Jan 13 20:46:31.690943 containerd[1547]: time="2025-01-13T20:46:31.689778739Z" level=info msg="StopPodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\"" Jan 13 20:46:31.690943 containerd[1547]: time="2025-01-13T20:46:31.690601595Z" level=info msg="TearDown network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" successfully" Jan 13 20:46:31.690943 containerd[1547]: time="2025-01-13T20:46:31.690608919Z" level=info msg="StopPodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" returns successfully" Jan 13 20:46:31.692936 containerd[1547]: time="2025-01-13T20:46:31.692177399Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\"" Jan 13 20:46:31.692936 containerd[1547]: time="2025-01-13T20:46:31.692458261Z" level=info msg="TearDown network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" successfully" Jan 13 20:46:31.692936 containerd[1547]: time="2025-01-13T20:46:31.692466912Z" level=info msg="StopPodSandbox for 
\"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" returns successfully" Jan 13 20:46:31.694682 containerd[1547]: time="2025-01-13T20:46:31.694458186Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\"" Jan 13 20:46:31.694682 containerd[1547]: time="2025-01-13T20:46:31.694507804Z" level=info msg="TearDown network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" successfully" Jan 13 20:46:31.694682 containerd[1547]: time="2025-01-13T20:46:31.694515750Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" returns successfully" Jan 13 20:46:31.695105 containerd[1547]: time="2025-01-13T20:46:31.694902592Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" Jan 13 20:46:31.695105 containerd[1547]: time="2025-01-13T20:46:31.694945303Z" level=info msg="TearDown network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" successfully" Jan 13 20:46:31.695105 containerd[1547]: time="2025-01-13T20:46:31.694951202Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" returns successfully" Jan 13 20:46:31.695203 containerd[1547]: time="2025-01-13T20:46:31.695112983Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" Jan 13 20:46:31.695203 containerd[1547]: time="2025-01-13T20:46:31.695149263Z" level=info msg="TearDown network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" successfully" Jan 13 20:46:31.695203 containerd[1547]: time="2025-01-13T20:46:31.695154901Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" returns successfully" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695398990Z" level=info msg="StopPodSandbox for 
\"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695434139Z" level=info msg="TearDown network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" successfully" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695439856Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" returns successfully" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695466611Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695496652Z" level=info msg="TearDown network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" successfully" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695501755Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" returns successfully" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695762109Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695799916Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695805655Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695831369Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695863348Z" level=info msg="TearDown network for sandbox 
\"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" successfully" Jan 13 20:46:31.695984 containerd[1547]: time="2025-01-13T20:46:31.695868303Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:31.696449 containerd[1547]: time="2025-01-13T20:46:31.696437200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:46:31.696626 containerd[1547]: time="2025-01-13T20:46:31.696615662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:46:31.701088 kubelet[2851]: I0113 20:46:31.701066 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-pvrpb" podStartSLOduration=1.247481148 podStartE2EDuration="15.666022757s" podCreationTimestamp="2025-01-13 20:46:16 +0000 UTC" firstStartedPulling="2025-01-13 20:46:16.92376628 +0000 UTC m=+20.692703462" lastFinishedPulling="2025-01-13 20:46:31.342307889 +0000 UTC m=+35.111245071" observedRunningTime="2025-01-13 20:46:31.661795642 +0000 UTC m=+35.430732832" watchObservedRunningTime="2025-01-13 20:46:31.666022757 +0000 UTC m=+35.434959944" Jan 13 20:46:31.760758 containerd[1547]: time="2025-01-13T20:46:31.760727216Z" level=error msg="Failed to destroy network for sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.761057 containerd[1547]: time="2025-01-13T20:46:31.761039004Z" level=error msg="encountered an error cleaning up failed sandbox 
\"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.761096 containerd[1547]: time="2025-01-13T20:46:31.761077173Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.762524 kubelet[2851]: E0113 20:46:31.761765 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.762524 kubelet[2851]: E0113 20:46:31.761817 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:31.762524 kubelet[2851]: E0113 20:46:31.761834 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" Jan 13 20:46:31.771929 kubelet[2851]: E0113 20:46:31.761905 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66f4bb6d79-zhjcz_calico-system(0016e978-2a6d-4be0-ad22-af8c555426bb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" podUID="0016e978-2a6d-4be0-ad22-af8c555426bb" Jan 13 20:46:31.790357 containerd[1547]: time="2025-01-13T20:46:31.790189187Z" level=error msg="Failed to destroy network for sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.791099 containerd[1547]: time="2025-01-13T20:46:31.791084012Z" level=error msg="encountered an error cleaning up failed sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Jan 13 20:46:31.791284 containerd[1547]: time="2025-01-13T20:46:31.791271025Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.791596 kubelet[2851]: E0113 20:46:31.791578 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.791657 kubelet[2851]: E0113 20:46:31.791611 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:31.791657 kubelet[2851]: E0113 20:46:31.791625 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-76f75df574-nx8p5" Jan 13 20:46:31.791698 kubelet[2851]: E0113 20:46:31.791657 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-nx8p5_kube-system(6ab74c68-2020-4618-bcbe-672227cc6fc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-nx8p5" podUID="6ab74c68-2020-4618-bcbe-672227cc6fc9" Jan 13 20:46:31.814039 containerd[1547]: time="2025-01-13T20:46:31.814011508Z" level=error msg="Failed to destroy network for sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.814504249Z" level=error msg="encountered an error cleaning up failed sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.814630035Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox 
\"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.815368644Z" level=error msg="Failed to destroy network for sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.815530011Z" level=error msg="encountered an error cleaning up failed sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.815554333Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.819155422Z" level=error msg="Failed to destroy network for sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.819331216Z" level=error msg="encountered an error cleaning up failed sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.819394627Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.822149901Z" level=error msg="Failed to destroy network for sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.822296495Z" level=error msg="encountered an error cleaning up failed sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829060 containerd[1547]: time="2025-01-13T20:46:31.822320416Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829971 kubelet[2851]: E0113 20:46:31.818712 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.829971 kubelet[2851]: E0113 20:46:31.818738 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:31.829971 kubelet[2851]: E0113 20:46:31.818754 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" Jan 13 20:46:31.830079 kubelet[2851]: E0113 20:46:31.818785 2851 pod_workers.go:1298] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-vvftf_calico-apiserver(6953b35c-8169-44aa-91cb-7dd3f8f9aade)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" podUID="6953b35c-8169-44aa-91cb-7dd3f8f9aade" Jan 13 20:46:31.830079 kubelet[2851]: E0113 20:46:31.818805 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.830079 kubelet[2851]: E0113 20:46:31.818816 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:31.830157 kubelet[2851]: E0113 20:46:31.818826 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cj9xm" Jan 13 20:46:31.830157 kubelet[2851]: E0113 20:46:31.818844 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cj9xm_calico-system(7af6a31b-5e31-40c1-b6a8-196414f83e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cj9xm" podUID="7af6a31b-5e31-40c1-b6a8-196414f83e54" Jan 13 20:46:31.830157 kubelet[2851]: E0113 20:46:31.819533 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.830229 kubelet[2851]: E0113 20:46:31.819549 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:31.830229 kubelet[2851]: E0113 20:46:31.819559 2851 
kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" Jan 13 20:46:31.830229 kubelet[2851]: E0113 20:46:31.819580 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d7876745f-sk2dr_calico-apiserver(662f5a4d-917d-45ec-97a1-d70b9c8e2f05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" podUID="662f5a4d-917d-45ec-97a1-d70b9c8e2f05" Jan 13 20:46:31.830298 kubelet[2851]: E0113 20:46:31.822415 2851 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:46:31.830298 kubelet[2851]: E0113 20:46:31.822435 2851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:31.830298 kubelet[2851]: E0113 20:46:31.822446 2851 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-75b2t" Jan 13 20:46:31.830388 kubelet[2851]: E0113 20:46:31.822471 2851 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-75b2t_kube-system(45d20c01-0698-463e-b647-27ec83c8d824)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-75b2t" podUID="45d20c01-0698-463e-b647-27ec83c8d824" Jan 13 20:46:32.051897 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 13 20:46:32.052666 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 13 20:46:32.590098 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231-shm.mount: Deactivated successfully. 
Jan 13 20:46:32.590171 systemd[1]: run-netns-cni\x2d11034682\x2df030\x2de31d\x2d11d6\x2d391345f44c3f.mount: Deactivated successfully. Jan 13 20:46:32.590221 systemd[1]: run-netns-cni\x2d2b9aa2d0\x2d58f7\x2da6b6\x2d6120\x2d5d8c04bd0f8b.mount: Deactivated successfully. Jan 13 20:46:32.590265 systemd[1]: run-netns-cni\x2d06a09cfb\x2d99ed\x2d4b27\x2d3d67\x2d4b04addd33ce.mount: Deactivated successfully. Jan 13 20:46:32.689184 kubelet[2851]: I0113 20:46:32.689159 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102" Jan 13 20:46:32.690622 containerd[1547]: time="2025-01-13T20:46:32.689805285Z" level=info msg="StopPodSandbox for \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\"" Jan 13 20:46:32.690622 containerd[1547]: time="2025-01-13T20:46:32.689965994Z" level=info msg="Ensure that sandbox dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102 in task-service has been cleanup successfully" Jan 13 20:46:32.691672 containerd[1547]: time="2025-01-13T20:46:32.691407103Z" level=info msg="StopPodSandbox for \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\"" Jan 13 20:46:32.691672 containerd[1547]: time="2025-01-13T20:46:32.691560096Z" level=info msg="Ensure that sandbox 250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc in task-service has been cleanup successfully" Jan 13 20:46:32.691722 kubelet[2851]: I0113 20:46:32.691167 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc" Jan 13 20:46:32.693192 containerd[1547]: time="2025-01-13T20:46:32.693135662Z" level=info msg="TearDown network for sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\" successfully" Jan 13 20:46:32.693192 containerd[1547]: time="2025-01-13T20:46:32.693149132Z" level=info msg="StopPodSandbox for 
\"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\" returns successfully" Jan 13 20:46:32.693258 systemd[1]: run-netns-cni\x2df4fb6271\x2d413b\x2dcbca\x2de578\x2d4cd37e68b578.mount: Deactivated successfully. Jan 13 20:46:32.693880 containerd[1547]: time="2025-01-13T20:46:32.693601749Z" level=info msg="StopPodSandbox for \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\"" Jan 13 20:46:32.693880 containerd[1547]: time="2025-01-13T20:46:32.693650297Z" level=info msg="TearDown network for sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\" successfully" Jan 13 20:46:32.693880 containerd[1547]: time="2025-01-13T20:46:32.693658093Z" level=info msg="StopPodSandbox for \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\" returns successfully" Jan 13 20:46:32.694414 containerd[1547]: time="2025-01-13T20:46:32.694399114Z" level=info msg="StopPodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\"" Jan 13 20:46:32.694644 containerd[1547]: time="2025-01-13T20:46:32.694566323Z" level=info msg="TearDown network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" successfully" Jan 13 20:46:32.694644 containerd[1547]: time="2025-01-13T20:46:32.694577829Z" level=info msg="StopPodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" returns successfully" Jan 13 20:46:32.694644 containerd[1547]: time="2025-01-13T20:46:32.694481895Z" level=info msg="TearDown network for sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\" successfully" Jan 13 20:46:32.694644 containerd[1547]: time="2025-01-13T20:46:32.694615678Z" level=info msg="StopPodSandbox for \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\" returns successfully" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.694822436Z" level=info msg="StopPodSandbox for 
\"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\"" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.694908337Z" level=info msg="TearDown network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" successfully" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.694916507Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" returns successfully" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.695008959Z" level=info msg="StopPodSandbox for \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\"" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.695079940Z" level=info msg="TearDown network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\" successfully" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.695087104Z" level=info msg="StopPodSandbox for \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\" returns successfully" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.696540696Z" level=info msg="StopPodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\"" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.696582289Z" level=info msg="TearDown network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" successfully" Jan 13 20:46:32.696606 containerd[1547]: time="2025-01-13T20:46:32.696588587Z" level=info msg="StopPodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" returns successfully" Jan 13 20:46:32.698018 containerd[1547]: time="2025-01-13T20:46:32.696670754Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" Jan 13 20:46:32.698018 containerd[1547]: time="2025-01-13T20:46:32.697801155Z" level=info msg="TearDown network for sandbox 
\"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" successfully" Jan 13 20:46:32.698018 containerd[1547]: time="2025-01-13T20:46:32.697810825Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" returns successfully" Jan 13 20:46:32.697141 systemd[1]: run-netns-cni\x2d9bec22ff\x2d0a4a\x2d9c72\x2d8d01\x2df00412ebfe3a.mount: Deactivated successfully. Jan 13 20:46:32.698745 containerd[1547]: time="2025-01-13T20:46:32.698547435Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:32.698745 containerd[1547]: time="2025-01-13T20:46:32.698592161Z" level=info msg="TearDown network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" successfully" Jan 13 20:46:32.698745 containerd[1547]: time="2025-01-13T20:46:32.698598827Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" returns successfully" Jan 13 20:46:32.698745 containerd[1547]: time="2025-01-13T20:46:32.698627275Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\"" Jan 13 20:46:32.698745 containerd[1547]: time="2025-01-13T20:46:32.698661544Z" level=info msg="TearDown network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" successfully" Jan 13 20:46:32.698745 containerd[1547]: time="2025-01-13T20:46:32.698666801Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" returns successfully" Jan 13 20:46:32.699051 containerd[1547]: time="2025-01-13T20:46:32.699040892Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" Jan 13 20:46:32.699289 containerd[1547]: time="2025-01-13T20:46:32.699115752Z" level=info msg="TearDown network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" 
successfully" Jan 13 20:46:32.699289 containerd[1547]: time="2025-01-13T20:46:32.699127759Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" returns successfully" Jan 13 20:46:32.699289 containerd[1547]: time="2025-01-13T20:46:32.699115837Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:32.699289 containerd[1547]: time="2025-01-13T20:46:32.699176992Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:32.699289 containerd[1547]: time="2025-01-13T20:46:32.699182060Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:32.699750 kubelet[2851]: I0113 20:46:32.699324 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7" Jan 13 20:46:32.699864 containerd[1547]: time="2025-01-13T20:46:32.699849810Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:32.700370 containerd[1547]: time="2025-01-13T20:46:32.699901563Z" level=info msg="TearDown network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" successfully" Jan 13 20:46:32.700370 containerd[1547]: time="2025-01-13T20:46:32.699911985Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" returns successfully" Jan 13 20:46:32.700370 containerd[1547]: time="2025-01-13T20:46:32.699854007Z" level=info msg="StopPodSandbox for \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\"" Jan 13 20:46:32.700370 containerd[1547]: time="2025-01-13T20:46:32.699988312Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:7,}" Jan 13 20:46:32.700370 containerd[1547]: time="2025-01-13T20:46:32.700032907Z" level=info msg="Ensure that sandbox 67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7 in task-service has been cleanup successfully" Jan 13 20:46:32.700680 containerd[1547]: time="2025-01-13T20:46:32.700591805Z" level=info msg="TearDown network for sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\" successfully" Jan 13 20:46:32.700710 containerd[1547]: time="2025-01-13T20:46:32.700680214Z" level=info msg="StopPodSandbox for \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\" returns successfully" Jan 13 20:46:32.701548 containerd[1547]: time="2025-01-13T20:46:32.700830928Z" level=info msg="StopPodSandbox for \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\"" Jan 13 20:46:32.701736 containerd[1547]: time="2025-01-13T20:46:32.700865704Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:32.701789 containerd[1547]: time="2025-01-13T20:46:32.701777716Z" level=info msg="TearDown network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" successfully" Jan 13 20:46:32.701855 containerd[1547]: time="2025-01-13T20:46:32.701844357Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:32.702308 systemd[1]: run-netns-cni\x2d38e6725f\x2d1863\x2da07c\x2dd7db\x2d342d71dcf68b.mount: Deactivated successfully. 
Jan 13 20:46:32.703252 containerd[1547]: time="2025-01-13T20:46:32.701840855Z" level=info msg="TearDown network for sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\" successfully" Jan 13 20:46:32.703252 containerd[1547]: time="2025-01-13T20:46:32.702922640Z" level=info msg="StopPodSandbox for \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\" returns successfully" Jan 13 20:46:32.703717 containerd[1547]: time="2025-01-13T20:46:32.703701682Z" level=info msg="StopPodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\"" Jan 13 20:46:32.703780 containerd[1547]: time="2025-01-13T20:46:32.703767337Z" level=info msg="TearDown network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" successfully" Jan 13 20:46:32.703780 containerd[1547]: time="2025-01-13T20:46:32.703777192Z" level=info msg="StopPodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" returns successfully" Jan 13 20:46:32.704801 containerd[1547]: time="2025-01-13T20:46:32.704515684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:7,}" Jan 13 20:46:32.704801 containerd[1547]: time="2025-01-13T20:46:32.704771313Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\"" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.704819791Z" level=info msg="TearDown network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" successfully" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.704895532Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" returns successfully" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.705424027Z" level=info msg="StopPodSandbox for 
\"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.705466143Z" level=info msg="TearDown network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" successfully" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.705472660Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" returns successfully" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.705576534Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.705617674Z" level=info msg="TearDown network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" successfully" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.705623895Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" returns successfully" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.705724233Z" level=info msg="StopPodSandbox for \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\"" Jan 13 20:46:32.706524 containerd[1547]: time="2025-01-13T20:46:32.705815537Z" level=info msg="Ensure that sandbox 1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231 in task-service has been cleanup successfully" Jan 13 20:46:32.706704 kubelet[2851]: I0113 20:46:32.705069 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231" Jan 13 20:46:32.708206 systemd[1]: run-netns-cni\x2d61fa7d03\x2d477e\x2d360e\x2d6689\x2d0b54d1a23a41.mount: Deactivated successfully. 
Jan 13 20:46:32.708473 containerd[1547]: time="2025-01-13T20:46:32.708429111Z" level=info msg="TearDown network for sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\" successfully" Jan 13 20:46:32.708473 containerd[1547]: time="2025-01-13T20:46:32.708447322Z" level=info msg="StopPodSandbox for \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\" returns successfully" Jan 13 20:46:32.708635 containerd[1547]: time="2025-01-13T20:46:32.708622104Z" level=info msg="StopPodSandbox for \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\"" Jan 13 20:46:32.708763 containerd[1547]: time="2025-01-13T20:46:32.708748557Z" level=info msg="TearDown network for sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\" successfully" Jan 13 20:46:32.708763 containerd[1547]: time="2025-01-13T20:46:32.708758812Z" level=info msg="StopPodSandbox for \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\" returns successfully" Jan 13 20:46:32.708836 containerd[1547]: time="2025-01-13T20:46:32.708822976Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:32.709320 containerd[1547]: time="2025-01-13T20:46:32.708858008Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:32.709320 containerd[1547]: time="2025-01-13T20:46:32.708865880Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:32.709639 containerd[1547]: time="2025-01-13T20:46:32.709496868Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:7,}" Jan 13 20:46:32.709639 containerd[1547]: time="2025-01-13T20:46:32.709517936Z" level=info msg="StopPodSandbox for 
\"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\"" Jan 13 20:46:32.709639 containerd[1547]: time="2025-01-13T20:46:32.709556867Z" level=info msg="TearDown network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" successfully" Jan 13 20:46:32.709639 containerd[1547]: time="2025-01-13T20:46:32.709562931Z" level=info msg="StopPodSandbox for \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" returns successfully" Jan 13 20:46:32.710429 containerd[1547]: time="2025-01-13T20:46:32.709993933Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\"" Jan 13 20:46:32.710429 containerd[1547]: time="2025-01-13T20:46:32.710046477Z" level=info msg="TearDown network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" successfully" Jan 13 20:46:32.710429 containerd[1547]: time="2025-01-13T20:46:32.710054734Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" returns successfully" Jan 13 20:46:32.710740 containerd[1547]: time="2025-01-13T20:46:32.710725979Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" Jan 13 20:46:32.710799 containerd[1547]: time="2025-01-13T20:46:32.710767247Z" level=info msg="TearDown network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" successfully" Jan 13 20:46:32.711036 containerd[1547]: time="2025-01-13T20:46:32.710796490Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" returns successfully" Jan 13 20:46:32.711125 containerd[1547]: time="2025-01-13T20:46:32.711106747Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:32.711156 containerd[1547]: time="2025-01-13T20:46:32.711146127Z" level=info msg="TearDown network for sandbox 
\"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" successfully" Jan 13 20:46:32.711156 containerd[1547]: time="2025-01-13T20:46:32.711151688Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" returns successfully" Jan 13 20:46:32.711870 containerd[1547]: time="2025-01-13T20:46:32.711631345Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:32.711870 containerd[1547]: time="2025-01-13T20:46:32.711670276Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:32.711870 containerd[1547]: time="2025-01-13T20:46:32.711676212Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:32.712081 containerd[1547]: time="2025-01-13T20:46:32.711963491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:7,}" Jan 13 20:46:32.712210 kubelet[2851]: I0113 20:46:32.712192 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd" Jan 13 20:46:32.712750 containerd[1547]: time="2025-01-13T20:46:32.712662232Z" level=info msg="StopPodSandbox for \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\"" Jan 13 20:46:32.712791 containerd[1547]: time="2025-01-13T20:46:32.712759601Z" level=info msg="Ensure that sandbox 66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd in task-service has been cleanup successfully" Jan 13 20:46:32.713088 containerd[1547]: time="2025-01-13T20:46:32.713055186Z" level=info msg="TearDown network for sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\" successfully" Jan 13 20:46:32.713088 
containerd[1547]: time="2025-01-13T20:46:32.713081174Z" level=info msg="StopPodSandbox for \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\" returns successfully" Jan 13 20:46:32.713921 containerd[1547]: time="2025-01-13T20:46:32.713904055Z" level=info msg="StopPodSandbox for \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\"" Jan 13 20:46:32.713988 containerd[1547]: time="2025-01-13T20:46:32.713966765Z" level=info msg="TearDown network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\" successfully" Jan 13 20:46:32.713988 containerd[1547]: time="2025-01-13T20:46:32.713978116Z" level=info msg="StopPodSandbox for \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\" returns successfully" Jan 13 20:46:32.715217 containerd[1547]: time="2025-01-13T20:46:32.715133836Z" level=info msg="StopPodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\"" Jan 13 20:46:32.716305 containerd[1547]: time="2025-01-13T20:46:32.715187681Z" level=info msg="TearDown network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" successfully" Jan 13 20:46:32.716305 containerd[1547]: time="2025-01-13T20:46:32.715820214Z" level=info msg="StopPodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" returns successfully" Jan 13 20:46:32.716879 containerd[1547]: time="2025-01-13T20:46:32.716753477Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\"" Jan 13 20:46:32.717026 containerd[1547]: time="2025-01-13T20:46:32.717016579Z" level=info msg="TearDown network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" successfully" Jan 13 20:46:32.717244 containerd[1547]: time="2025-01-13T20:46:32.717163090Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" returns successfully" Jan 13 20:46:32.717599 
containerd[1547]: time="2025-01-13T20:46:32.717588127Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" Jan 13 20:46:32.717927 containerd[1547]: time="2025-01-13T20:46:32.717916349Z" level=info msg="TearDown network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" successfully" Jan 13 20:46:32.717982 containerd[1547]: time="2025-01-13T20:46:32.717974521Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" returns successfully" Jan 13 20:46:32.718682 containerd[1547]: time="2025-01-13T20:46:32.718660030Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:32.718895 containerd[1547]: time="2025-01-13T20:46:32.718885289Z" level=info msg="TearDown network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" successfully" Jan 13 20:46:32.718974 containerd[1547]: time="2025-01-13T20:46:32.718964451Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" returns successfully" Jan 13 20:46:32.719209 containerd[1547]: time="2025-01-13T20:46:32.719198095Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:32.719732 containerd[1547]: time="2025-01-13T20:46:32.719720936Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:32.719779 containerd[1547]: time="2025-01-13T20:46:32.719766620Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:32.719924 kubelet[2851]: I0113 20:46:32.719910 2851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c" Jan 13 
20:46:32.720993 containerd[1547]: time="2025-01-13T20:46:32.720981240Z" level=info msg="StopPodSandbox for \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\"" Jan 13 20:46:32.721721 containerd[1547]: time="2025-01-13T20:46:32.721498878Z" level=info msg="Ensure that sandbox d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c in task-service has been cleanup successfully" Jan 13 20:46:32.721894 containerd[1547]: time="2025-01-13T20:46:32.721878001Z" level=info msg="TearDown network for sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\" successfully" Jan 13 20:46:32.722360 containerd[1547]: time="2025-01-13T20:46:32.722322241Z" level=info msg="StopPodSandbox for \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\" returns successfully" Jan 13 20:46:32.722594 containerd[1547]: time="2025-01-13T20:46:32.722572748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:7,}" Jan 13 20:46:32.724390 containerd[1547]: time="2025-01-13T20:46:32.723852066Z" level=info msg="StopPodSandbox for \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\"" Jan 13 20:46:32.724390 containerd[1547]: time="2025-01-13T20:46:32.723896144Z" level=info msg="TearDown network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\" successfully" Jan 13 20:46:32.724390 containerd[1547]: time="2025-01-13T20:46:32.723902392Z" level=info msg="StopPodSandbox for \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\" returns successfully" Jan 13 20:46:32.724888 containerd[1547]: time="2025-01-13T20:46:32.724876737Z" level=info msg="StopPodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\"" Jan 13 20:46:32.725547 containerd[1547]: time="2025-01-13T20:46:32.725536223Z" level=info msg="TearDown network for sandbox 
\"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" successfully" Jan 13 20:46:32.725660 containerd[1547]: time="2025-01-13T20:46:32.725649088Z" level=info msg="StopPodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" returns successfully" Jan 13 20:46:32.727474 containerd[1547]: time="2025-01-13T20:46:32.727458365Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\"" Jan 13 20:46:32.727518 containerd[1547]: time="2025-01-13T20:46:32.727504514Z" level=info msg="TearDown network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" successfully" Jan 13 20:46:32.727518 containerd[1547]: time="2025-01-13T20:46:32.727511254Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" returns successfully" Jan 13 20:46:32.727647 containerd[1547]: time="2025-01-13T20:46:32.727632804Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" Jan 13 20:46:32.727695 containerd[1547]: time="2025-01-13T20:46:32.727684986Z" level=info msg="TearDown network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" successfully" Jan 13 20:46:32.727695 containerd[1547]: time="2025-01-13T20:46:32.727693239Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" returns successfully" Jan 13 20:46:32.727982 containerd[1547]: time="2025-01-13T20:46:32.727967101Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:32.728085 containerd[1547]: time="2025-01-13T20:46:32.728075953Z" level=info msg="TearDown network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" successfully" Jan 13 20:46:32.728124 containerd[1547]: time="2025-01-13T20:46:32.728117421Z" level=info msg="StopPodSandbox for 
\"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" returns successfully" Jan 13 20:46:32.728435 containerd[1547]: time="2025-01-13T20:46:32.728419014Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:32.728551 containerd[1547]: time="2025-01-13T20:46:32.728542594Z" level=info msg="TearDown network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:32.728645 containerd[1547]: time="2025-01-13T20:46:32.728585106Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:32.728958 containerd[1547]: time="2025-01-13T20:46:32.728939894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:7,}" Jan 13 20:46:33.055746 systemd-networkd[1457]: cali6e99632cdb6: Link UP Jan 13 20:46:33.055857 systemd-networkd[1457]: cali6e99632cdb6: Gained carrier Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:32.794 [INFO][4926] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:32.805 [INFO][4926] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0 calico-apiserver-5d7876745f- calico-apiserver 662f5a4d-917d-45ec-97a1-d70b9c8e2f05 671 0 2025-01-13 20:46:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d7876745f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5d7876745f-sk2dr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6e99632cdb6 [] []}} 
ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-sk2dr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:32.805 [INFO][4926] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-sk2dr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.005 [INFO][4992] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" HandleID="k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Workload="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.025 [INFO][4992] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" HandleID="k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Workload="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002f3060), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5d7876745f-sk2dr", "timestamp":"2025-01-13 20:46:33.005542223 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.025 [INFO][4992] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.026 [INFO][4992] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.026 [INFO][4992] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.028 [INFO][4992] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.033 [INFO][4992] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.035 [INFO][4992] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.036 [INFO][4992] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.037 [INFO][4992] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.037 [INFO][4992] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.038 [INFO][4992] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6 Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.039 [INFO][4992] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.042 [INFO][4992] 
ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.042 [INFO][4992] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" host="localhost" Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.042 [INFO][4992] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:46:33.066736 containerd[1547]: 2025-01-13 20:46:33.042 [INFO][4992] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" HandleID="k8s-pod-network.7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Workload="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" Jan 13 20:46:33.069610 containerd[1547]: 2025-01-13 20:46:33.043 [INFO][4926] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-sk2dr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0", GenerateName:"calico-apiserver-5d7876745f-", Namespace:"calico-apiserver", SelfLink:"", UID:"662f5a4d-917d-45ec-97a1-d70b9c8e2f05", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"5d7876745f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5d7876745f-sk2dr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6e99632cdb6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.069610 containerd[1547]: 2025-01-13 20:46:33.043 [INFO][4926] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-sk2dr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" Jan 13 20:46:33.069610 containerd[1547]: 2025-01-13 20:46:33.043 [INFO][4926] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6e99632cdb6 ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-sk2dr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" Jan 13 20:46:33.069610 containerd[1547]: 2025-01-13 20:46:33.052 [INFO][4926] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-sk2dr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" Jan 13 20:46:33.069610 containerd[1547]: 2025-01-13 
20:46:33.052 [INFO][4926] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-sk2dr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0", GenerateName:"calico-apiserver-5d7876745f-", Namespace:"calico-apiserver", SelfLink:"", UID:"662f5a4d-917d-45ec-97a1-d70b9c8e2f05", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d7876745f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6", Pod:"calico-apiserver-5d7876745f-sk2dr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6e99632cdb6", MAC:"9e:3d:89:3b:c9:c8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.069610 containerd[1547]: 2025-01-13 20:46:33.063 [INFO][4926] cni-plugin/k8s.go 500: Wrote updated endpoint 
to datastore ContainerID="7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-sk2dr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--sk2dr-eth0" Jan 13 20:46:33.073727 systemd-networkd[1457]: calibc608138932: Link UP Jan 13 20:46:33.073820 systemd-networkd[1457]: calibc608138932: Gained carrier Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:32.789 [INFO][4932] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:32.806 [INFO][4932] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--76f75df574--nx8p5-eth0 coredns-76f75df574- kube-system 6ab74c68-2020-4618-bcbe-672227cc6fc9 670 0 2025-01-13 20:46:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-76f75df574-nx8p5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibc608138932 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Namespace="kube-system" Pod="coredns-76f75df574-nx8p5" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--nx8p5-" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:32.806 [INFO][4932] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Namespace="kube-system" Pod="coredns-76f75df574-nx8p5" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--nx8p5-eth0" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.005 [INFO][4991] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" 
HandleID="k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Workload="localhost-k8s-coredns--76f75df574--nx8p5-eth0" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.026 [INFO][4991] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" HandleID="k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Workload="localhost-k8s-coredns--76f75df574--nx8p5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319430), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-76f75df574-nx8p5", "timestamp":"2025-01-13 20:46:33.005872842 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.026 [INFO][4991] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.042 [INFO][4991] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.042 [INFO][4991] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.043 [INFO][4991] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.045 [INFO][4991] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.047 [INFO][4991] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.048 [INFO][4991] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.050 [INFO][4991] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.050 [INFO][4991] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.053 [INFO][4991] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11 Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.059 [INFO][4991] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.062 [INFO][4991] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.062 [INFO][4991] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" host="localhost" Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.063 [INFO][4991] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:46:33.082900 containerd[1547]: 2025-01-13 20:46:33.063 [INFO][4991] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" HandleID="k8s-pod-network.d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Workload="localhost-k8s-coredns--76f75df574--nx8p5-eth0" Jan 13 20:46:33.083323 containerd[1547]: 2025-01-13 20:46:33.067 [INFO][4932] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Namespace="kube-system" Pod="coredns-76f75df574-nx8p5" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--nx8p5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--nx8p5-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"6ab74c68-2020-4618-bcbe-672227cc6fc9", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-76f75df574-nx8p5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibc608138932", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.083323 containerd[1547]: 2025-01-13 20:46:33.068 [INFO][4932] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Namespace="kube-system" Pod="coredns-76f75df574-nx8p5" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--nx8p5-eth0" Jan 13 20:46:33.083323 containerd[1547]: 2025-01-13 20:46:33.068 [INFO][4932] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibc608138932 ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Namespace="kube-system" Pod="coredns-76f75df574-nx8p5" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--nx8p5-eth0" Jan 13 20:46:33.083323 containerd[1547]: 2025-01-13 20:46:33.069 [INFO][4932] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Namespace="kube-system" Pod="coredns-76f75df574-nx8p5" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--nx8p5-eth0" Jan 13 
20:46:33.083323 containerd[1547]: 2025-01-13 20:46:33.069 [INFO][4932] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Namespace="kube-system" Pod="coredns-76f75df574-nx8p5" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--nx8p5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--nx8p5-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"6ab74c68-2020-4618-bcbe-672227cc6fc9", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11", Pod:"coredns-76f75df574-nx8p5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibc608138932", MAC:"b2:a8:2a:44:74:cf", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.083323 containerd[1547]: 2025-01-13 20:46:33.080 [INFO][4932] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11" Namespace="kube-system" Pod="coredns-76f75df574-nx8p5" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--nx8p5-eth0" Jan 13 20:46:33.111212 containerd[1547]: time="2025-01-13T20:46:33.110931305Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:46:33.111212 containerd[1547]: time="2025-01-13T20:46:33.110995790Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:46:33.111212 containerd[1547]: time="2025-01-13T20:46:33.111007110Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.111871 containerd[1547]: time="2025-01-13T20:46:33.111082261Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.116306 systemd-networkd[1457]: cali22854092921: Link UP Jan 13 20:46:33.116774 systemd-networkd[1457]: cali22854092921: Gained carrier Jan 13 20:46:33.131566 containerd[1547]: time="2025-01-13T20:46:33.129155184Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:46:33.131566 containerd[1547]: time="2025-01-13T20:46:33.130256014Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:46:33.131566 containerd[1547]: time="2025-01-13T20:46:33.130274257Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:32.780 [INFO][4956] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:32.801 [INFO][4956] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--76f75df574--75b2t-eth0 coredns-76f75df574- kube-system 45d20c01-0698-463e-b647-27ec83c8d824 673 0 2025-01-13 20:46:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-76f75df574-75b2t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali22854092921 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Namespace="kube-system" Pod="coredns-76f75df574-75b2t" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--75b2t-" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:32.801 [INFO][4956] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Namespace="kube-system" Pod="coredns-76f75df574-75b2t" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--75b2t-eth0" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.005 [INFO][4990] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" HandleID="k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" 
Workload="localhost-k8s-coredns--76f75df574--75b2t-eth0" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.025 [INFO][4990] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" HandleID="k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Workload="localhost-k8s-coredns--76f75df574--75b2t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003610d0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-76f75df574-75b2t", "timestamp":"2025-01-13 20:46:33.005909298 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.025 [INFO][4990] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.063 [INFO][4990] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.063 [INFO][4990] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.067 [INFO][4990] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.077 [INFO][4990] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.088 [INFO][4990] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.093 [INFO][4990] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.095 [INFO][4990] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.095 [INFO][4990] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.096 [INFO][4990] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.099 [INFO][4990] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.105 [INFO][4990] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.105 [INFO][4990] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" host="localhost" Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.105 [INFO][4990] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:46:33.132645 containerd[1547]: 2025-01-13 20:46:33.105 [INFO][4990] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" HandleID="k8s-pod-network.5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Workload="localhost-k8s-coredns--76f75df574--75b2t-eth0" Jan 13 20:46:33.133124 containerd[1547]: 2025-01-13 20:46:33.113 [INFO][4956] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Namespace="kube-system" Pod="coredns-76f75df574-75b2t" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--75b2t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--75b2t-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"45d20c01-0698-463e-b647-27ec83c8d824", ResourceVersion:"673", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-76f75df574-75b2t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali22854092921", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.133124 containerd[1547]: 2025-01-13 20:46:33.113 [INFO][4956] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Namespace="kube-system" Pod="coredns-76f75df574-75b2t" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--75b2t-eth0" Jan 13 20:46:33.133124 containerd[1547]: 2025-01-13 20:46:33.113 [INFO][4956] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali22854092921 ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Namespace="kube-system" Pod="coredns-76f75df574-75b2t" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--75b2t-eth0" Jan 13 20:46:33.133124 containerd[1547]: 2025-01-13 20:46:33.117 [INFO][4956] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Namespace="kube-system" Pod="coredns-76f75df574-75b2t" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--75b2t-eth0" Jan 13 
20:46:33.133124 containerd[1547]: 2025-01-13 20:46:33.118 [INFO][4956] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Namespace="kube-system" Pod="coredns-76f75df574-75b2t" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--75b2t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--75b2t-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"45d20c01-0698-463e-b647-27ec83c8d824", ResourceVersion:"673", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a", Pod:"coredns-76f75df574-75b2t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali22854092921", MAC:"46:9c:4c:e6:e0:97", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.133124 containerd[1547]: 2025-01-13 20:46:33.127 [INFO][4956] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a" Namespace="kube-system" Pod="coredns-76f75df574-75b2t" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--75b2t-eth0" Jan 13 20:46:33.134161 containerd[1547]: time="2025-01-13T20:46:33.134100224Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.141538 systemd[1]: Started cri-containerd-d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11.scope - libcontainer container d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11. Jan 13 20:46:33.160218 systemd-networkd[1457]: cali42f9f6f3247: Link UP Jan 13 20:46:33.160874 systemd-networkd[1457]: cali42f9f6f3247: Gained carrier Jan 13 20:46:33.161448 systemd[1]: Started cri-containerd-7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6.scope - libcontainer container 7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6. 
Jan 13 20:46:33.173459 systemd-resolved[1458]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:32.756 [INFO][4946] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:32.780 [INFO][4946] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0 calico-kube-controllers-66f4bb6d79- calico-system 0016e978-2a6d-4be0-ad22-af8c555426bb 674 0 2025-01-13 20:46:16 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:66f4bb6d79 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-66f4bb6d79-zhjcz eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali42f9f6f3247 [] []}} ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Namespace="calico-system" Pod="calico-kube-controllers-66f4bb6d79-zhjcz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:32.780 [INFO][4946] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Namespace="calico-system" Pod="calico-kube-controllers-66f4bb6d79-zhjcz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.005 [INFO][4989] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" 
HandleID="k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Workload="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.026 [INFO][4989] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" HandleID="k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Workload="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000304a60), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-66f4bb6d79-zhjcz", "timestamp":"2025-01-13 20:46:33.005946354 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.026 [INFO][4989] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.106 [INFO][4989] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.106 [INFO][4989] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.112 [INFO][4989] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.117 [INFO][4989] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.124 [INFO][4989] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.127 [INFO][4989] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.132 [INFO][4989] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.132 [INFO][4989] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.134 [INFO][4989] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.140 [INFO][4989] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.147 [INFO][4989] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.147 [INFO][4989] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" host="localhost" Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.147 [INFO][4989] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:46:33.177155 containerd[1547]: 2025-01-13 20:46:33.148 [INFO][4989] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" HandleID="k8s-pod-network.762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Workload="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" Jan 13 20:46:33.179372 containerd[1547]: 2025-01-13 20:46:33.151 [INFO][4946] cni-plugin/k8s.go 386: Populated endpoint ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Namespace="calico-system" Pod="calico-kube-controllers-66f4bb6d79-zhjcz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0", GenerateName:"calico-kube-controllers-66f4bb6d79-", Namespace:"calico-system", SelfLink:"", UID:"0016e978-2a6d-4be0-ad22-af8c555426bb", ResourceVersion:"674", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66f4bb6d79", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-66f4bb6d79-zhjcz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali42f9f6f3247", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.179372 containerd[1547]: 2025-01-13 20:46:33.151 [INFO][4946] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Namespace="calico-system" Pod="calico-kube-controllers-66f4bb6d79-zhjcz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" Jan 13 20:46:33.179372 containerd[1547]: 2025-01-13 20:46:33.152 [INFO][4946] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali42f9f6f3247 ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Namespace="calico-system" Pod="calico-kube-controllers-66f4bb6d79-zhjcz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" Jan 13 20:46:33.179372 containerd[1547]: 2025-01-13 20:46:33.161 [INFO][4946] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Namespace="calico-system" Pod="calico-kube-controllers-66f4bb6d79-zhjcz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" Jan 13 20:46:33.179372 containerd[1547]: 2025-01-13 20:46:33.162 [INFO][4946] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Namespace="calico-system" Pod="calico-kube-controllers-66f4bb6d79-zhjcz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0", GenerateName:"calico-kube-controllers-66f4bb6d79-", Namespace:"calico-system", SelfLink:"", UID:"0016e978-2a6d-4be0-ad22-af8c555426bb", ResourceVersion:"674", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66f4bb6d79", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b", Pod:"calico-kube-controllers-66f4bb6d79-zhjcz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali42f9f6f3247", MAC:"be:9c:43:35:ed:c2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.179372 containerd[1547]: 2025-01-13 20:46:33.172 [INFO][4946] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b" Namespace="calico-system" Pod="calico-kube-controllers-66f4bb6d79-zhjcz" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--66f4bb6d79--zhjcz-eth0" Jan 13 20:46:33.181510 containerd[1547]: time="2025-01-13T20:46:33.181437690Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:46:33.181510 containerd[1547]: time="2025-01-13T20:46:33.181476975Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:46:33.181942 containerd[1547]: time="2025-01-13T20:46:33.181682666Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.182234 containerd[1547]: time="2025-01-13T20:46:33.182081504Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.205472 systemd[1]: Started cri-containerd-5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a.scope - libcontainer container 5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a. Jan 13 20:46:33.208771 containerd[1547]: time="2025-01-13T20:46:33.205868019Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:46:33.208771 containerd[1547]: time="2025-01-13T20:46:33.206388762Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:46:33.208771 containerd[1547]: time="2025-01-13T20:46:33.206400165Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.208771 containerd[1547]: time="2025-01-13T20:46:33.206575115Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.216724 systemd-networkd[1457]: cali501e1353892: Link UP Jan 13 20:46:33.217519 systemd-networkd[1457]: cali501e1353892: Gained carrier Jan 13 20:46:33.240413 systemd-resolved[1458]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:46:33.240968 systemd-resolved[1458]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:32.740 [INFO][4916] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:32.777 [INFO][4916] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0 calico-apiserver-5d7876745f- calico-apiserver 6953b35c-8169-44aa-91cb-7dd3f8f9aade 672 0 2025-01-13 20:46:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d7876745f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5d7876745f-vvftf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali501e1353892 [] []}} ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-vvftf" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--vvftf-" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:32.777 [INFO][4916] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-vvftf" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.005 [INFO][4988] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" HandleID="k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Workload="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.025 [INFO][4988] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" HandleID="k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Workload="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000284d90), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5d7876745f-vvftf", "timestamp":"2025-01-13 20:46:33.005687102 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.026 [INFO][4988] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.148 [INFO][4988] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.148 [INFO][4988] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.151 [INFO][4988] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.157 [INFO][4988] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.166 [INFO][4988] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.169 [INFO][4988] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.172 [INFO][4988] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.172 [INFO][4988] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.177 [INFO][4988] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49 Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.183 [INFO][4988] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.191 [INFO][4988] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.191 [INFO][4988] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" host="localhost" Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.191 [INFO][4988] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:46:33.242169 containerd[1547]: 2025-01-13 20:46:33.191 [INFO][4988] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" HandleID="k8s-pod-network.0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Workload="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" Jan 13 20:46:33.242728 containerd[1547]: 2025-01-13 20:46:33.207 [INFO][4916] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-vvftf" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0", GenerateName:"calico-apiserver-5d7876745f-", Namespace:"calico-apiserver", SelfLink:"", UID:"6953b35c-8169-44aa-91cb-7dd3f8f9aade", ResourceVersion:"672", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d7876745f", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5d7876745f-vvftf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali501e1353892", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.242728 containerd[1547]: 2025-01-13 20:46:33.208 [INFO][4916] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-vvftf" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" Jan 13 20:46:33.242728 containerd[1547]: 2025-01-13 20:46:33.208 [INFO][4916] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali501e1353892 ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-vvftf" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" Jan 13 20:46:33.242728 containerd[1547]: 2025-01-13 20:46:33.217 [INFO][4916] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-vvftf" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" Jan 13 20:46:33.242728 containerd[1547]: 2025-01-13 20:46:33.217 [INFO][4916] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-vvftf" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0", GenerateName:"calico-apiserver-5d7876745f-", Namespace:"calico-apiserver", SelfLink:"", UID:"6953b35c-8169-44aa-91cb-7dd3f8f9aade", ResourceVersion:"672", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d7876745f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49", Pod:"calico-apiserver-5d7876745f-vvftf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali501e1353892", MAC:"86:bc:51:61:9d:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.242728 containerd[1547]: 2025-01-13 20:46:33.232 [INFO][4916] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49" Namespace="calico-apiserver" Pod="calico-apiserver-5d7876745f-vvftf" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d7876745f--vvftf-eth0" Jan 13 20:46:33.243615 systemd[1]: Started cri-containerd-762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b.scope - libcontainer container 762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b. Jan 13 20:46:33.247830 containerd[1547]: time="2025-01-13T20:46:33.247559259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-nx8p5,Uid:6ab74c68-2020-4618-bcbe-672227cc6fc9,Namespace:kube-system,Attempt:7,} returns sandbox id \"d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11\"" Jan 13 20:46:33.268424 containerd[1547]: time="2025-01-13T20:46:33.268399614Z" level=info msg="CreateContainer within sandbox \"d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:46:33.274956 containerd[1547]: time="2025-01-13T20:46:33.274931900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-75b2t,Uid:45d20c01-0698-463e-b647-27ec83c8d824,Namespace:kube-system,Attempt:7,} returns sandbox id \"5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a\"" Jan 13 20:46:33.279851 containerd[1547]: time="2025-01-13T20:46:33.279764040Z" level=info msg="CreateContainer within sandbox \"5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:46:33.286501 systemd-resolved[1458]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:46:33.291114 systemd-networkd[1457]: caliccc91d2ea55: Link UP Jan 13 20:46:33.291438 systemd-networkd[1457]: caliccc91d2ea55: Gained carrier Jan 13 20:46:33.364985 containerd[1547]: time="2025-01-13T20:46:33.319909977Z" level=info 
msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:46:33.364985 containerd[1547]: time="2025-01-13T20:46:33.319941023Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:46:33.364985 containerd[1547]: time="2025-01-13T20:46:33.319951311Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.364985 containerd[1547]: time="2025-01-13T20:46:33.320001546Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.364985 containerd[1547]: time="2025-01-13T20:46:33.331080172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-sk2dr,Uid:662f5a4d-917d-45ec-97a1-d70b9c8e2f05,Namespace:calico-apiserver,Attempt:7,} returns sandbox id \"7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6\"" Jan 13 20:46:33.364985 containerd[1547]: time="2025-01-13T20:46:33.335530210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:32.794 [INFO][4969] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:32.806 [INFO][4969] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--cj9xm-eth0 csi-node-driver- calico-system 7af6a31b-5e31-40c1-b6a8-196414f83e54 596 0 2025-01-13 20:46:16 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b695c467 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] 
[]} {k8s localhost csi-node-driver-cj9xm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliccc91d2ea55 [] []}} ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Namespace="calico-system" Pod="csi-node-driver-cj9xm" WorkloadEndpoint="localhost-k8s-csi--node--driver--cj9xm-" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:32.806 [INFO][4969] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Namespace="calico-system" Pod="csi-node-driver-cj9xm" WorkloadEndpoint="localhost-k8s-csi--node--driver--cj9xm-eth0" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.005 [INFO][4993] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" HandleID="k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Workload="localhost-k8s-csi--node--driver--cj9xm-eth0" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.029 [INFO][4993] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" HandleID="k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Workload="localhost-k8s-csi--node--driver--cj9xm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002f5ec0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-cj9xm", "timestamp":"2025-01-13 20:46:33.005591526 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.029 [INFO][4993] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.191 [INFO][4993] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.191 [INFO][4993] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.194 [INFO][4993] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.205 [INFO][4993] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.211 [INFO][4993] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.219 [INFO][4993] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.245 [INFO][4993] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.246 [INFO][4993] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.253 [INFO][4993] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514 Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.258 [INFO][4993] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.277 [INFO][4993] 
ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.277 [INFO][4993] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" host="localhost" Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.278 [INFO][4993] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:46:33.367120 containerd[1547]: 2025-01-13 20:46:33.280 [INFO][4993] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" HandleID="k8s-pod-network.43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Workload="localhost-k8s-csi--node--driver--cj9xm-eth0" Jan 13 20:46:33.369284 containerd[1547]: 2025-01-13 20:46:33.286 [INFO][4969] cni-plugin/k8s.go 386: Populated endpoint ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Namespace="calico-system" Pod="csi-node-driver-cj9xm" WorkloadEndpoint="localhost-k8s-csi--node--driver--cj9xm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--cj9xm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7af6a31b-5e31-40c1-b6a8-196414f83e54", ResourceVersion:"596", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-cj9xm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliccc91d2ea55", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.369284 containerd[1547]: 2025-01-13 20:46:33.288 [INFO][4969] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Namespace="calico-system" Pod="csi-node-driver-cj9xm" WorkloadEndpoint="localhost-k8s-csi--node--driver--cj9xm-eth0" Jan 13 20:46:33.369284 containerd[1547]: 2025-01-13 20:46:33.288 [INFO][4969] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliccc91d2ea55 ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Namespace="calico-system" Pod="csi-node-driver-cj9xm" WorkloadEndpoint="localhost-k8s-csi--node--driver--cj9xm-eth0" Jan 13 20:46:33.369284 containerd[1547]: 2025-01-13 20:46:33.294 [INFO][4969] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Namespace="calico-system" Pod="csi-node-driver-cj9xm" WorkloadEndpoint="localhost-k8s-csi--node--driver--cj9xm-eth0" Jan 13 20:46:33.369284 containerd[1547]: 2025-01-13 20:46:33.295 [INFO][4969] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Namespace="calico-system" Pod="csi-node-driver-cj9xm" WorkloadEndpoint="localhost-k8s-csi--node--driver--cj9xm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--cj9xm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7af6a31b-5e31-40c1-b6a8-196414f83e54", ResourceVersion:"596", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 46, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514", Pod:"csi-node-driver-cj9xm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliccc91d2ea55", MAC:"ba:31:c0:40:38:2c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:46:33.369284 containerd[1547]: 2025-01-13 20:46:33.322 [INFO][4969] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514" Namespace="calico-system" Pod="csi-node-driver-cj9xm" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--cj9xm-eth0" Jan 13 20:46:33.387526 containerd[1547]: time="2025-01-13T20:46:33.387498223Z" level=info msg="CreateContainer within sandbox \"5cfad9fcd9b9269a4693540503259ab455315be9641b0dfc92349291ffbe8e5a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a85197ae6fc46fb46a3b82a760b454f4bba947d952a85a51ad2dee975fa9691a\"" Jan 13 20:46:33.392049 containerd[1547]: time="2025-01-13T20:46:33.392027237Z" level=info msg="StartContainer for \"a85197ae6fc46fb46a3b82a760b454f4bba947d952a85a51ad2dee975fa9691a\"" Jan 13 20:46:33.396407 containerd[1547]: time="2025-01-13T20:46:33.396377405Z" level=info msg="CreateContainer within sandbox \"d5a3f4ec0c43810a58a61b064b634723278ff5e2cc54e86ec1b10a97aef90b11\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"713a02edcf4f155679f8d3b2c931c06b8bf06fb9d6c19b42d7b481095d2f08ba\"" Jan 13 20:46:33.398449 systemd[1]: Started cri-containerd-0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49.scope - libcontainer container 0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49. Jan 13 20:46:33.401897 containerd[1547]: time="2025-01-13T20:46:33.401872415Z" level=info msg="StartContainer for \"713a02edcf4f155679f8d3b2c931c06b8bf06fb9d6c19b42d7b481095d2f08ba\"" Jan 13 20:46:33.411490 containerd[1547]: time="2025-01-13T20:46:33.411459456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66f4bb6d79-zhjcz,Uid:0016e978-2a6d-4be0-ad22-af8c555426bb,Namespace:calico-system,Attempt:7,} returns sandbox id \"762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b\"" Jan 13 20:46:33.452440 systemd[1]: Started cri-containerd-a85197ae6fc46fb46a3b82a760b454f4bba947d952a85a51ad2dee975fa9691a.scope - libcontainer container a85197ae6fc46fb46a3b82a760b454f4bba947d952a85a51ad2dee975fa9691a. 
Jan 13 20:46:33.465877 systemd[1]: Started cri-containerd-713a02edcf4f155679f8d3b2c931c06b8bf06fb9d6c19b42d7b481095d2f08ba.scope - libcontainer container 713a02edcf4f155679f8d3b2c931c06b8bf06fb9d6c19b42d7b481095d2f08ba. Jan 13 20:46:33.470281 containerd[1547]: time="2025-01-13T20:46:33.470094200Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:46:33.470281 containerd[1547]: time="2025-01-13T20:46:33.470137386Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:46:33.470281 containerd[1547]: time="2025-01-13T20:46:33.470147210Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.470281 containerd[1547]: time="2025-01-13T20:46:33.470202684Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:46:33.478964 systemd-resolved[1458]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:46:33.497452 systemd[1]: Started cri-containerd-43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514.scope - libcontainer container 43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514. 
Jan 13 20:46:33.506686 systemd-resolved[1458]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:46:33.519033 containerd[1547]: time="2025-01-13T20:46:33.518962600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cj9xm,Uid:7af6a31b-5e31-40c1-b6a8-196414f83e54,Namespace:calico-system,Attempt:7,} returns sandbox id \"43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514\"" Jan 13 20:46:33.544560 containerd[1547]: time="2025-01-13T20:46:33.544540203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d7876745f-vvftf,Uid:6953b35c-8169-44aa-91cb-7dd3f8f9aade,Namespace:calico-apiserver,Attempt:7,} returns sandbox id \"0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49\"" Jan 13 20:46:33.560049 containerd[1547]: time="2025-01-13T20:46:33.560019888Z" level=info msg="StartContainer for \"a85197ae6fc46fb46a3b82a760b454f4bba947d952a85a51ad2dee975fa9691a\" returns successfully" Jan 13 20:46:33.560149 containerd[1547]: time="2025-01-13T20:46:33.560019843Z" level=info msg="StartContainer for \"713a02edcf4f155679f8d3b2c931c06b8bf06fb9d6c19b42d7b481095d2f08ba\" returns successfully" Jan 13 20:46:33.602020 systemd[1]: run-netns-cni\x2df856e5f9\x2dec9a\x2df418\x2dbc1b\x2dd549eed455b8.mount: Deactivated successfully. Jan 13 20:46:33.602563 systemd[1]: run-netns-cni\x2d4a6bfa79\x2d9ae7\x2d0304\x2d6d03\x2d7cc995807693.mount: Deactivated successfully. 
Jan 13 20:46:33.688355 kernel: bpftool[5532]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 13 20:46:33.756012 kubelet[2851]: I0113 20:46:33.755990 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-nx8p5" podStartSLOduration=22.755961566 podStartE2EDuration="22.755961566s" podCreationTimestamp="2025-01-13 20:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:46:33.750041917 +0000 UTC m=+37.518979108" watchObservedRunningTime="2025-01-13 20:46:33.755961566 +0000 UTC m=+37.524898752" Jan 13 20:46:33.870540 systemd-networkd[1457]: vxlan.calico: Link UP Jan 13 20:46:33.870545 systemd-networkd[1457]: vxlan.calico: Gained carrier Jan 13 20:46:34.550588 systemd-networkd[1457]: cali22854092921: Gained IPv6LL Jan 13 20:46:34.742506 systemd-networkd[1457]: cali6e99632cdb6: Gained IPv6LL Jan 13 20:46:34.757437 kubelet[2851]: I0113 20:46:34.756997 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-75b2t" podStartSLOduration=23.756969414 podStartE2EDuration="23.756969414s" podCreationTimestamp="2025-01-13 20:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:46:33.75635729 +0000 UTC m=+37.525294473" watchObservedRunningTime="2025-01-13 20:46:34.756969414 +0000 UTC m=+38.525906606" Jan 13 20:46:34.934507 systemd-networkd[1457]: cali42f9f6f3247: Gained IPv6LL Jan 13 20:46:34.998479 systemd-networkd[1457]: caliccc91d2ea55: Gained IPv6LL Jan 13 20:46:35.062682 systemd-networkd[1457]: calibc608138932: Gained IPv6LL Jan 13 20:46:35.126456 systemd-networkd[1457]: cali501e1353892: Gained IPv6LL Jan 13 20:46:35.766474 systemd-networkd[1457]: vxlan.calico: Gained IPv6LL Jan 13 20:46:35.769239 containerd[1547]: time="2025-01-13T20:46:35.768314531Z" 
level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:35.769239 containerd[1547]: time="2025-01-13T20:46:35.768702597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 13 20:46:35.769239 containerd[1547]: time="2025-01-13T20:46:35.768788710Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:35.770874 containerd[1547]: time="2025-01-13T20:46:35.770847485Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:35.771421 containerd[1547]: time="2025-01-13T20:46:35.771329325Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.435782456s" Jan 13 20:46:35.771421 containerd[1547]: time="2025-01-13T20:46:35.771363073Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 20:46:35.771874 containerd[1547]: time="2025-01-13T20:46:35.771857971Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 13 20:46:35.772784 containerd[1547]: time="2025-01-13T20:46:35.772568313Z" level=info msg="CreateContainer within sandbox \"7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6\" for container 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 20:46:35.789080 containerd[1547]: time="2025-01-13T20:46:35.789034424Z" level=info msg="CreateContainer within sandbox \"7329bed0f7fd078a922eb3f6b2bf0469fd413630c266d1128fca31d1b14bb8a6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"35ab90f6e249f26700fb3212173bcdb7f67826ab685d704931ee03d4fa80d360\"" Jan 13 20:46:35.790286 containerd[1547]: time="2025-01-13T20:46:35.789391229Z" level=info msg="StartContainer for \"35ab90f6e249f26700fb3212173bcdb7f67826ab685d704931ee03d4fa80d360\"" Jan 13 20:46:35.820875 systemd[1]: run-containerd-runc-k8s.io-35ab90f6e249f26700fb3212173bcdb7f67826ab685d704931ee03d4fa80d360-runc.xDDP1o.mount: Deactivated successfully. Jan 13 20:46:35.826591 systemd[1]: Started cri-containerd-35ab90f6e249f26700fb3212173bcdb7f67826ab685d704931ee03d4fa80d360.scope - libcontainer container 35ab90f6e249f26700fb3212173bcdb7f67826ab685d704931ee03d4fa80d360. Jan 13 20:46:35.869401 containerd[1547]: time="2025-01-13T20:46:35.869374257Z" level=info msg="StartContainer for \"35ab90f6e249f26700fb3212173bcdb7f67826ab685d704931ee03d4fa80d360\" returns successfully" Jan 13 20:46:36.765289 kubelet[2851]: I0113 20:46:36.765263 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5d7876745f-sk2dr" podStartSLOduration=18.327244351 podStartE2EDuration="20.763458803s" podCreationTimestamp="2025-01-13 20:46:16 +0000 UTC" firstStartedPulling="2025-01-13 20:46:33.335388318 +0000 UTC m=+37.104325500" lastFinishedPulling="2025-01-13 20:46:35.771602769 +0000 UTC m=+39.540539952" observedRunningTime="2025-01-13 20:46:36.763001501 +0000 UTC m=+40.531938700" watchObservedRunningTime="2025-01-13 20:46:36.763458803 +0000 UTC m=+40.532395988" Jan 13 20:46:37.747807 containerd[1547]: time="2025-01-13T20:46:37.747779709Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:37.748586 containerd[1547]: time="2025-01-13T20:46:37.748565852Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 13 20:46:37.748834 containerd[1547]: time="2025-01-13T20:46:37.748811786Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:37.755721 containerd[1547]: time="2025-01-13T20:46:37.755705217Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:37.756284 containerd[1547]: time="2025-01-13T20:46:37.756194076Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 1.984318715s" Jan 13 20:46:37.756284 containerd[1547]: time="2025-01-13T20:46:37.756208725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 13 20:46:37.756672 containerd[1547]: time="2025-01-13T20:46:37.756661615Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 13 20:46:37.762374 kubelet[2851]: I0113 20:46:37.762354 2851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:46:37.776126 containerd[1547]: time="2025-01-13T20:46:37.776097308Z" level=info msg="CreateContainer within sandbox 
\"762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 13 20:46:37.782558 containerd[1547]: time="2025-01-13T20:46:37.782539807Z" level=info msg="CreateContainer within sandbox \"762a125e39f7263ce46f2c0aeca11c0775a5f29386e71e5d1d0fb35b9a65253b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9a2be02f98c8f4fb7e58fa074c9f5080f07f563b244bbc801e2ad67aa81d0b18\"" Jan 13 20:46:37.783436 containerd[1547]: time="2025-01-13T20:46:37.783391676Z" level=info msg="StartContainer for \"9a2be02f98c8f4fb7e58fa074c9f5080f07f563b244bbc801e2ad67aa81d0b18\"" Jan 13 20:46:37.809422 systemd[1]: Started cri-containerd-9a2be02f98c8f4fb7e58fa074c9f5080f07f563b244bbc801e2ad67aa81d0b18.scope - libcontainer container 9a2be02f98c8f4fb7e58fa074c9f5080f07f563b244bbc801e2ad67aa81d0b18. Jan 13 20:46:37.836115 containerd[1547]: time="2025-01-13T20:46:37.836086015Z" level=info msg="StartContainer for \"9a2be02f98c8f4fb7e58fa074c9f5080f07f563b244bbc801e2ad67aa81d0b18\" returns successfully" Jan 13 20:46:39.220010 containerd[1547]: time="2025-01-13T20:46:39.219977560Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:39.220788 containerd[1547]: time="2025-01-13T20:46:39.220770673Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 13 20:46:39.221043 containerd[1547]: time="2025-01-13T20:46:39.221031860Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:39.230123 containerd[1547]: time="2025-01-13T20:46:39.230101046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:39.230648 containerd[1547]: time="2025-01-13T20:46:39.230510867Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.473794358s" Jan 13 20:46:39.230648 containerd[1547]: time="2025-01-13T20:46:39.230528308Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 13 20:46:39.231071 containerd[1547]: time="2025-01-13T20:46:39.231059483Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:46:39.236168 containerd[1547]: time="2025-01-13T20:46:39.236147556Z" level=info msg="CreateContainer within sandbox \"43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 13 20:46:39.301296 containerd[1547]: time="2025-01-13T20:46:39.301263625Z" level=info msg="CreateContainer within sandbox \"43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e95ed0ee6b87307924705a38766f6ff3ba1db86d9dfa0af79c20e219d1972864\"" Jan 13 20:46:39.302411 containerd[1547]: time="2025-01-13T20:46:39.301760404Z" level=info msg="StartContainer for \"e95ed0ee6b87307924705a38766f6ff3ba1db86d9dfa0af79c20e219d1972864\"" Jan 13 20:46:39.324574 systemd[1]: Started cri-containerd-e95ed0ee6b87307924705a38766f6ff3ba1db86d9dfa0af79c20e219d1972864.scope - libcontainer container e95ed0ee6b87307924705a38766f6ff3ba1db86d9dfa0af79c20e219d1972864. 
Jan 13 20:46:39.342874 containerd[1547]: time="2025-01-13T20:46:39.342833374Z" level=info msg="StartContainer for \"e95ed0ee6b87307924705a38766f6ff3ba1db86d9dfa0af79c20e219d1972864\" returns successfully" Jan 13 20:46:39.667206 containerd[1547]: time="2025-01-13T20:46:39.667097727Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:39.667548 containerd[1547]: time="2025-01-13T20:46:39.667516362Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 13 20:46:39.669477 containerd[1547]: time="2025-01-13T20:46:39.669454252Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 438.379395ms" Jan 13 20:46:39.669538 containerd[1547]: time="2025-01-13T20:46:39.669479671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 20:46:39.670121 containerd[1547]: time="2025-01-13T20:46:39.669885090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 13 20:46:39.677593 containerd[1547]: time="2025-01-13T20:46:39.677458936Z" level=info msg="CreateContainer within sandbox \"0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 20:46:39.691832 containerd[1547]: time="2025-01-13T20:46:39.691801410Z" level=info msg="CreateContainer within sandbox \"0baefae873c7457961dc22aba98aa00b7e036d7f3f571d13eadaa29a29334e49\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"69a71334e712854a7c7c21fe20b46622557f6d9367876b5ab761b57e6e3395a7\"" Jan 13 20:46:39.694295 containerd[1547]: time="2025-01-13T20:46:39.692021438Z" level=info msg="StartContainer for \"69a71334e712854a7c7c21fe20b46622557f6d9367876b5ab761b57e6e3395a7\"" Jan 13 20:46:39.710424 systemd[1]: Started cri-containerd-69a71334e712854a7c7c21fe20b46622557f6d9367876b5ab761b57e6e3395a7.scope - libcontainer container 69a71334e712854a7c7c21fe20b46622557f6d9367876b5ab761b57e6e3395a7. Jan 13 20:46:39.737683 containerd[1547]: time="2025-01-13T20:46:39.737626327Z" level=info msg="StartContainer for \"69a71334e712854a7c7c21fe20b46622557f6d9367876b5ab761b57e6e3395a7\" returns successfully" Jan 13 20:46:39.778875 kubelet[2851]: I0113 20:46:39.778846 2851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:46:39.786209 kubelet[2851]: I0113 20:46:39.786191 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5d7876745f-vvftf" podStartSLOduration=17.663251248999998 podStartE2EDuration="23.786167799s" podCreationTimestamp="2025-01-13 20:46:16 +0000 UTC" firstStartedPulling="2025-01-13 20:46:33.546804084 +0000 UTC m=+37.315741266" lastFinishedPulling="2025-01-13 20:46:39.669720628 +0000 UTC m=+43.438657816" observedRunningTime="2025-01-13 20:46:39.785643533 +0000 UTC m=+43.554580718" watchObservedRunningTime="2025-01-13 20:46:39.786167799 +0000 UTC m=+43.555104985" Jan 13 20:46:39.788390 kubelet[2851]: I0113 20:46:39.787112 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-66f4bb6d79-zhjcz" podStartSLOduration=19.443123719 podStartE2EDuration="23.787085081s" podCreationTimestamp="2025-01-13 20:46:16 +0000 UTC" firstStartedPulling="2025-01-13 20:46:33.412506928 +0000 UTC m=+37.181444110" lastFinishedPulling="2025-01-13 20:46:37.75646829 +0000 UTC m=+41.525405472" 
observedRunningTime="2025-01-13 20:46:38.78198894 +0000 UTC m=+42.550926132" watchObservedRunningTime="2025-01-13 20:46:39.787085081 +0000 UTC m=+43.556022267" Jan 13 20:46:40.781152 kubelet[2851]: I0113 20:46:40.780957 2851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:46:41.408128 kubelet[2851]: I0113 20:46:41.408106 2851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:46:41.733318 containerd[1547]: time="2025-01-13T20:46:41.732849713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:41.733318 containerd[1547]: time="2025-01-13T20:46:41.733217679Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 13 20:46:41.733318 containerd[1547]: time="2025-01-13T20:46:41.733288945Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:41.734640 containerd[1547]: time="2025-01-13T20:46:41.734569703Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:46:41.735005 containerd[1547]: time="2025-01-13T20:46:41.734987887Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.065079095s" Jan 13 20:46:41.735040 containerd[1547]: 
time="2025-01-13T20:46:41.735006489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 13 20:46:41.743577 containerd[1547]: time="2025-01-13T20:46:41.743555938Z" level=info msg="CreateContainer within sandbox \"43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 13 20:46:41.749651 containerd[1547]: time="2025-01-13T20:46:41.749633748Z" level=info msg="CreateContainer within sandbox \"43b2b0ce5f00a5fbe9afa4acdaee0ed8cf6f22a17f5cf7940dd4798efd207514\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"14fbd86f97307d1173a9f1aba8b0401708660f845c594ef476b4b1c425f770e8\"" Jan 13 20:46:41.751149 containerd[1547]: time="2025-01-13T20:46:41.750415821Z" level=info msg="StartContainer for \"14fbd86f97307d1173a9f1aba8b0401708660f845c594ef476b4b1c425f770e8\"" Jan 13 20:46:41.775518 systemd[1]: Started cri-containerd-14fbd86f97307d1173a9f1aba8b0401708660f845c594ef476b4b1c425f770e8.scope - libcontainer container 14fbd86f97307d1173a9f1aba8b0401708660f845c594ef476b4b1c425f770e8. 
Jan 13 20:46:41.796419 containerd[1547]: time="2025-01-13T20:46:41.796394515Z" level=info msg="StartContainer for \"14fbd86f97307d1173a9f1aba8b0401708660f845c594ef476b4b1c425f770e8\" returns successfully" Jan 13 20:46:42.507569 kubelet[2851]: I0113 20:46:42.507548 2851 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 13 20:46:42.512361 kubelet[2851]: I0113 20:46:42.512349 2851 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 13 20:46:42.802293 kubelet[2851]: I0113 20:46:42.802218 2851 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-cj9xm" podStartSLOduration=18.587343575 podStartE2EDuration="26.802190088s" podCreationTimestamp="2025-01-13 20:46:16 +0000 UTC" firstStartedPulling="2025-01-13 20:46:33.520313877 +0000 UTC m=+37.289251059" lastFinishedPulling="2025-01-13 20:46:41.73516039 +0000 UTC m=+45.504097572" observedRunningTime="2025-01-13 20:46:42.80192417 +0000 UTC m=+46.570861371" watchObservedRunningTime="2025-01-13 20:46:42.802190088 +0000 UTC m=+46.571127279" Jan 13 20:46:43.926165 kubelet[2851]: I0113 20:46:43.926006 2851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:46:45.141578 kubelet[2851]: I0113 20:46:45.141548 2851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:46:45.211777 systemd[1]: run-containerd-runc-k8s.io-5889317b66f78c10cf1f9280d374bee8b8e84dc0fa0d5fc336a6c748bbcf0bd1-runc.9GsAFY.mount: Deactivated successfully. 
Jan 13 20:46:56.387370 containerd[1547]: time="2025-01-13T20:46:56.386832140Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:56.387370 containerd[1547]: time="2025-01-13T20:46:56.386910641Z" level=info msg="TearDown network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" successfully" Jan 13 20:46:56.387370 containerd[1547]: time="2025-01-13T20:46:56.386918404Z" level=info msg="StopPodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:56.399223 containerd[1547]: time="2025-01-13T20:46:56.399200746Z" level=info msg="RemovePodSandbox for \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:56.404681 containerd[1547]: time="2025-01-13T20:46:56.404666111Z" level=info msg="Forcibly stopping sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\"" Jan 13 20:46:56.414552 containerd[1547]: time="2025-01-13T20:46:56.404720725Z" level=info msg="TearDown network for sandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" successfully" Jan 13 20:46:56.419237 containerd[1547]: time="2025-01-13T20:46:56.419218408Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.426369 containerd[1547]: time="2025-01-13T20:46:56.426274400Z" level=info msg="RemovePodSandbox \"183abd5efba79e657c3a30001dd0f01ad06f57d1c1a211b14c3a1fd2dad38f87\" returns successfully" Jan 13 20:46:56.426967 containerd[1547]: time="2025-01-13T20:46:56.426701517Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:56.426967 containerd[1547]: time="2025-01-13T20:46:56.426742832Z" level=info msg="TearDown network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" successfully" Jan 13 20:46:56.426967 containerd[1547]: time="2025-01-13T20:46:56.426749010Z" level=info msg="StopPodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" returns successfully" Jan 13 20:46:56.426967 containerd[1547]: time="2025-01-13T20:46:56.426885187Z" level=info msg="RemovePodSandbox for \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:56.426967 containerd[1547]: time="2025-01-13T20:46:56.426896067Z" level=info msg="Forcibly stopping sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\"" Jan 13 20:46:56.427492 containerd[1547]: time="2025-01-13T20:46:56.427099102Z" level=info msg="TearDown network for sandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" successfully" Jan 13 20:46:56.428617 containerd[1547]: time="2025-01-13T20:46:56.428606712Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.428917 containerd[1547]: time="2025-01-13T20:46:56.428763902Z" level=info msg="RemovePodSandbox \"a1973b644884dccaac6bc6d0499ee23046fc43ff700dd7578832ffd1a7c97920\" returns successfully" Jan 13 20:46:56.429004 containerd[1547]: time="2025-01-13T20:46:56.428994860Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" Jan 13 20:46:56.429150 containerd[1547]: time="2025-01-13T20:46:56.429141007Z" level=info msg="TearDown network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" successfully" Jan 13 20:46:56.429304 containerd[1547]: time="2025-01-13T20:46:56.429262760Z" level=info msg="StopPodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" returns successfully" Jan 13 20:46:56.429573 containerd[1547]: time="2025-01-13T20:46:56.429515845Z" level=info msg="RemovePodSandbox for \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" Jan 13 20:46:56.429573 containerd[1547]: time="2025-01-13T20:46:56.429527362Z" level=info msg="Forcibly stopping sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\"" Jan 13 20:46:56.429807 containerd[1547]: time="2025-01-13T20:46:56.429737226Z" level=info msg="TearDown network for sandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" successfully" Jan 13 20:46:56.431379 containerd[1547]: time="2025-01-13T20:46:56.431280679Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.431379 containerd[1547]: time="2025-01-13T20:46:56.431311100Z" level=info msg="RemovePodSandbox \"43344537d363c938e292a9607eee1a5f1072ec085ec7bbc09cd3822ac88579d5\" returns successfully" Jan 13 20:46:56.431879 containerd[1547]: time="2025-01-13T20:46:56.431703854Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\"" Jan 13 20:46:56.431879 containerd[1547]: time="2025-01-13T20:46:56.431747661Z" level=info msg="TearDown network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" successfully" Jan 13 20:46:56.431879 containerd[1547]: time="2025-01-13T20:46:56.431753950Z" level=info msg="StopPodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" returns successfully" Jan 13 20:46:56.431879 containerd[1547]: time="2025-01-13T20:46:56.431857030Z" level=info msg="RemovePodSandbox for \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\"" Jan 13 20:46:56.432571 containerd[1547]: time="2025-01-13T20:46:56.431976636Z" level=info msg="Forcibly stopping sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\"" Jan 13 20:46:56.432571 containerd[1547]: time="2025-01-13T20:46:56.432009850Z" level=info msg="TearDown network for sandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" successfully" Jan 13 20:46:56.434458 containerd[1547]: time="2025-01-13T20:46:56.433132317Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.434458 containerd[1547]: time="2025-01-13T20:46:56.433150534Z" level=info msg="RemovePodSandbox \"ffedff343ae1a751bc4839f975a72abff80afbb520493cfcdc7c1e9129e37117\" returns successfully" Jan 13 20:46:56.434458 containerd[1547]: time="2025-01-13T20:46:56.433276635Z" level=info msg="StopPodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\"" Jan 13 20:46:56.434458 containerd[1547]: time="2025-01-13T20:46:56.433332635Z" level=info msg="TearDown network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" successfully" Jan 13 20:46:56.434458 containerd[1547]: time="2025-01-13T20:46:56.433381919Z" level=info msg="StopPodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" returns successfully" Jan 13 20:46:56.434458 containerd[1547]: time="2025-01-13T20:46:56.433538101Z" level=info msg="RemovePodSandbox for \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\"" Jan 13 20:46:56.434458 containerd[1547]: time="2025-01-13T20:46:56.433551807Z" level=info msg="Forcibly stopping sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\"" Jan 13 20:46:56.434458 containerd[1547]: time="2025-01-13T20:46:56.433582758Z" level=info msg="TearDown network for sandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" successfully" Jan 13 20:46:56.438895 containerd[1547]: time="2025-01-13T20:46:56.434949206Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.438895 containerd[1547]: time="2025-01-13T20:46:56.434978809Z" level=info msg="RemovePodSandbox \"80ceaac677b5428f89d2d5bbe9fa8df884739d2dee540691783cdd6a496f9f45\" returns successfully" Jan 13 20:46:56.438895 containerd[1547]: time="2025-01-13T20:46:56.435140734Z" level=info msg="StopPodSandbox for \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\"" Jan 13 20:46:56.438895 containerd[1547]: time="2025-01-13T20:46:56.435222968Z" level=info msg="TearDown network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\" successfully" Jan 13 20:46:56.438895 containerd[1547]: time="2025-01-13T20:46:56.435230838Z" level=info msg="StopPodSandbox for \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\" returns successfully" Jan 13 20:46:56.438895 containerd[1547]: time="2025-01-13T20:46:56.435393606Z" level=info msg="RemovePodSandbox for \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\"" Jan 13 20:46:56.438895 containerd[1547]: time="2025-01-13T20:46:56.435579050Z" level=info msg="Forcibly stopping sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\"" Jan 13 20:46:56.438895 containerd[1547]: time="2025-01-13T20:46:56.435633215Z" level=info msg="TearDown network for sandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\" successfully" Jan 13 20:46:56.442474 containerd[1547]: time="2025-01-13T20:46:56.441119932Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.442474 containerd[1547]: time="2025-01-13T20:46:56.441157770Z" level=info msg="RemovePodSandbox \"933c25212959aceb6d25fb2dd1ae59533855e8f3f7e5d6eb80dd9ac9ba997ec6\" returns successfully" Jan 13 20:46:56.443132 containerd[1547]: time="2025-01-13T20:46:56.443119225Z" level=info msg="StopPodSandbox for \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\"" Jan 13 20:46:56.446867 containerd[1547]: time="2025-01-13T20:46:56.443252845Z" level=info msg="TearDown network for sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\" successfully" Jan 13 20:46:56.446867 containerd[1547]: time="2025-01-13T20:46:56.446791491Z" level=info msg="StopPodSandbox for \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\" returns successfully" Jan 13 20:46:56.447607 containerd[1547]: time="2025-01-13T20:46:56.446933115Z" level=info msg="RemovePodSandbox for \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\"" Jan 13 20:46:56.447607 containerd[1547]: time="2025-01-13T20:46:56.446944555Z" level=info msg="Forcibly stopping sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\"" Jan 13 20:46:56.447607 containerd[1547]: time="2025-01-13T20:46:56.446975159Z" level=info msg="TearDown network for sandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\" successfully" Jan 13 20:46:56.448582 containerd[1547]: time="2025-01-13T20:46:56.448570057Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.448738 containerd[1547]: time="2025-01-13T20:46:56.448683803Z" level=info msg="RemovePodSandbox \"250efb38e834590ec7388828e0266150a1c75b577290c79ac3b32b8efc78cdbc\" returns successfully" Jan 13 20:46:56.448873 containerd[1547]: time="2025-01-13T20:46:56.448811637Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:56.448873 containerd[1547]: time="2025-01-13T20:46:56.448849948Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:56.448873 containerd[1547]: time="2025-01-13T20:46:56.448856391Z" level=info msg="StopPodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:56.449829 containerd[1547]: time="2025-01-13T20:46:56.449048859Z" level=info msg="RemovePodSandbox for \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:56.449829 containerd[1547]: time="2025-01-13T20:46:56.449066677Z" level=info msg="Forcibly stopping sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\"" Jan 13 20:46:56.449829 containerd[1547]: time="2025-01-13T20:46:56.449096937Z" level=info msg="TearDown network for sandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" successfully" Jan 13 20:46:56.450288 containerd[1547]: time="2025-01-13T20:46:56.450275469Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.450371 containerd[1547]: time="2025-01-13T20:46:56.450360547Z" level=info msg="RemovePodSandbox \"67835d2442823bb7bb94f269a4a48862c4d134380aff2f0cf7697d6d80d8164a\" returns successfully" Jan 13 20:46:56.450612 containerd[1547]: time="2025-01-13T20:46:56.450604050Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:56.450687 containerd[1547]: time="2025-01-13T20:46:56.450678619Z" level=info msg="TearDown network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" successfully" Jan 13 20:46:56.450719 containerd[1547]: time="2025-01-13T20:46:56.450713147Z" level=info msg="StopPodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" returns successfully" Jan 13 20:46:56.450864 containerd[1547]: time="2025-01-13T20:46:56.450854997Z" level=info msg="RemovePodSandbox for \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:56.450938 containerd[1547]: time="2025-01-13T20:46:56.450910056Z" level=info msg="Forcibly stopping sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\"" Jan 13 20:46:56.451012 containerd[1547]: time="2025-01-13T20:46:56.450995848Z" level=info msg="TearDown network for sandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" successfully" Jan 13 20:46:56.452171 containerd[1547]: time="2025-01-13T20:46:56.452159972Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.452257 containerd[1547]: time="2025-01-13T20:46:56.452246646Z" level=info msg="RemovePodSandbox \"648b33789587b810aec4992ed85559918186e1ddf2ac23864fae70161b68cac3\" returns successfully" Jan 13 20:46:56.452425 containerd[1547]: time="2025-01-13T20:46:56.452415378Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" Jan 13 20:46:56.452539 containerd[1547]: time="2025-01-13T20:46:56.452529915Z" level=info msg="TearDown network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" successfully" Jan 13 20:46:56.452580 containerd[1547]: time="2025-01-13T20:46:56.452573196Z" level=info msg="StopPodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" returns successfully" Jan 13 20:46:56.452722 containerd[1547]: time="2025-01-13T20:46:56.452712825Z" level=info msg="RemovePodSandbox for \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" Jan 13 20:46:56.452788 containerd[1547]: time="2025-01-13T20:46:56.452772843Z" level=info msg="Forcibly stopping sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\"" Jan 13 20:46:56.452858 containerd[1547]: time="2025-01-13T20:46:56.452841775Z" level=info msg="TearDown network for sandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" successfully" Jan 13 20:46:56.453973 containerd[1547]: time="2025-01-13T20:46:56.453962183Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.454042 containerd[1547]: time="2025-01-13T20:46:56.454032434Z" level=info msg="RemovePodSandbox \"981f0f43f2295fc31ae94da854f49c034a6df112964b8553607a927f7738fb33\" returns successfully" Jan 13 20:46:56.454192 containerd[1547]: time="2025-01-13T20:46:56.454182869Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\"" Jan 13 20:46:56.454315 containerd[1547]: time="2025-01-13T20:46:56.454289271Z" level=info msg="TearDown network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" successfully" Jan 13 20:46:56.454389 containerd[1547]: time="2025-01-13T20:46:56.454359389Z" level=info msg="StopPodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" returns successfully" Jan 13 20:46:56.455336 containerd[1547]: time="2025-01-13T20:46:56.454459552Z" level=info msg="RemovePodSandbox for \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\"" Jan 13 20:46:56.455336 containerd[1547]: time="2025-01-13T20:46:56.454471372Z" level=info msg="Forcibly stopping sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\"" Jan 13 20:46:56.455336 containerd[1547]: time="2025-01-13T20:46:56.454525990Z" level=info msg="TearDown network for sandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" successfully" Jan 13 20:46:56.455734 containerd[1547]: time="2025-01-13T20:46:56.455722092Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.455785 containerd[1547]: time="2025-01-13T20:46:56.455777432Z" level=info msg="RemovePodSandbox \"baa074ebbc4352eb13803df2aee1accfb3368e6818ce9c41b166171bddba6d51\" returns successfully" Jan 13 20:46:56.455940 containerd[1547]: time="2025-01-13T20:46:56.455924273Z" level=info msg="StopPodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\"" Jan 13 20:46:56.456000 containerd[1547]: time="2025-01-13T20:46:56.455987615Z" level=info msg="TearDown network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" successfully" Jan 13 20:46:56.456000 containerd[1547]: time="2025-01-13T20:46:56.455998034Z" level=info msg="StopPodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" returns successfully" Jan 13 20:46:56.456207 containerd[1547]: time="2025-01-13T20:46:56.456185306Z" level=info msg="RemovePodSandbox for \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\"" Jan 13 20:46:56.456238 containerd[1547]: time="2025-01-13T20:46:56.456209444Z" level=info msg="Forcibly stopping sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\"" Jan 13 20:46:56.456273 containerd[1547]: time="2025-01-13T20:46:56.456251704Z" level=info msg="TearDown network for sandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" successfully" Jan 13 20:46:56.458610 containerd[1547]: time="2025-01-13T20:46:56.458592543Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.458648 containerd[1547]: time="2025-01-13T20:46:56.458621509Z" level=info msg="RemovePodSandbox \"0baeb54d5a192b2d4433849a5f1b95fe01de64ab820079be88b1d9ccd63fc6fd\" returns successfully" Jan 13 20:46:56.458766 containerd[1547]: time="2025-01-13T20:46:56.458744502Z" level=info msg="StopPodSandbox for \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\"" Jan 13 20:46:56.458897 containerd[1547]: time="2025-01-13T20:46:56.458861708Z" level=info msg="TearDown network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\" successfully" Jan 13 20:46:56.458897 containerd[1547]: time="2025-01-13T20:46:56.458870468Z" level=info msg="StopPodSandbox for \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\" returns successfully" Jan 13 20:46:56.459057 containerd[1547]: time="2025-01-13T20:46:56.459042184Z" level=info msg="RemovePodSandbox for \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\"" Jan 13 20:46:56.459057 containerd[1547]: time="2025-01-13T20:46:56.459054557Z" level=info msg="Forcibly stopping sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\"" Jan 13 20:46:56.459116 containerd[1547]: time="2025-01-13T20:46:56.459093096Z" level=info msg="TearDown network for sandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\" successfully" Jan 13 20:46:56.460150 containerd[1547]: time="2025-01-13T20:46:56.460134364Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.460184 containerd[1547]: time="2025-01-13T20:46:56.460160798Z" level=info msg="RemovePodSandbox \"ad35edfaae15163d7acfff2c56470e3c076075167075c3d5b944df13239b5a9d\" returns successfully" Jan 13 20:46:56.460314 containerd[1547]: time="2025-01-13T20:46:56.460303679Z" level=info msg="StopPodSandbox for \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\"" Jan 13 20:46:56.460469 containerd[1547]: time="2025-01-13T20:46:56.460413030Z" level=info msg="TearDown network for sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\" successfully" Jan 13 20:46:56.460469 containerd[1547]: time="2025-01-13T20:46:56.460422007Z" level=info msg="StopPodSandbox for \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\" returns successfully" Jan 13 20:46:56.460807 containerd[1547]: time="2025-01-13T20:46:56.460566654Z" level=info msg="RemovePodSandbox for \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\"" Jan 13 20:46:56.460807 containerd[1547]: time="2025-01-13T20:46:56.460606059Z" level=info msg="Forcibly stopping sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\"" Jan 13 20:46:56.460807 containerd[1547]: time="2025-01-13T20:46:56.460660202Z" level=info msg="TearDown network for sandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\" successfully" Jan 13 20:46:56.461753 containerd[1547]: time="2025-01-13T20:46:56.461737189Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.461780 containerd[1547]: time="2025-01-13T20:46:56.461763743Z" level=info msg="RemovePodSandbox \"66845289d1dd946124b3bc191ee0788969651fadb122555f8c45cd187d0398dd\" returns successfully" Jan 13 20:46:56.462022 containerd[1547]: time="2025-01-13T20:46:56.461938509Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:56.462022 containerd[1547]: time="2025-01-13T20:46:56.461980415Z" level=info msg="TearDown network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:56.462022 containerd[1547]: time="2025-01-13T20:46:56.461987188Z" level=info msg="StopPodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:56.462204 containerd[1547]: time="2025-01-13T20:46:56.462139900Z" level=info msg="RemovePodSandbox for \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:56.462204 containerd[1547]: time="2025-01-13T20:46:56.462151256Z" level=info msg="Forcibly stopping sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\"" Jan 13 20:46:56.462629 containerd[1547]: time="2025-01-13T20:46:56.462256571Z" level=info msg="TearDown network for sandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" successfully" Jan 13 20:46:56.463409 containerd[1547]: time="2025-01-13T20:46:56.463398145Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.463485 containerd[1547]: time="2025-01-13T20:46:56.463475240Z" level=info msg="RemovePodSandbox \"4a2e87ae28a85db124d425eb01129c2790255c9d11bdcd10eb84c9f096e58ab2\" returns successfully" Jan 13 20:46:56.463671 containerd[1547]: time="2025-01-13T20:46:56.463663068Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:56.463847 containerd[1547]: time="2025-01-13T20:46:56.463838384Z" level=info msg="TearDown network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" successfully" Jan 13 20:46:56.463893 containerd[1547]: time="2025-01-13T20:46:56.463886799Z" level=info msg="StopPodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" returns successfully" Jan 13 20:46:56.464053 containerd[1547]: time="2025-01-13T20:46:56.464041030Z" level=info msg="RemovePodSandbox for \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:56.464078 containerd[1547]: time="2025-01-13T20:46:56.464053917Z" level=info msg="Forcibly stopping sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\"" Jan 13 20:46:56.464139 containerd[1547]: time="2025-01-13T20:46:56.464089681Z" level=info msg="TearDown network for sandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" successfully" Jan 13 20:46:56.465198 containerd[1547]: time="2025-01-13T20:46:56.465183761Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.465240 containerd[1547]: time="2025-01-13T20:46:56.465204259Z" level=info msg="RemovePodSandbox \"44e7e1f03988ddcb7ad49427d0a921182c0e85d80cd9b58ad812c6a639433bb3\" returns successfully" Jan 13 20:46:56.465491 containerd[1547]: time="2025-01-13T20:46:56.465375069Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" Jan 13 20:46:56.465491 containerd[1547]: time="2025-01-13T20:46:56.465413899Z" level=info msg="TearDown network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" successfully" Jan 13 20:46:56.465491 containerd[1547]: time="2025-01-13T20:46:56.465419958Z" level=info msg="StopPodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" returns successfully" Jan 13 20:46:56.466329 containerd[1547]: time="2025-01-13T20:46:56.465613890Z" level=info msg="RemovePodSandbox for \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" Jan 13 20:46:56.466329 containerd[1547]: time="2025-01-13T20:46:56.465625808Z" level=info msg="Forcibly stopping sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\"" Jan 13 20:46:56.466329 containerd[1547]: time="2025-01-13T20:46:56.465657995Z" level=info msg="TearDown network for sandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" successfully" Jan 13 20:46:56.466869 containerd[1547]: time="2025-01-13T20:46:56.466856937Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.466925 containerd[1547]: time="2025-01-13T20:46:56.466916364Z" level=info msg="RemovePodSandbox \"5e01187660d6864f70a1b70768f4543abd6d562424e8b4a816055e87801a594d\" returns successfully" Jan 13 20:46:56.467150 containerd[1547]: time="2025-01-13T20:46:56.467135474Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\"" Jan 13 20:46:56.467217 containerd[1547]: time="2025-01-13T20:46:56.467204607Z" level=info msg="TearDown network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" successfully" Jan 13 20:46:56.467217 containerd[1547]: time="2025-01-13T20:46:56.467214688Z" level=info msg="StopPodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" returns successfully" Jan 13 20:46:56.467409 containerd[1547]: time="2025-01-13T20:46:56.467394928Z" level=info msg="RemovePodSandbox for \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\"" Jan 13 20:46:56.467409 containerd[1547]: time="2025-01-13T20:46:56.467408483Z" level=info msg="Forcibly stopping sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\"" Jan 13 20:46:56.467460 containerd[1547]: time="2025-01-13T20:46:56.467436964Z" level=info msg="TearDown network for sandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" successfully" Jan 13 20:46:56.468498 containerd[1547]: time="2025-01-13T20:46:56.468482468Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.468528 containerd[1547]: time="2025-01-13T20:46:56.468503271Z" level=info msg="RemovePodSandbox \"94c21d75c41fb153daf5e0289bb47ee294c600ac29335c5ae6a3e0d1331b0f96\" returns successfully" Jan 13 20:46:56.468671 containerd[1547]: time="2025-01-13T20:46:56.468661498Z" level=info msg="StopPodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\"" Jan 13 20:46:56.468777 containerd[1547]: time="2025-01-13T20:46:56.468740322Z" level=info msg="TearDown network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" successfully" Jan 13 20:46:56.468777 containerd[1547]: time="2025-01-13T20:46:56.468748591Z" level=info msg="StopPodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" returns successfully" Jan 13 20:46:56.468882 containerd[1547]: time="2025-01-13T20:46:56.468870117Z" level=info msg="RemovePodSandbox for \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\"" Jan 13 20:46:56.469139 containerd[1547]: time="2025-01-13T20:46:56.468888002Z" level=info msg="Forcibly stopping sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\"" Jan 13 20:46:56.469139 containerd[1547]: time="2025-01-13T20:46:56.468919429Z" level=info msg="TearDown network for sandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" successfully" Jan 13 20:46:56.472293 containerd[1547]: time="2025-01-13T20:46:56.472278303Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.472404 containerd[1547]: time="2025-01-13T20:46:56.472302717Z" level=info msg="RemovePodSandbox \"5d8990cea75eac1b394731e73d3efa7fb7d651f92bbe925c0276a3a52b8f4e5f\" returns successfully" Jan 13 20:46:56.472432 containerd[1547]: time="2025-01-13T20:46:56.472423449Z" level=info msg="StopPodSandbox for \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\"" Jan 13 20:46:56.472722 containerd[1547]: time="2025-01-13T20:46:56.472467489Z" level=info msg="TearDown network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\" successfully" Jan 13 20:46:56.472722 containerd[1547]: time="2025-01-13T20:46:56.472475536Z" level=info msg="StopPodSandbox for \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\" returns successfully" Jan 13 20:46:56.472722 containerd[1547]: time="2025-01-13T20:46:56.472574829Z" level=info msg="RemovePodSandbox for \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\"" Jan 13 20:46:56.472722 containerd[1547]: time="2025-01-13T20:46:56.472602620Z" level=info msg="Forcibly stopping sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\"" Jan 13 20:46:56.472722 containerd[1547]: time="2025-01-13T20:46:56.472649101Z" level=info msg="TearDown network for sandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\" successfully" Jan 13 20:46:56.474084 containerd[1547]: time="2025-01-13T20:46:56.474072030Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.474139 containerd[1547]: time="2025-01-13T20:46:56.474130902Z" level=info msg="RemovePodSandbox \"0604e9192eed0022ac0db0ccefd063aab55a80667140c820eafb3d9eca783e42\" returns successfully" Jan 13 20:46:56.474321 containerd[1547]: time="2025-01-13T20:46:56.474301996Z" level=info msg="StopPodSandbox for \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\"" Jan 13 20:46:56.474415 containerd[1547]: time="2025-01-13T20:46:56.474393012Z" level=info msg="TearDown network for sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\" successfully" Jan 13 20:46:56.474454 containerd[1547]: time="2025-01-13T20:46:56.474447382Z" level=info msg="StopPodSandbox for \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\" returns successfully" Jan 13 20:46:56.474720 containerd[1547]: time="2025-01-13T20:46:56.474705120Z" level=info msg="RemovePodSandbox for \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\"" Jan 13 20:46:56.474720 containerd[1547]: time="2025-01-13T20:46:56.474718865Z" level=info msg="Forcibly stopping sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\"" Jan 13 20:46:56.474785 containerd[1547]: time="2025-01-13T20:46:56.474762071Z" level=info msg="TearDown network for sandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\" successfully" Jan 13 20:46:56.475843 containerd[1547]: time="2025-01-13T20:46:56.475828659Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.475875 containerd[1547]: time="2025-01-13T20:46:56.475849278Z" level=info msg="RemovePodSandbox \"d13fadc9bc6ead04a377c3ddeb119a35afa1417833a24ab40a706d66d31d393c\" returns successfully" Jan 13 20:46:56.476069 containerd[1547]: time="2025-01-13T20:46:56.475970325Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:56.476069 containerd[1547]: time="2025-01-13T20:46:56.476034622Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:56.476069 containerd[1547]: time="2025-01-13T20:46:56.476041549Z" level=info msg="StopPodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:56.476442 containerd[1547]: time="2025-01-13T20:46:56.476151794Z" level=info msg="RemovePodSandbox for \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:56.476442 containerd[1547]: time="2025-01-13T20:46:56.476162615Z" level=info msg="Forcibly stopping sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\"" Jan 13 20:46:56.476442 containerd[1547]: time="2025-01-13T20:46:56.476197609Z" level=info msg="TearDown network for sandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" successfully" Jan 13 20:46:56.477301 containerd[1547]: time="2025-01-13T20:46:56.477286008Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.477328 containerd[1547]: time="2025-01-13T20:46:56.477307437Z" level=info msg="RemovePodSandbox \"2910c6c4bac8b6865025b0c270e19e10abfed5b4d78f2d0a1306f4df786446a2\" returns successfully" Jan 13 20:46:56.477533 containerd[1547]: time="2025-01-13T20:46:56.477463139Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:56.477533 containerd[1547]: time="2025-01-13T20:46:56.477501761Z" level=info msg="TearDown network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" successfully" Jan 13 20:46:56.477533 containerd[1547]: time="2025-01-13T20:46:56.477507597Z" level=info msg="StopPodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" returns successfully" Jan 13 20:46:56.477778 containerd[1547]: time="2025-01-13T20:46:56.477646589Z" level=info msg="RemovePodSandbox for \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:56.477778 containerd[1547]: time="2025-01-13T20:46:56.477657749Z" level=info msg="Forcibly stopping sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\"" Jan 13 20:46:56.477778 containerd[1547]: time="2025-01-13T20:46:56.477735356Z" level=info msg="TearDown network for sandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" successfully" Jan 13 20:46:56.479223 containerd[1547]: time="2025-01-13T20:46:56.479158407Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.479223 containerd[1547]: time="2025-01-13T20:46:56.479178319Z" level=info msg="RemovePodSandbox \"a7795d39411efbe9c7f24b9913ad7fbe082de52f7bf0766f5d9432903769e1f3\" returns successfully" Jan 13 20:46:56.479315 containerd[1547]: time="2025-01-13T20:46:56.479302131Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" Jan 13 20:46:56.479371 containerd[1547]: time="2025-01-13T20:46:56.479356815Z" level=info msg="TearDown network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" successfully" Jan 13 20:46:56.479371 containerd[1547]: time="2025-01-13T20:46:56.479364140Z" level=info msg="StopPodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" returns successfully" Jan 13 20:46:56.479483 containerd[1547]: time="2025-01-13T20:46:56.479470399Z" level=info msg="RemovePodSandbox for \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" Jan 13 20:46:56.479483 containerd[1547]: time="2025-01-13T20:46:56.479480271Z" level=info msg="Forcibly stopping sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\"" Jan 13 20:46:56.479525 containerd[1547]: time="2025-01-13T20:46:56.479509356Z" level=info msg="TearDown network for sandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" successfully" Jan 13 20:46:56.497312 containerd[1547]: time="2025-01-13T20:46:56.496784356Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.497312 containerd[1547]: time="2025-01-13T20:46:56.496837059Z" level=info msg="RemovePodSandbox \"f4e12cacac7fe695c20b7851574944526ff2839b949191d4aeef3e51d25f47e3\" returns successfully" Jan 13 20:46:56.497312 containerd[1547]: time="2025-01-13T20:46:56.497061432Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\"" Jan 13 20:46:56.497312 containerd[1547]: time="2025-01-13T20:46:56.497120066Z" level=info msg="TearDown network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" successfully" Jan 13 20:46:56.497312 containerd[1547]: time="2025-01-13T20:46:56.497126440Z" level=info msg="StopPodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" returns successfully" Jan 13 20:46:56.497312 containerd[1547]: time="2025-01-13T20:46:56.497232606Z" level=info msg="RemovePodSandbox for \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\"" Jan 13 20:46:56.497312 containerd[1547]: time="2025-01-13T20:46:56.497269289Z" level=info msg="Forcibly stopping sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\"" Jan 13 20:46:56.497519 containerd[1547]: time="2025-01-13T20:46:56.497308314Z" level=info msg="TearDown network for sandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" successfully" Jan 13 20:46:56.509828 containerd[1547]: time="2025-01-13T20:46:56.509753891Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.509828 containerd[1547]: time="2025-01-13T20:46:56.509813809Z" level=info msg="RemovePodSandbox \"16cff5eb3278e0aa6e28428bbf80d3a5508b488afaa7079c625fa384e6b6d610\" returns successfully" Jan 13 20:46:56.510012 containerd[1547]: time="2025-01-13T20:46:56.509994185Z" level=info msg="StopPodSandbox for \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\"" Jan 13 20:46:56.510072 containerd[1547]: time="2025-01-13T20:46:56.510059602Z" level=info msg="TearDown network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" successfully" Jan 13 20:46:56.510174 containerd[1547]: time="2025-01-13T20:46:56.510071943Z" level=info msg="StopPodSandbox for \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" returns successfully" Jan 13 20:46:56.510249 containerd[1547]: time="2025-01-13T20:46:56.510230718Z" level=info msg="RemovePodSandbox for \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\"" Jan 13 20:46:56.510274 containerd[1547]: time="2025-01-13T20:46:56.510245758Z" level=info msg="Forcibly stopping sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\"" Jan 13 20:46:56.510317 containerd[1547]: time="2025-01-13T20:46:56.510299241Z" level=info msg="TearDown network for sandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" successfully" Jan 13 20:46:56.511462 containerd[1547]: time="2025-01-13T20:46:56.511446788Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.511479819Z" level=info msg="RemovePodSandbox \"62381e5725856192de1f7cf4c41de7f7fae7f013f581d109a42b12fb45c4b61d\" returns successfully" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.511624608Z" level=info msg="StopPodSandbox for \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\"" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.511662920Z" level=info msg="TearDown network for sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\" successfully" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.511668852Z" level=info msg="StopPodSandbox for \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\" returns successfully" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.511789237Z" level=info msg="RemovePodSandbox for \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\"" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.511798932Z" level=info msg="Forcibly stopping sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\"" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.511839772Z" level=info msg="TearDown network for sandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\" successfully" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.512905751Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.512924190Z" level=info msg="RemovePodSandbox \"10e3d77cb676ee36a45f15eb7ce0d5303bee93cc0a1e884a0ccb49f5bfda8b9b\" returns successfully" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.513051264Z" level=info msg="StopPodSandbox for \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\"" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.513091658Z" level=info msg="TearDown network for sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\" successfully" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.513097872Z" level=info msg="StopPodSandbox for \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\" returns successfully" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.513247338Z" level=info msg="RemovePodSandbox for \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\"" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.513260619Z" level=info msg="Forcibly stopping sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\"" Jan 13 20:46:56.514333 containerd[1547]: time="2025-01-13T20:46:56.513318937Z" level=info msg="TearDown network for sandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\" successfully" Jan 13 20:46:56.515033 containerd[1547]: time="2025-01-13T20:46:56.514605511Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.515033 containerd[1547]: time="2025-01-13T20:46:56.514623156Z" level=info msg="RemovePodSandbox \"1d5d36e57b601e1c762537d31b2c4909031679851b0529491abbad189f9f1231\" returns successfully" Jan 13 20:46:56.515241 containerd[1547]: time="2025-01-13T20:46:56.515161454Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:56.515241 containerd[1547]: time="2025-01-13T20:46:56.515203059Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:56.515241 containerd[1547]: time="2025-01-13T20:46:56.515209307Z" level=info msg="StopPodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:56.516157 containerd[1547]: time="2025-01-13T20:46:56.515455282Z" level=info msg="RemovePodSandbox for \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:56.516157 containerd[1547]: time="2025-01-13T20:46:56.515466568Z" level=info msg="Forcibly stopping sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\"" Jan 13 20:46:56.516157 containerd[1547]: time="2025-01-13T20:46:56.515510019Z" level=info msg="TearDown network for sandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" successfully" Jan 13 20:46:56.516770 containerd[1547]: time="2025-01-13T20:46:56.516754117Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.516810 containerd[1547]: time="2025-01-13T20:46:56.516777310Z" level=info msg="RemovePodSandbox \"48b2c2a36074b1df0997d1117d4fed843ac6d4cbdfaff2b77ae9bb1e15ff4492\" returns successfully" Jan 13 20:46:56.517091 containerd[1547]: time="2025-01-13T20:46:56.517077261Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:56.517129 containerd[1547]: time="2025-01-13T20:46:56.517116791Z" level=info msg="TearDown network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" successfully" Jan 13 20:46:56.517129 containerd[1547]: time="2025-01-13T20:46:56.517125109Z" level=info msg="StopPodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" returns successfully" Jan 13 20:46:56.517297 containerd[1547]: time="2025-01-13T20:46:56.517284235Z" level=info msg="RemovePodSandbox for \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:56.517327 containerd[1547]: time="2025-01-13T20:46:56.517297479Z" level=info msg="Forcibly stopping sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\"" Jan 13 20:46:56.517524 containerd[1547]: time="2025-01-13T20:46:56.517362998Z" level=info msg="TearDown network for sandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" successfully" Jan 13 20:46:56.518745 containerd[1547]: time="2025-01-13T20:46:56.518728860Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.518829 containerd[1547]: time="2025-01-13T20:46:56.518756060Z" level=info msg="RemovePodSandbox \"18379950b6cfcde872b32e89d263a4bb5d903ddbf716baf9b57db94263214ed9\" returns successfully" Jan 13 20:46:56.519165 containerd[1547]: time="2025-01-13T20:46:56.519145136Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" Jan 13 20:46:56.519211 containerd[1547]: time="2025-01-13T20:46:56.519196283Z" level=info msg="TearDown network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" successfully" Jan 13 20:46:56.519211 containerd[1547]: time="2025-01-13T20:46:56.519207080Z" level=info msg="StopPodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" returns successfully" Jan 13 20:46:56.519404 containerd[1547]: time="2025-01-13T20:46:56.519384245Z" level=info msg="RemovePodSandbox for \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" Jan 13 20:46:56.519436 containerd[1547]: time="2025-01-13T20:46:56.519399464Z" level=info msg="Forcibly stopping sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\"" Jan 13 20:46:56.519600 containerd[1547]: time="2025-01-13T20:46:56.519571819Z" level=info msg="TearDown network for sandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" successfully" Jan 13 20:46:56.520667 containerd[1547]: time="2025-01-13T20:46:56.520650994Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.520694 containerd[1547]: time="2025-01-13T20:46:56.520673523Z" level=info msg="RemovePodSandbox \"07cf49b7b5df56674093a0ab38cd6a5317fe441a9fb6c941551fdbf672e1cbf2\" returns successfully" Jan 13 20:46:56.520836 containerd[1547]: time="2025-01-13T20:46:56.520822486Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\"" Jan 13 20:46:56.520959 containerd[1547]: time="2025-01-13T20:46:56.520946404Z" level=info msg="TearDown network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" successfully" Jan 13 20:46:56.520959 containerd[1547]: time="2025-01-13T20:46:56.520956607Z" level=info msg="StopPodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" returns successfully" Jan 13 20:46:56.521269 containerd[1547]: time="2025-01-13T20:46:56.521115062Z" level=info msg="RemovePodSandbox for \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\"" Jan 13 20:46:56.521269 containerd[1547]: time="2025-01-13T20:46:56.521128494Z" level=info msg="Forcibly stopping sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\"" Jan 13 20:46:56.521269 containerd[1547]: time="2025-01-13T20:46:56.521157203Z" level=info msg="TearDown network for sandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" successfully" Jan 13 20:46:56.522372 containerd[1547]: time="2025-01-13T20:46:56.522359211Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.522442 containerd[1547]: time="2025-01-13T20:46:56.522433400Z" level=info msg="RemovePodSandbox \"46b008fbfa6d70776eabb0e43270ef40c197c16245e5fdfe8850c26423658d5a\" returns successfully" Jan 13 20:46:56.522664 containerd[1547]: time="2025-01-13T20:46:56.522649808Z" level=info msg="StopPodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\"" Jan 13 20:46:56.522750 containerd[1547]: time="2025-01-13T20:46:56.522735707Z" level=info msg="TearDown network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" successfully" Jan 13 20:46:56.522750 containerd[1547]: time="2025-01-13T20:46:56.522746691Z" level=info msg="StopPodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" returns successfully" Jan 13 20:46:56.523333 containerd[1547]: time="2025-01-13T20:46:56.522893208Z" level=info msg="RemovePodSandbox for \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\"" Jan 13 20:46:56.523333 containerd[1547]: time="2025-01-13T20:46:56.522905498Z" level=info msg="Forcibly stopping sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\"" Jan 13 20:46:56.523333 containerd[1547]: time="2025-01-13T20:46:56.522936358Z" level=info msg="TearDown network for sandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" successfully" Jan 13 20:46:56.524089 containerd[1547]: time="2025-01-13T20:46:56.524073292Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.524120 containerd[1547]: time="2025-01-13T20:46:56.524094874Z" level=info msg="RemovePodSandbox \"71a55a349f1a56f7ccb014d3eba93b9dcdd11df1fa01a7f77147b3760fea3aef\" returns successfully" Jan 13 20:46:56.524310 containerd[1547]: time="2025-01-13T20:46:56.524272334Z" level=info msg="StopPodSandbox for \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\"" Jan 13 20:46:56.524437 containerd[1547]: time="2025-01-13T20:46:56.524404862Z" level=info msg="TearDown network for sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\" successfully" Jan 13 20:46:56.524437 containerd[1547]: time="2025-01-13T20:46:56.524413290Z" level=info msg="StopPodSandbox for \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\" returns successfully" Jan 13 20:46:56.525327 containerd[1547]: time="2025-01-13T20:46:56.524607937Z" level=info msg="RemovePodSandbox for \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\"" Jan 13 20:46:56.525327 containerd[1547]: time="2025-01-13T20:46:56.524620077Z" level=info msg="Forcibly stopping sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\"" Jan 13 20:46:56.525327 containerd[1547]: time="2025-01-13T20:46:56.524647816Z" level=info msg="TearDown network for sandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\" successfully" Jan 13 20:46:56.525732 containerd[1547]: time="2025-01-13T20:46:56.525720138Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.525785 containerd[1547]: time="2025-01-13T20:46:56.525776771Z" level=info msg="RemovePodSandbox \"5c4e63a178e4b0467c04d3973d39b5fd283a55e89dd5d66f0ab51d66a25525fb\" returns successfully" Jan 13 20:46:56.525928 containerd[1547]: time="2025-01-13T20:46:56.525914061Z" level=info msg="StopPodSandbox for \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\"" Jan 13 20:46:56.525978 containerd[1547]: time="2025-01-13T20:46:56.525959278Z" level=info msg="TearDown network for sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\" successfully" Jan 13 20:46:56.525978 containerd[1547]: time="2025-01-13T20:46:56.525968562Z" level=info msg="StopPodSandbox for \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\" returns successfully" Jan 13 20:46:56.526157 containerd[1547]: time="2025-01-13T20:46:56.526147810Z" level=info msg="RemovePodSandbox for \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\"" Jan 13 20:46:56.526240 containerd[1547]: time="2025-01-13T20:46:56.526231790Z" level=info msg="Forcibly stopping sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\"" Jan 13 20:46:56.526319 containerd[1547]: time="2025-01-13T20:46:56.526301649Z" level=info msg="TearDown network for sandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\" successfully" Jan 13 20:46:56.527498 containerd[1547]: time="2025-01-13T20:46:56.527486170Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.527564 containerd[1547]: time="2025-01-13T20:46:56.527554045Z" level=info msg="RemovePodSandbox \"67c8264144b80a7611685e3c3efd7cd1c63fac393253c3db262d8c8c2e346aa7\" returns successfully" Jan 13 20:46:56.527764 containerd[1547]: time="2025-01-13T20:46:56.527732055Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:56.527791 containerd[1547]: time="2025-01-13T20:46:56.527772608Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:56.527791 containerd[1547]: time="2025-01-13T20:46:56.527778548Z" level=info msg="StopPodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:56.527953 containerd[1547]: time="2025-01-13T20:46:56.527907837Z" level=info msg="RemovePodSandbox for \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:56.527953 containerd[1547]: time="2025-01-13T20:46:56.527916793Z" level=info msg="Forcibly stopping sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\"" Jan 13 20:46:56.528072 containerd[1547]: time="2025-01-13T20:46:56.527944551Z" level=info msg="TearDown network for sandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" successfully" Jan 13 20:46:56.529024 containerd[1547]: time="2025-01-13T20:46:56.529008680Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.529058 containerd[1547]: time="2025-01-13T20:46:56.529031497Z" level=info msg="RemovePodSandbox \"83d5b2046a07d2bb8cab68ad3d2a49d9d4acd26853b4ad7a948a0d07c40710bd\" returns successfully" Jan 13 20:46:56.529522 containerd[1547]: time="2025-01-13T20:46:56.529164054Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:56.529522 containerd[1547]: time="2025-01-13T20:46:56.529265427Z" level=info msg="TearDown network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" successfully" Jan 13 20:46:56.529522 containerd[1547]: time="2025-01-13T20:46:56.529272995Z" level=info msg="StopPodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" returns successfully" Jan 13 20:46:56.529690 containerd[1547]: time="2025-01-13T20:46:56.529526506Z" level=info msg="RemovePodSandbox for \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:56.529690 containerd[1547]: time="2025-01-13T20:46:56.529536841Z" level=info msg="Forcibly stopping sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\"" Jan 13 20:46:56.529690 containerd[1547]: time="2025-01-13T20:46:56.529598529Z" level=info msg="TearDown network for sandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" successfully" Jan 13 20:46:56.530651 containerd[1547]: time="2025-01-13T20:46:56.530636358Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.530753 containerd[1547]: time="2025-01-13T20:46:56.530658206Z" level=info msg="RemovePodSandbox \"3916e00614291a1aaec037cb6fa87c3186cbb3141c76d0c31907f3f81cab1dc7\" returns successfully" Jan 13 20:46:56.530946 containerd[1547]: time="2025-01-13T20:46:56.530825735Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" Jan 13 20:46:56.530946 containerd[1547]: time="2025-01-13T20:46:56.530863562Z" level=info msg="TearDown network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" successfully" Jan 13 20:46:56.530946 containerd[1547]: time="2025-01-13T20:46:56.530869479Z" level=info msg="StopPodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" returns successfully" Jan 13 20:46:56.531795 containerd[1547]: time="2025-01-13T20:46:56.531161757Z" level=info msg="RemovePodSandbox for \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" Jan 13 20:46:56.531795 containerd[1547]: time="2025-01-13T20:46:56.531172380Z" level=info msg="Forcibly stopping sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\"" Jan 13 20:46:56.531795 containerd[1547]: time="2025-01-13T20:46:56.531259537Z" level=info msg="TearDown network for sandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" successfully" Jan 13 20:46:56.534134 containerd[1547]: time="2025-01-13T20:46:56.534066604Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.534134 containerd[1547]: time="2025-01-13T20:46:56.534100391Z" level=info msg="RemovePodSandbox \"3ab7afb0ce02dcca2c44f34fcecaaef674d313ff7a6768a5bd8f7ae5dc01e444\" returns successfully" Jan 13 20:46:56.534427 containerd[1547]: time="2025-01-13T20:46:56.534323289Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\"" Jan 13 20:46:56.534427 containerd[1547]: time="2025-01-13T20:46:56.534396846Z" level=info msg="TearDown network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" successfully" Jan 13 20:46:56.534427 containerd[1547]: time="2025-01-13T20:46:56.534404616Z" level=info msg="StopPodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" returns successfully" Jan 13 20:46:56.535077 containerd[1547]: time="2025-01-13T20:46:56.535066910Z" level=info msg="RemovePodSandbox for \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\"" Jan 13 20:46:56.535176 containerd[1547]: time="2025-01-13T20:46:56.535113655Z" level=info msg="Forcibly stopping sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\"" Jan 13 20:46:56.535176 containerd[1547]: time="2025-01-13T20:46:56.535154727Z" level=info msg="TearDown network for sandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" successfully" Jan 13 20:46:56.536683 containerd[1547]: time="2025-01-13T20:46:56.536633766Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.537221 containerd[1547]: time="2025-01-13T20:46:56.536861274Z" level=info msg="RemovePodSandbox \"989aa846e4fe9127a04925c8ffdbc94e8c0b798beca2c08c852191cc401e0dc8\" returns successfully" Jan 13 20:46:56.537221 containerd[1547]: time="2025-01-13T20:46:56.537017315Z" level=info msg="StopPodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\"" Jan 13 20:46:56.537221 containerd[1547]: time="2025-01-13T20:46:56.537057574Z" level=info msg="TearDown network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" successfully" Jan 13 20:46:56.537221 containerd[1547]: time="2025-01-13T20:46:56.537083922Z" level=info msg="StopPodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" returns successfully" Jan 13 20:46:56.537432 containerd[1547]: time="2025-01-13T20:46:56.537417177Z" level=info msg="RemovePodSandbox for \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\"" Jan 13 20:46:56.537467 containerd[1547]: time="2025-01-13T20:46:56.537432453Z" level=info msg="Forcibly stopping sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\"" Jan 13 20:46:56.537520 containerd[1547]: time="2025-01-13T20:46:56.537479530Z" level=info msg="TearDown network for sandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" successfully" Jan 13 20:46:56.539475 containerd[1547]: time="2025-01-13T20:46:56.539463245Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.539557 containerd[1547]: time="2025-01-13T20:46:56.539530170Z" level=info msg="RemovePodSandbox \"64d532ccceaf9a00529d89da3528419c24d59f638b6fa94dbea0f9bb5db025eb\" returns successfully" Jan 13 20:46:56.539861 containerd[1547]: time="2025-01-13T20:46:56.539771290Z" level=info msg="StopPodSandbox for \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\"" Jan 13 20:46:56.539861 containerd[1547]: time="2025-01-13T20:46:56.539820851Z" level=info msg="TearDown network for sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\" successfully" Jan 13 20:46:56.539861 containerd[1547]: time="2025-01-13T20:46:56.539827095Z" level=info msg="StopPodSandbox for \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\" returns successfully" Jan 13 20:46:56.540335 containerd[1547]: time="2025-01-13T20:46:56.540230961Z" level=info msg="RemovePodSandbox for \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\"" Jan 13 20:46:56.540335 containerd[1547]: time="2025-01-13T20:46:56.540244526Z" level=info msg="Forcibly stopping sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\"" Jan 13 20:46:56.540985 containerd[1547]: time="2025-01-13T20:46:56.540512460Z" level=info msg="TearDown network for sandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\" successfully" Jan 13 20:46:56.542082 containerd[1547]: time="2025-01-13T20:46:56.542068782Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.542145 containerd[1547]: time="2025-01-13T20:46:56.542136274Z" level=info msg="RemovePodSandbox \"4629114cac229a61f9c23a36bc5a25935501c0e1a6cec90e18e86eefc12ae23f\" returns successfully" Jan 13 20:46:56.542359 containerd[1547]: time="2025-01-13T20:46:56.542338480Z" level=info msg="StopPodSandbox for \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\"" Jan 13 20:46:56.542473 containerd[1547]: time="2025-01-13T20:46:56.542464429Z" level=info msg="TearDown network for sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\" successfully" Jan 13 20:46:56.542515 containerd[1547]: time="2025-01-13T20:46:56.542506923Z" level=info msg="StopPodSandbox for \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\" returns successfully" Jan 13 20:46:56.542738 containerd[1547]: time="2025-01-13T20:46:56.542729201Z" level=info msg="RemovePodSandbox for \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\"" Jan 13 20:46:56.542795 containerd[1547]: time="2025-01-13T20:46:56.542780392Z" level=info msg="Forcibly stopping sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\"" Jan 13 20:46:56.542866 containerd[1547]: time="2025-01-13T20:46:56.542847963Z" level=info msg="TearDown network for sandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\" successfully" Jan 13 20:46:56.543955 containerd[1547]: time="2025-01-13T20:46:56.543943506Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:46:56.544047 containerd[1547]: time="2025-01-13T20:46:56.544008773Z" level=info msg="RemovePodSandbox \"dd5bbe73fc3d44d06e50ab65935a12d378f766878b5cd3597e2db53b43eed102\" returns successfully" Jan 13 20:47:05.828497 systemd[1]: Started sshd@7-139.178.70.110:22-147.75.109.163:48118.service - OpenSSH per-connection server daemon (147.75.109.163:48118). Jan 13 20:47:05.937621 sshd[5980]: Accepted publickey for core from 147.75.109.163 port 48118 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:47:05.939122 sshd-session[5980]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:47:05.945125 systemd-logind[1526]: New session 10 of user core. Jan 13 20:47:05.950458 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 13 20:47:06.365012 sshd[5982]: Connection closed by 147.75.109.163 port 48118 Jan 13 20:47:06.365349 sshd-session[5980]: pam_unix(sshd:session): session closed for user core Jan 13 20:47:06.368647 systemd-logind[1526]: Session 10 logged out. Waiting for processes to exit. Jan 13 20:47:06.369399 systemd[1]: sshd@7-139.178.70.110:22-147.75.109.163:48118.service: Deactivated successfully. Jan 13 20:47:06.370871 systemd[1]: session-10.scope: Deactivated successfully. Jan 13 20:47:06.371854 systemd-logind[1526]: Removed session 10. Jan 13 20:47:11.375100 systemd[1]: Started sshd@8-139.178.70.110:22-147.75.109.163:54610.service - OpenSSH per-connection server daemon (147.75.109.163:54610). Jan 13 20:47:11.432142 sshd[5994]: Accepted publickey for core from 147.75.109.163 port 54610 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:47:11.433927 sshd-session[5994]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:47:11.438108 systemd-logind[1526]: New session 11 of user core. Jan 13 20:47:11.443470 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 13 20:47:11.573264 sshd[6014]: Connection closed by 147.75.109.163 port 54610 Jan 13 20:47:11.574273 sshd-session[5994]: pam_unix(sshd:session): session closed for user core Jan 13 20:47:11.575988 systemd[1]: sshd@8-139.178.70.110:22-147.75.109.163:54610.service: Deactivated successfully. Jan 13 20:47:11.577274 systemd[1]: session-11.scope: Deactivated successfully. Jan 13 20:47:11.578064 systemd-logind[1526]: Session 11 logged out. Waiting for processes to exit. Jan 13 20:47:11.578686 systemd-logind[1526]: Removed session 11. Jan 13 20:47:16.587814 systemd[1]: Started sshd@9-139.178.70.110:22-147.75.109.163:54622.service - OpenSSH per-connection server daemon (147.75.109.163:54622). Jan 13 20:47:16.628733 sshd[6060]: Accepted publickey for core from 147.75.109.163 port 54622 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:47:16.629685 sshd-session[6060]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:47:16.633183 systemd-logind[1526]: New session 12 of user core. Jan 13 20:47:16.640512 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 13 20:47:16.749055 sshd[6062]: Connection closed by 147.75.109.163 port 54622 Jan 13 20:47:16.750256 sshd-session[6060]: pam_unix(sshd:session): session closed for user core Jan 13 20:47:16.756247 systemd[1]: sshd@9-139.178.70.110:22-147.75.109.163:54622.service: Deactivated successfully. Jan 13 20:47:16.758511 systemd[1]: session-12.scope: Deactivated successfully. Jan 13 20:47:16.760131 systemd-logind[1526]: Session 12 logged out. Waiting for processes to exit. Jan 13 20:47:16.766652 systemd[1]: Started sshd@10-139.178.70.110:22-147.75.109.163:54634.service - OpenSSH per-connection server daemon (147.75.109.163:54634). Jan 13 20:47:16.769634 systemd-logind[1526]: Removed session 12. 
Jan 13 20:47:16.803188 sshd[6074]: Accepted publickey for core from 147.75.109.163 port 54634 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:47:16.804560 sshd-session[6074]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:47:16.808405 systemd-logind[1526]: New session 13 of user core. Jan 13 20:47:16.812433 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 13 20:47:16.975630 sshd[6077]: Connection closed by 147.75.109.163 port 54634 Jan 13 20:47:16.976643 sshd-session[6074]: pam_unix(sshd:session): session closed for user core Jan 13 20:47:16.981831 systemd[1]: sshd@10-139.178.70.110:22-147.75.109.163:54634.service: Deactivated successfully. Jan 13 20:47:16.983176 systemd[1]: session-13.scope: Deactivated successfully. Jan 13 20:47:16.985702 systemd-logind[1526]: Session 13 logged out. Waiting for processes to exit. Jan 13 20:47:16.991746 systemd[1]: Started sshd@11-139.178.70.110:22-147.75.109.163:54650.service - OpenSSH per-connection server daemon (147.75.109.163:54650). Jan 13 20:47:16.992368 systemd-logind[1526]: Removed session 13. Jan 13 20:47:17.060815 sshd[6086]: Accepted publickey for core from 147.75.109.163 port 54650 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo Jan 13 20:47:17.062188 sshd-session[6086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:47:17.065286 systemd-logind[1526]: New session 14 of user core. Jan 13 20:47:17.068557 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 13 20:47:17.199819 sshd[6088]: Connection closed by 147.75.109.163 port 54650 Jan 13 20:47:17.200434 sshd-session[6086]: pam_unix(sshd:session): session closed for user core Jan 13 20:47:17.203216 systemd[1]: sshd@11-139.178.70.110:22-147.75.109.163:54650.service: Deactivated successfully. Jan 13 20:47:17.204936 systemd[1]: session-14.scope: Deactivated successfully. 
Jan 13 20:47:17.205744 systemd-logind[1526]: Session 14 logged out. Waiting for processes to exit.
Jan 13 20:47:17.206713 systemd-logind[1526]: Removed session 14.
Jan 13 20:47:22.211969 systemd[1]: Started sshd@12-139.178.70.110:22-147.75.109.163:52516.service - OpenSSH per-connection server daemon (147.75.109.163:52516).
Jan 13 20:47:22.416628 sshd[6103]: Accepted publickey for core from 147.75.109.163 port 52516 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:22.417172 sshd-session[6103]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:22.420903 systemd-logind[1526]: New session 15 of user core.
Jan 13 20:47:22.426689 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 13 20:47:22.552400 sshd[6105]: Connection closed by 147.75.109.163 port 52516
Jan 13 20:47:22.551815 sshd-session[6103]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:22.553559 systemd-logind[1526]: Session 15 logged out. Waiting for processes to exit.
Jan 13 20:47:22.553974 systemd[1]: sshd@12-139.178.70.110:22-147.75.109.163:52516.service: Deactivated successfully.
Jan 13 20:47:22.555076 systemd[1]: session-15.scope: Deactivated successfully.
Jan 13 20:47:22.555960 systemd-logind[1526]: Removed session 15.
Jan 13 20:47:27.561159 systemd[1]: Started sshd@13-139.178.70.110:22-147.75.109.163:50864.service - OpenSSH per-connection server daemon (147.75.109.163:50864).
Jan 13 20:47:27.614162 sshd[6118]: Accepted publickey for core from 147.75.109.163 port 50864 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:27.615074 sshd-session[6118]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:27.618803 systemd-logind[1526]: New session 16 of user core.
Jan 13 20:47:27.628454 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 13 20:47:27.747646 sshd[6120]: Connection closed by 147.75.109.163 port 50864
Jan 13 20:47:27.747850 sshd-session[6118]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:27.752788 systemd[1]: sshd@13-139.178.70.110:22-147.75.109.163:50864.service: Deactivated successfully.
Jan 13 20:47:27.753686 systemd[1]: session-16.scope: Deactivated successfully.
Jan 13 20:47:27.754446 systemd-logind[1526]: Session 16 logged out. Waiting for processes to exit.
Jan 13 20:47:27.757575 systemd[1]: Started sshd@14-139.178.70.110:22-147.75.109.163:50868.service - OpenSSH per-connection server daemon (147.75.109.163:50868).
Jan 13 20:47:27.758488 systemd-logind[1526]: Removed session 16.
Jan 13 20:47:27.789359 sshd[6131]: Accepted publickey for core from 147.75.109.163 port 50868 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:27.789830 sshd-session[6131]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:27.792702 systemd-logind[1526]: New session 17 of user core.
Jan 13 20:47:27.796558 systemd[1]: Started session-17.scope - Session 17 of User core.
Jan 13 20:47:28.356706 sshd[6133]: Connection closed by 147.75.109.163 port 50868
Jan 13 20:47:28.357767 sshd-session[6131]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:28.362904 systemd[1]: sshd@14-139.178.70.110:22-147.75.109.163:50868.service: Deactivated successfully.
Jan 13 20:47:28.364182 systemd[1]: session-17.scope: Deactivated successfully.
Jan 13 20:47:28.364661 systemd-logind[1526]: Session 17 logged out. Waiting for processes to exit.
Jan 13 20:47:28.370865 systemd[1]: Started sshd@15-139.178.70.110:22-147.75.109.163:50882.service - OpenSSH per-connection server daemon (147.75.109.163:50882).
Jan 13 20:47:28.372318 systemd-logind[1526]: Removed session 17.
Jan 13 20:47:28.442272 sshd[6142]: Accepted publickey for core from 147.75.109.163 port 50882 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:28.443107 sshd-session[6142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:28.445996 systemd-logind[1526]: New session 18 of user core.
Jan 13 20:47:28.450582 systemd[1]: Started session-18.scope - Session 18 of User core.
Jan 13 20:47:29.753148 sshd[6144]: Connection closed by 147.75.109.163 port 50882
Jan 13 20:47:29.762634 systemd[1]: Started sshd@16-139.178.70.110:22-147.75.109.163:50884.service - OpenSSH per-connection server daemon (147.75.109.163:50884).
Jan 13 20:47:29.762966 sshd-session[6142]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:29.771572 systemd-logind[1526]: Session 18 logged out. Waiting for processes to exit.
Jan 13 20:47:29.772304 systemd[1]: sshd@15-139.178.70.110:22-147.75.109.163:50882.service: Deactivated successfully.
Jan 13 20:47:29.773487 systemd[1]: session-18.scope: Deactivated successfully.
Jan 13 20:47:29.775262 systemd-logind[1526]: Removed session 18.
Jan 13 20:47:29.844322 sshd[6158]: Accepted publickey for core from 147.75.109.163 port 50884 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:29.845084 sshd-session[6158]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:29.848137 systemd-logind[1526]: New session 19 of user core.
Jan 13 20:47:29.851553 systemd[1]: Started session-19.scope - Session 19 of User core.
Jan 13 20:47:30.210852 sshd[6162]: Connection closed by 147.75.109.163 port 50884
Jan 13 20:47:30.211445 sshd-session[6158]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:30.220412 systemd[1]: sshd@16-139.178.70.110:22-147.75.109.163:50884.service: Deactivated successfully.
Jan 13 20:47:30.221398 systemd[1]: session-19.scope: Deactivated successfully.
Jan 13 20:47:30.222583 systemd-logind[1526]: Session 19 logged out. Waiting for processes to exit.
Jan 13 20:47:30.230721 systemd[1]: Started sshd@17-139.178.70.110:22-147.75.109.163:50896.service - OpenSSH per-connection server daemon (147.75.109.163:50896).
Jan 13 20:47:30.233091 systemd-logind[1526]: Removed session 19.
Jan 13 20:47:30.261373 sshd[6171]: Accepted publickey for core from 147.75.109.163 port 50896 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:30.262162 sshd-session[6171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:30.265282 systemd-logind[1526]: New session 20 of user core.
Jan 13 20:47:30.270512 systemd[1]: Started session-20.scope - Session 20 of User core.
Jan 13 20:47:30.367076 sshd[6173]: Connection closed by 147.75.109.163 port 50896
Jan 13 20:47:30.367560 sshd-session[6171]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:30.369486 systemd-logind[1526]: Session 20 logged out. Waiting for processes to exit.
Jan 13 20:47:30.369592 systemd[1]: sshd@17-139.178.70.110:22-147.75.109.163:50896.service: Deactivated successfully.
Jan 13 20:47:30.370657 systemd[1]: session-20.scope: Deactivated successfully.
Jan 13 20:47:30.371173 systemd-logind[1526]: Removed session 20.
Jan 13 20:47:35.375169 systemd[1]: Started sshd@18-139.178.70.110:22-147.75.109.163:50902.service - OpenSSH per-connection server daemon (147.75.109.163:50902).
Jan 13 20:47:35.413404 sshd[6188]: Accepted publickey for core from 147.75.109.163 port 50902 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:35.414314 sshd-session[6188]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:35.417359 systemd-logind[1526]: New session 21 of user core.
Jan 13 20:47:35.425580 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 13 20:47:35.533974 sshd[6190]: Connection closed by 147.75.109.163 port 50902
Jan 13 20:47:35.534319 sshd-session[6188]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:35.536118 systemd[1]: sshd@18-139.178.70.110:22-147.75.109.163:50902.service: Deactivated successfully.
Jan 13 20:47:35.537297 systemd[1]: session-21.scope: Deactivated successfully.
Jan 13 20:47:35.537324 systemd-logind[1526]: Session 21 logged out. Waiting for processes to exit.
Jan 13 20:47:35.538301 systemd-logind[1526]: Removed session 21.
Jan 13 20:47:40.542428 systemd[1]: Started sshd@19-139.178.70.110:22-147.75.109.163:58830.service - OpenSSH per-connection server daemon (147.75.109.163:58830).
Jan 13 20:47:40.636898 sshd[6220]: Accepted publickey for core from 147.75.109.163 port 58830 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:40.637850 sshd-session[6220]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:40.640387 systemd-logind[1526]: New session 22 of user core.
Jan 13 20:47:40.644412 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 13 20:47:40.767367 sshd[6222]: Connection closed by 147.75.109.163 port 58830
Jan 13 20:47:40.767539 sshd-session[6220]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:40.770857 systemd[1]: sshd@19-139.178.70.110:22-147.75.109.163:58830.service: Deactivated successfully.
Jan 13 20:47:40.772028 systemd[1]: session-22.scope: Deactivated successfully.
Jan 13 20:47:40.772963 systemd-logind[1526]: Session 22 logged out. Waiting for processes to exit.
Jan 13 20:47:40.773528 systemd-logind[1526]: Removed session 22.
Jan 13 20:47:45.781908 systemd[1]: Started sshd@20-139.178.70.110:22-147.75.109.163:58840.service - OpenSSH per-connection server daemon (147.75.109.163:58840).
Jan 13 20:47:45.847780 sshd[6276]: Accepted publickey for core from 147.75.109.163 port 58840 ssh2: RSA SHA256:EKWTIpqKlhRTY8gr0WBQ1SLflvK+W4bCLRDtmnb42Eo
Jan 13 20:47:45.848503 sshd-session[6276]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:47:45.850883 systemd-logind[1526]: New session 23 of user core.
Jan 13 20:47:45.859416 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 13 20:47:46.023064 sshd[6278]: Connection closed by 147.75.109.163 port 58840
Jan 13 20:47:46.023451 sshd-session[6276]: pam_unix(sshd:session): session closed for user core
Jan 13 20:47:46.027583 systemd[1]: sshd@20-139.178.70.110:22-147.75.109.163:58840.service: Deactivated successfully.
Jan 13 20:47:46.028538 systemd[1]: session-23.scope: Deactivated successfully.
Jan 13 20:47:46.028923 systemd-logind[1526]: Session 23 logged out. Waiting for processes to exit.
Jan 13 20:47:46.029688 systemd-logind[1526]: Removed session 23.