Jan 13 20:54:24.732405 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 19:01:45 -00 2025 Jan 13 20:54:24.732420 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 20:54:24.732427 kernel: Disabled fast string operations Jan 13 20:54:24.732431 kernel: BIOS-provided physical RAM map: Jan 13 20:54:24.732434 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Jan 13 20:54:24.732438 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Jan 13 20:54:24.732444 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Jan 13 20:54:24.732448 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Jan 13 20:54:24.732452 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Jan 13 20:54:24.732456 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Jan 13 20:54:24.732460 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Jan 13 20:54:24.732464 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Jan 13 20:54:24.732468 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Jan 13 20:54:24.732472 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Jan 13 20:54:24.732478 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Jan 13 20:54:24.732483 kernel: NX (Execute Disable) protection: active Jan 13 20:54:24.732487 kernel: APIC: Static calls initialized Jan 13 20:54:24.732492 kernel: 
SMBIOS 2.7 present. Jan 13 20:54:24.732497 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Jan 13 20:54:24.732502 kernel: vmware: hypercall mode: 0x00 Jan 13 20:54:24.732506 kernel: Hypervisor detected: VMware Jan 13 20:54:24.732511 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Jan 13 20:54:24.732516 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Jan 13 20:54:24.732521 kernel: vmware: using clock offset of 2442757049 ns Jan 13 20:54:24.732525 kernel: tsc: Detected 3408.000 MHz processor Jan 13 20:54:24.732530 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 13 20:54:24.732535 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 13 20:54:24.732540 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Jan 13 20:54:24.732545 kernel: total RAM covered: 3072M Jan 13 20:54:24.732549 kernel: Found optimal setting for mtrr clean up Jan 13 20:54:24.732554 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Jan 13 20:54:24.732560 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Jan 13 20:54:24.732564 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 13 20:54:24.732569 kernel: Using GB pages for direct mapping Jan 13 20:54:24.732574 kernel: ACPI: Early table checksum verification disabled Jan 13 20:54:24.732578 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Jan 13 20:54:24.732583 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Jan 13 20:54:24.732587 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Jan 13 20:54:24.732592 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Jan 13 20:54:24.732596 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Jan 13 20:54:24.732604 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Jan 13 20:54:24.732608 kernel: ACPI: BOOT 
0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Jan 13 20:54:24.732614 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000) Jan 13 20:54:24.732619 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Jan 13 20:54:24.732624 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Jan 13 20:54:24.732629 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Jan 13 20:54:24.732634 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Jan 13 20:54:24.732639 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Jan 13 20:54:24.732644 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Jan 13 20:54:24.732649 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Jan 13 20:54:24.732654 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Jan 13 20:54:24.732659 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Jan 13 20:54:24.732664 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Jan 13 20:54:24.732668 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Jan 13 20:54:24.732673 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Jan 13 20:54:24.732679 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Jan 13 20:54:24.732684 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Jan 13 20:54:24.732689 kernel: system APIC only can use physical flat Jan 13 20:54:24.732693 kernel: APIC: Switched APIC routing to: physical flat Jan 13 20:54:24.732698 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Jan 13 20:54:24.732703 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0 Jan 13 20:54:24.732708 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0 Jan 13 20:54:24.732712 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0 Jan 
13 20:54:24.732717 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0 Jan 13 20:54:24.732723 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0 Jan 13 20:54:24.732728 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0 Jan 13 20:54:24.732732 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0 Jan 13 20:54:24.732737 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0 Jan 13 20:54:24.732742 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0 Jan 13 20:54:24.732746 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0 Jan 13 20:54:24.732751 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0 Jan 13 20:54:24.732756 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0 Jan 13 20:54:24.732760 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0 Jan 13 20:54:24.732765 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0 Jan 13 20:54:24.732770 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0 Jan 13 20:54:24.732778 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0 Jan 13 20:54:24.732801 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0 Jan 13 20:54:24.732806 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0 Jan 13 20:54:24.732810 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0 Jan 13 20:54:24.732830 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0 Jan 13 20:54:24.732835 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0 Jan 13 20:54:24.732840 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0 Jan 13 20:54:24.732844 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0 Jan 13 20:54:24.732849 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0 Jan 13 20:54:24.732854 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0 Jan 13 20:54:24.732860 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0 Jan 13 20:54:24.732864 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0 Jan 13 20:54:24.732869 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0 Jan 13 20:54:24.732874 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0 Jan 13 20:54:24.732879 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0 Jan 13 20:54:24.732884 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0 Jan 13 20:54:24.732888 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0 Jan 13 20:54:24.732893 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0 Jan 13 20:54:24.732898 
kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0 Jan 13 20:54:24.732903 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0 Jan 13 20:54:24.732908 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0 Jan 13 20:54:24.732913 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0 Jan 13 20:54:24.732918 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0 Jan 13 20:54:24.732923 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0 Jan 13 20:54:24.732927 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0 Jan 13 20:54:24.732932 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0 Jan 13 20:54:24.732937 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0 Jan 13 20:54:24.732942 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0 Jan 13 20:54:24.732947 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0 Jan 13 20:54:24.732951 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0 Jan 13 20:54:24.732957 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0 Jan 13 20:54:24.732962 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0 Jan 13 20:54:24.732966 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0 Jan 13 20:54:24.732971 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0 Jan 13 20:54:24.732976 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0 Jan 13 20:54:24.732981 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0 Jan 13 20:54:24.732985 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0 Jan 13 20:54:24.732990 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0 Jan 13 20:54:24.732995 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0 Jan 13 20:54:24.733000 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0 Jan 13 20:54:24.733005 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0 Jan 13 20:54:24.733010 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0 Jan 13 20:54:24.733015 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0 Jan 13 20:54:24.733023 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0 Jan 13 20:54:24.733029 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0 Jan 13 20:54:24.733034 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0 Jan 13 20:54:24.733039 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0 Jan 13 20:54:24.733044 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0 Jan 13 20:54:24.733049 kernel: SRAT: PXM 0 
-> APIC 0x80 -> Node 0 Jan 13 20:54:24.733055 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0 Jan 13 20:54:24.733060 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0 Jan 13 20:54:24.733066 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0 Jan 13 20:54:24.733071 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0 Jan 13 20:54:24.733076 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0 Jan 13 20:54:24.733081 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0 Jan 13 20:54:24.733086 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0 Jan 13 20:54:24.733091 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0 Jan 13 20:54:24.733096 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0 Jan 13 20:54:24.733101 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0 Jan 13 20:54:24.733107 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0 Jan 13 20:54:24.733112 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0 Jan 13 20:54:24.733117 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0 Jan 13 20:54:24.733122 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0 Jan 13 20:54:24.733127 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0 Jan 13 20:54:24.733132 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0 Jan 13 20:54:24.733137 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0 Jan 13 20:54:24.733142 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0 Jan 13 20:54:24.733147 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0 Jan 13 20:54:24.733152 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0 Jan 13 20:54:24.733158 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0 Jan 13 20:54:24.733164 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0 Jan 13 20:54:24.733169 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0 Jan 13 20:54:24.733174 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0 Jan 13 20:54:24.733179 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0 Jan 13 20:54:24.733184 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0 Jan 13 20:54:24.733189 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0 Jan 13 20:54:24.733194 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0 Jan 13 20:54:24.733199 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0 Jan 13 20:54:24.733204 kernel: SRAT: PXM 0 -> APIC 0xbc -> 
Node 0 Jan 13 20:54:24.733209 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0 Jan 13 20:54:24.733215 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0 Jan 13 20:54:24.733220 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0 Jan 13 20:54:24.733225 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0 Jan 13 20:54:24.733230 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0 Jan 13 20:54:24.733235 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0 Jan 13 20:54:24.733240 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0 Jan 13 20:54:24.733245 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0 Jan 13 20:54:24.733250 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0 Jan 13 20:54:24.733255 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0 Jan 13 20:54:24.733260 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0 Jan 13 20:54:24.733266 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0 Jan 13 20:54:24.733272 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0 Jan 13 20:54:24.733277 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0 Jan 13 20:54:24.733443 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0 Jan 13 20:54:24.733449 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0 Jan 13 20:54:24.733454 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0 Jan 13 20:54:24.733459 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0 Jan 13 20:54:24.733464 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0 Jan 13 20:54:24.733469 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0 Jan 13 20:54:24.733474 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0 Jan 13 20:54:24.733481 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0 Jan 13 20:54:24.733486 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0 Jan 13 20:54:24.733492 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0 Jan 13 20:54:24.733497 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0 Jan 13 20:54:24.733502 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0 Jan 13 20:54:24.733507 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0 Jan 13 20:54:24.733512 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0 Jan 13 20:54:24.733517 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0 Jan 13 20:54:24.733522 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0 Jan 13 
20:54:24.733527 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0 Jan 13 20:54:24.733532 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0 Jan 13 20:54:24.733538 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0 Jan 13 20:54:24.733544 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Jan 13 20:54:24.733549 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Jan 13 20:54:24.733554 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Jan 13 20:54:24.733560 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] Jan 13 20:54:24.733565 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff] Jan 13 20:54:24.733571 kernel: Zone ranges: Jan 13 20:54:24.733576 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 13 20:54:24.733581 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Jan 13 20:54:24.733587 kernel: Normal empty Jan 13 20:54:24.733592 kernel: Movable zone start for each node Jan 13 20:54:24.733597 kernel: Early memory node ranges Jan 13 20:54:24.733603 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Jan 13 20:54:24.733608 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Jan 13 20:54:24.733613 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Jan 13 20:54:24.733618 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Jan 13 20:54:24.733623 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 13 20:54:24.733629 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Jan 13 20:54:24.733634 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Jan 13 20:54:24.733640 kernel: ACPI: PM-Timer IO Port: 0x1008 Jan 13 20:54:24.733645 kernel: system APIC only can use physical flat Jan 13 20:54:24.733650 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Jan 13 20:54:24.733656 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Jan 13 20:54:24.733661 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge 
lint[0x1]) Jan 13 20:54:24.733666 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Jan 13 20:54:24.733671 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Jan 13 20:54:24.733676 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Jan 13 20:54:24.733682 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Jan 13 20:54:24.733688 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Jan 13 20:54:24.733693 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Jan 13 20:54:24.733698 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Jan 13 20:54:24.733703 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Jan 13 20:54:24.733708 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Jan 13 20:54:24.733713 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Jan 13 20:54:24.733718 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Jan 13 20:54:24.733723 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Jan 13 20:54:24.733729 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Jan 13 20:54:24.733734 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Jan 13 20:54:24.733740 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Jan 13 20:54:24.733745 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Jan 13 20:54:24.733750 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Jan 13 20:54:24.733755 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Jan 13 20:54:24.733760 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Jan 13 20:54:24.733765 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Jan 13 20:54:24.733771 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Jan 13 20:54:24.733779 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Jan 13 20:54:24.733784 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Jan 13 20:54:24.733790 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge 
lint[0x1]) Jan 13 20:54:24.733796 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Jan 13 20:54:24.733801 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Jan 13 20:54:24.733806 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Jan 13 20:54:24.733811 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Jan 13 20:54:24.733817 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Jan 13 20:54:24.733822 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Jan 13 20:54:24.733827 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Jan 13 20:54:24.733832 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Jan 13 20:54:24.733837 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Jan 13 20:54:24.733843 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Jan 13 20:54:24.733849 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Jan 13 20:54:24.733854 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Jan 13 20:54:24.733859 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Jan 13 20:54:24.733864 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Jan 13 20:54:24.733869 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Jan 13 20:54:24.733875 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Jan 13 20:54:24.733880 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Jan 13 20:54:24.733885 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Jan 13 20:54:24.733890 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Jan 13 20:54:24.733896 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Jan 13 20:54:24.733901 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Jan 13 20:54:24.733906 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Jan 13 20:54:24.733912 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Jan 13 20:54:24.733917 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge 
lint[0x1]) Jan 13 20:54:24.733922 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Jan 13 20:54:24.733927 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Jan 13 20:54:24.733932 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Jan 13 20:54:24.733937 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Jan 13 20:54:24.733943 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Jan 13 20:54:24.733949 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Jan 13 20:54:24.733954 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Jan 13 20:54:24.733959 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Jan 13 20:54:24.733964 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Jan 13 20:54:24.733969 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Jan 13 20:54:24.733974 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Jan 13 20:54:24.733979 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Jan 13 20:54:24.733984 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Jan 13 20:54:24.733990 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Jan 13 20:54:24.733996 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Jan 13 20:54:24.734001 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Jan 13 20:54:24.734006 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Jan 13 20:54:24.734011 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Jan 13 20:54:24.734016 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Jan 13 20:54:24.734225 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Jan 13 20:54:24.734231 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Jan 13 20:54:24.734237 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Jan 13 20:54:24.734242 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Jan 13 20:54:24.734247 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge 
lint[0x1]) Jan 13 20:54:24.734254 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Jan 13 20:54:24.734259 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Jan 13 20:54:24.734264 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Jan 13 20:54:24.734269 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Jan 13 20:54:24.734275 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Jan 13 20:54:24.734288 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Jan 13 20:54:24.734294 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Jan 13 20:54:24.734299 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Jan 13 20:54:24.734304 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Jan 13 20:54:24.734309 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Jan 13 20:54:24.734316 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Jan 13 20:54:24.734321 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Jan 13 20:54:24.734326 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Jan 13 20:54:24.734331 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Jan 13 20:54:24.734336 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Jan 13 20:54:24.734341 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Jan 13 20:54:24.734347 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Jan 13 20:54:24.734352 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Jan 13 20:54:24.734357 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Jan 13 20:54:24.734363 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Jan 13 20:54:24.734368 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Jan 13 20:54:24.734374 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Jan 13 20:54:24.734379 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Jan 13 20:54:24.734384 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge 
lint[0x1]) Jan 13 20:54:24.734389 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Jan 13 20:54:24.734394 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Jan 13 20:54:24.734399 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Jan 13 20:54:24.734404 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Jan 13 20:54:24.734409 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Jan 13 20:54:24.734415 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Jan 13 20:54:24.734421 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Jan 13 20:54:24.734426 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Jan 13 20:54:24.734431 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Jan 13 20:54:24.734436 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Jan 13 20:54:24.734441 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Jan 13 20:54:24.734446 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Jan 13 20:54:24.734451 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Jan 13 20:54:24.734456 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Jan 13 20:54:24.734461 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Jan 13 20:54:24.734468 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Jan 13 20:54:24.734473 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Jan 13 20:54:24.734478 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Jan 13 20:54:24.734483 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Jan 13 20:54:24.734488 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Jan 13 20:54:24.734493 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Jan 13 20:54:24.734498 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Jan 13 20:54:24.734503 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Jan 13 20:54:24.734509 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge 
lint[0x1]) Jan 13 20:54:24.734514 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Jan 13 20:54:24.734520 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Jan 13 20:54:24.734525 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Jan 13 20:54:24.734530 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Jan 13 20:54:24.734535 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Jan 13 20:54:24.734540 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Jan 13 20:54:24.734546 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Jan 13 20:54:24.734551 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 13 20:54:24.734556 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Jan 13 20:54:24.734561 kernel: TSC deadline timer available Jan 13 20:54:24.734568 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Jan 13 20:54:24.734573 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Jan 13 20:54:24.734578 kernel: Booting paravirtualized kernel on VMware hypervisor Jan 13 20:54:24.734584 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 13 20:54:24.734589 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Jan 13 20:54:24.734594 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Jan 13 20:54:24.734600 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Jan 13 20:54:24.734605 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Jan 13 20:54:24.734610 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Jan 13 20:54:24.734616 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Jan 13 20:54:24.734621 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Jan 13 20:54:24.734627 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Jan 13 20:54:24.734638 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Jan 13 20:54:24.734645 
kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Jan 13 20:54:24.734650 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Jan 13 20:54:24.734655 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Jan 13 20:54:24.734661 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Jan 13 20:54:24.734668 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Jan 13 20:54:24.734673 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Jan 13 20:54:24.734679 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Jan 13 20:54:24.734684 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Jan 13 20:54:24.734689 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Jan 13 20:54:24.734695 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Jan 13 20:54:24.734701 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 20:54:24.734707 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Jan 13 20:54:24.734714 kernel: random: crng init done Jan 13 20:54:24.734719 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Jan 13 20:54:24.734724 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Jan 13 20:54:24.734730 kernel: printk: log_buf_len min size: 262144 bytes Jan 13 20:54:24.734735 kernel: printk: log_buf_len: 1048576 bytes Jan 13 20:54:24.734741 kernel: printk: early log buf free: 239648(91%) Jan 13 20:54:24.734746 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 13 20:54:24.734752 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 13 20:54:24.734757 kernel: Fallback order for Node 0: 0 Jan 13 20:54:24.734763 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Jan 13 20:54:24.734769 kernel: Policy zone: DMA32 Jan 13 20:54:24.734775 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 13 20:54:24.734781 kernel: Memory: 1936376K/2096628K available (12288K kernel code, 2299K rwdata, 22736K rodata, 42976K init, 2216K bss, 159992K reserved, 0K cma-reserved) Jan 13 20:54:24.734788 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Jan 13 20:54:24.734793 kernel: ftrace: allocating 37920 entries in 149 pages Jan 13 20:54:24.734800 kernel: ftrace: allocated 149 pages with 4 groups Jan 13 20:54:24.734806 kernel: Dynamic Preempt: voluntary Jan 13 20:54:24.734812 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 13 20:54:24.734818 kernel: rcu: RCU event tracing is enabled. Jan 13 20:54:24.734824 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Jan 13 20:54:24.734829 kernel: Trampoline variant of Tasks RCU enabled. Jan 13 20:54:24.734835 kernel: Rude variant of Tasks RCU enabled. Jan 13 20:54:24.734840 kernel: Tracing variant of Tasks RCU enabled. Jan 13 20:54:24.734846 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Jan 13 20:54:24.734852 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Jan 13 20:54:24.734858 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Jan 13 20:54:24.734863 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Jan 13 20:54:24.734869 kernel: Console: colour VGA+ 80x25 Jan 13 20:54:24.734874 kernel: printk: console [tty0] enabled Jan 13 20:54:24.734880 kernel: printk: console [ttyS0] enabled Jan 13 20:54:24.734885 kernel: ACPI: Core revision 20230628 Jan 13 20:54:24.734891 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Jan 13 20:54:24.734897 kernel: APIC: Switch to symmetric I/O mode setup Jan 13 20:54:24.734902 kernel: x2apic enabled Jan 13 20:54:24.734909 kernel: APIC: Switched APIC routing to: physical x2apic Jan 13 20:54:24.734915 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 13 20:54:24.734921 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:54:24.734926 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Jan 13 20:54:24.734932 kernel: Disabled fast string operations Jan 13 20:54:24.734937 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Jan 13 20:54:24.734943 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Jan 13 20:54:24.734949 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 13 20:54:24.734954 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Jan 13 20:54:24.734961 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Jan 13 20:54:24.734966 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 13 20:54:24.734972 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Jan 13 20:54:24.734977 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 13 20:54:24.734983 kernel: RETBleed: Mitigation: Enhanced IBRS Jan 13 20:54:24.734989 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 13 20:54:24.734994 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 13 20:54:24.735000 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Jan 13 20:54:24.735005 kernel: SRBDS: Unknown: Dependent on hypervisor status Jan 13 20:54:24.735012 kernel: GDS: Unknown: Dependent on hypervisor status Jan 13 20:54:24.735017 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 13 20:54:24.735023 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 13 20:54:24.735029 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 13 20:54:24.735034 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 13 20:54:24.735040 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
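The calibration line above reports 6816.00 BogoMIPS with lpj=3408000 on a 3408 MHz TSC. BogoMIPS is conventionally loops_per_jiffy * HZ / 500000; assuming HZ=1000 (a common distro configuration, not stated anywhere in this log), the figures line up, including the two-CPU total reported later during SMP bring-up:

```python
# Check the BogoMIPS figure against loops-per-jiffy from the log.
# HZ=1000 is an assumption; the boot log does not print the tick rate.
lpj = 3_408_000          # "(lpj=3408000)"
HZ = 1000                # assumed kernel tick frequency

bogomips = lpj * HZ / 500_000
print(f"{bogomips:.2f}")      # 6816.00, matching this entry

# Two CPUs come online later in the log; the reported total is the
# per-CPU value summed: "Total of 2 processors activated (13632.00 BogoMIPS)"
print(f"{2 * bogomips:.2f}")  # 13632.00
```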
Jan 13 20:54:24.735045 kernel: Freeing SMP alternatives memory: 32K Jan 13 20:54:24.735051 kernel: pid_max: default: 131072 minimum: 1024 Jan 13 20:54:24.735056 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Jan 13 20:54:24.735063 kernel: landlock: Up and running. Jan 13 20:54:24.735068 kernel: SELinux: Initializing. Jan 13 20:54:24.735074 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:54:24.735079 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:54:24.735085 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Jan 13 20:54:24.735091 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:54:24.735096 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:54:24.735102 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:54:24.735108 kernel: Performance Events: Skylake events, core PMU driver. Jan 13 20:54:24.735114 kernel: core: CPUID marked event: 'cpu cycles' unavailable Jan 13 20:54:24.735120 kernel: core: CPUID marked event: 'instructions' unavailable Jan 13 20:54:24.735125 kernel: core: CPUID marked event: 'bus cycles' unavailable Jan 13 20:54:24.735131 kernel: core: CPUID marked event: 'cache references' unavailable Jan 13 20:54:24.735136 kernel: core: CPUID marked event: 'cache misses' unavailable Jan 13 20:54:24.735141 kernel: core: CPUID marked event: 'branch instructions' unavailable Jan 13 20:54:24.735147 kernel: core: CPUID marked event: 'branch misses' unavailable Jan 13 20:54:24.735152 kernel: ... version: 1 Jan 13 20:54:24.735159 kernel: ... bit width: 48 Jan 13 20:54:24.735164 kernel: ... generic registers: 4 Jan 13 20:54:24.735170 kernel: ... value mask: 0000ffffffffffff Jan 13 20:54:24.735175 kernel: ... 
max period: 000000007fffffff Jan 13 20:54:24.735181 kernel: ... fixed-purpose events: 0 Jan 13 20:54:24.735188 kernel: ... event mask: 000000000000000f Jan 13 20:54:24.735193 kernel: signal: max sigframe size: 1776 Jan 13 20:54:24.735199 kernel: rcu: Hierarchical SRCU implementation. Jan 13 20:54:24.735204 kernel: rcu: Max phase no-delay instances is 400. Jan 13 20:54:24.735211 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 20:54:24.735216 kernel: smp: Bringing up secondary CPUs ... Jan 13 20:54:24.735222 kernel: smpboot: x86: Booting SMP configuration: Jan 13 20:54:24.735228 kernel: .... node #0, CPUs: #1 Jan 13 20:54:24.735233 kernel: Disabled fast string operations Jan 13 20:54:24.735239 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 20:54:24.735244 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 20:54:24.735249 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 20:54:24.735255 kernel: smpboot: Max logical packages: 128 Jan 13 20:54:24.735260 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 20:54:24.735267 kernel: devtmpfs: initialized Jan 13 20:54:24.735273 kernel: x86/mm: Memory block size: 128MB Jan 13 20:54:24.735288 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 20:54:24.735294 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 20:54:24.735300 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 20:54:24.735305 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 20:54:24.735311 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 20:54:24.735316 kernel: audit: initializing netlink subsys (disabled) Jan 13 20:54:24.735322 kernel: audit: type=2000 audit(1736801663.066:1): state=initialized audit_enabled=0 res=1 Jan 13 20:54:24.735329 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 20:54:24.735334 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 20:54:24.735340 kernel: cpuidle: using governor menu Jan 13 20:54:24.735345 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 20:54:24.735351 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 20:54:24.735356 kernel: dca service started, version 1.12.1 Jan 13 20:54:24.735362 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 20:54:24.735368 kernel: PCI: Using configuration type 1 for base access Jan 13 20:54:24.735373 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 20:54:24.735380 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 20:54:24.735386 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 20:54:24.735392 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 20:54:24.735397 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 20:54:24.735403 kernel: ACPI: Added _OSI(Module Device) Jan 13 20:54:24.735408 kernel: ACPI: Added _OSI(Processor Device) Jan 13 20:54:24.735414 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 20:54:24.735419 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 20:54:24.735425 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 20:54:24.735431 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 20:54:24.735437 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 20:54:24.735442 kernel: ACPI: Interpreter enabled Jan 13 20:54:24.735448 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 20:54:24.735454 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 20:54:24.735459 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 20:54:24.735465 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 20:54:24.735470 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 20:54:24.735476 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 20:54:24.735550 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 20:54:24.737272 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 20:54:24.739352 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 20:54:24.739362 kernel: PCI host bridge to bus 0000:00 Jan 13 20:54:24.739414 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 20:54:24.739460 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 20:54:24.739508 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 20:54:24.739551 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 20:54:24.739594 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 20:54:24.739640 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 20:54:24.739698 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 20:54:24.739753 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 20:54:24.739812 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 20:54:24.739865 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 20:54:24.739915 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 20:54:24.739964 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 20:54:24.740013 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 20:54:24.740061 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 20:54:24.740109 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 20:54:24.740163 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 20:54:24.740211 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 20:54:24.740259 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 20:54:24.740327 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 20:54:24.740377 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 20:54:24.740425 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 20:54:24.740480 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 20:54:24.740529 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 20:54:24.740577 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 20:54:24.740625 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 20:54:24.740673 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 20:54:24.740720 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 20:54:24.740772 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 20:54:24.740869 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.740932 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.740986 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.741034 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.741086 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.741134 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.741191 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.741241 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.742346 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.742404 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.742459 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.742508 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 13 20:54:24.742561 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.742613 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.742666 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.742714 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.742782 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.742846 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.742901 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.742949 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743002 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743051 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743104 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743154 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743247 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743314 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743367 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743432 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743498 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743547 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743602 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743651 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743703 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743752 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743804 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743853 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 13 20:54:24.743907 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.743956 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.744008 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.746342 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.746405 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.746455 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.746506 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.746557 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.746609 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.746656 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.746707 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.746755 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.746807 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.746858 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.746909 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.746957 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.747008 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.747056 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.747108 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.747159 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.747212 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.747261 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.748326 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 13 
20:54:24.748379 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.748432 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.748484 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.748537 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.748587 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.748636 kernel: pci_bus 0000:01: extended config space not accessible Jan 13 20:54:24.748686 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:54:24.748735 kernel: pci_bus 0000:02: extended config space not accessible Jan 13 20:54:24.748744 kernel: acpiphp: Slot [32] registered Jan 13 20:54:24.748752 kernel: acpiphp: Slot [33] registered Jan 13 20:54:24.748758 kernel: acpiphp: Slot [34] registered Jan 13 20:54:24.748763 kernel: acpiphp: Slot [35] registered Jan 13 20:54:24.748769 kernel: acpiphp: Slot [36] registered Jan 13 20:54:24.748799 kernel: acpiphp: Slot [37] registered Jan 13 20:54:24.748806 kernel: acpiphp: Slot [38] registered Jan 13 20:54:24.748812 kernel: acpiphp: Slot [39] registered Jan 13 20:54:24.748818 kernel: acpiphp: Slot [40] registered Jan 13 20:54:24.748824 kernel: acpiphp: Slot [41] registered Jan 13 20:54:24.748831 kernel: acpiphp: Slot [42] registered Jan 13 20:54:24.748852 kernel: acpiphp: Slot [43] registered Jan 13 20:54:24.748872 kernel: acpiphp: Slot [44] registered Jan 13 20:54:24.748878 kernel: acpiphp: Slot [45] registered Jan 13 20:54:24.748884 kernel: acpiphp: Slot [46] registered Jan 13 20:54:24.748889 kernel: acpiphp: Slot [47] registered Jan 13 20:54:24.748911 kernel: acpiphp: Slot [48] registered Jan 13 20:54:24.748917 kernel: acpiphp: Slot [49] registered Jan 13 20:54:24.748923 kernel: acpiphp: Slot [50] registered Jan 13 20:54:24.748928 kernel: acpiphp: Slot [51] registered Jan 13 20:54:24.748951 kernel: acpiphp: Slot [52] registered Jan 13 20:54:24.748957 kernel: acpiphp: Slot [53] registered 
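Enumeration lines such as "pci 0000:00:15.0: [15ad:07a0]" encode the device's domain:bus:device.function address followed by its [vendor:device] ID pair (15ad is VMware's PCI vendor ID). A best-effort sketch of parsing that notation out of one of these lines; the regex targets this log's format, not a general dmesg grammar:

```python
import re

# Parse the domain:bus:device.function address and the [vendor:device]
# ID pair from a PCI enumeration line like those in the log above.
PCI_RE = re.compile(
    r"pci (?P<dom>[0-9a-f]{4}):(?P<bus>[0-9a-f]{2}):"
    r"(?P<dev>[0-9a-f]{2})\.(?P<fn>[0-9a-f]): "
    r"\[(?P<vendor>[0-9a-f]{4}):(?P<device>[0-9a-f]{4})\]"
)

line = "pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400"
m = PCI_RE.search(line)
assert m is not None
print(m["bus"], m["dev"], m["fn"])   # 00 15 0
print(m["vendor"], m["device"])      # 15ad 07a0 (15ad = VMware)
```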
Jan 13 20:54:24.748962 kernel: acpiphp: Slot [54] registered Jan 13 20:54:24.748968 kernel: acpiphp: Slot [55] registered Jan 13 20:54:24.748974 kernel: acpiphp: Slot [56] registered Jan 13 20:54:24.748979 kernel: acpiphp: Slot [57] registered Jan 13 20:54:24.748984 kernel: acpiphp: Slot [58] registered Jan 13 20:54:24.748990 kernel: acpiphp: Slot [59] registered Jan 13 20:54:24.748995 kernel: acpiphp: Slot [60] registered Jan 13 20:54:24.749002 kernel: acpiphp: Slot [61] registered Jan 13 20:54:24.749008 kernel: acpiphp: Slot [62] registered Jan 13 20:54:24.749014 kernel: acpiphp: Slot [63] registered Jan 13 20:54:24.749064 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 13 20:54:24.749113 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:54:24.749160 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:54:24.749208 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:54:24.749256 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 13 20:54:24.750336 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 13 20:54:24.750386 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 13 20:54:24.750434 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 13 20:54:24.750482 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 13 20:54:24.750535 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 13 20:54:24.750585 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 13 20:54:24.750634 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 13 20:54:24.750687 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:54:24.750735 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 13 
20:54:24.750802 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 13 20:54:24.750866 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:54:24.750914 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:54:24.750962 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:54:24.751011 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:54:24.751060 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:54:24.751112 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:54:24.751159 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:54:24.751208 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:54:24.751256 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:54:24.752954 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:54:24.753007 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:54:24.753057 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:54:24.753109 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:54:24.753158 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:54:24.753207 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:54:24.753255 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:54:24.753327 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:54:24.753382 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:54:24.753430 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:54:24.753478 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:54:24.753526 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:54:24.753573 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 13 20:54:24.753621 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:54:24.753669 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:54:24.753716 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:54:24.753767 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:54:24.753820 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 13 20:54:24.753896 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 13 20:54:24.753945 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 13 20:54:24.753993 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 13 20:54:24.754041 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 13 20:54:24.754089 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:54:24.754141 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 13 20:54:24.754189 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 13 20:54:24.754238 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 13 20:54:24.754584 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:54:24.754643 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:54:24.754692 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 13 20:54:24.754741 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:54:24.754789 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:54:24.754839 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:54:24.754919 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:54:24.754967 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:54:24.755016 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:54:24.755063 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:54:24.755110 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:54:24.755158 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:54:24.755208 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:54:24.755255 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:54:24.755626 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:54:24.755682 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:54:24.755733 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:54:24.755787 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:54:24.755837 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:54:24.755885 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:54:24.755936 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:54:24.755988 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:54:24.756036 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:54:24.756084 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:54:24.756133 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:54:24.756182 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:54:24.756231 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:54:24.756295 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:54:24.756350 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:54:24.756401 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:54:24.756451 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:54:24.756499 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:54:24.756547 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:54:24.756594 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:54:24.756643 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:54:24.756692 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:54:24.756742 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:54:24.756795 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:54:24.756845 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:54:24.756892 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:54:24.756940 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:54:24.756988 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:54:24.757037 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:54:24.757084 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:54:24.757135 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:54:24.757184 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:54:24.757232 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:54:24.757288 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:54:24.757342 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:54:24.757390 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:54:24.757439 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:54:24.757487 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:54:24.757539 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:54:24.757587 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:54:24.757635 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:54:24.757683 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:54:24.757731 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:54:24.757779 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:54:24.757827 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:54:24.757904 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:54:24.757969 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:54:24.758018 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:54:24.758067 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:54:24.758115 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:54:24.758164 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:54:24.758211 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:54:24.758259 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:54:24.758323 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 
20:54:24.758373 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:54:24.758421 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:54:24.758470 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:54:24.758518 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:54:24.758566 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:54:24.758616 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:54:24.758665 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:54:24.758716 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:54:24.758765 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:54:24.758819 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:54:24.758867 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:54:24.758876 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jan 13 20:54:24.758882 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jan 13 20:54:24.758888 kernel: ACPI: PCI: Interrupt link LNKB disabled Jan 13 20:54:24.758894 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 13 20:54:24.758899 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jan 13 20:54:24.758907 kernel: iommu: Default domain type: Translated Jan 13 20:54:24.758913 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 13 20:54:24.758918 kernel: PCI: Using ACPI for IRQ routing Jan 13 20:54:24.758924 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 13 20:54:24.758930 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jan 13 20:54:24.758935 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jan 13 20:54:24.758983 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jan 13 20:54:24.759031 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Jan 13 20:54:24.759078 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 13 20:54:24.759088 kernel: vgaarb: loaded Jan 13 20:54:24.759095 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jan 13 20:54:24.759100 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jan 13 20:54:24.759106 kernel: clocksource: Switched to clocksource tsc-early Jan 13 20:54:24.759111 kernel: VFS: Disk quotas dquot_6.6.0 Jan 13 20:54:24.759117 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 13 20:54:24.759123 kernel: pnp: PnP ACPI init Jan 13 20:54:24.759176 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jan 13 20:54:24.759224 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jan 13 20:54:24.759268 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jan 13 20:54:24.759333 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jan 13 20:54:24.759381 kernel: pnp 00:06: [dma 2] Jan 13 20:54:24.759430 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jan 13 20:54:24.759475 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jan 13 20:54:24.759518 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jan 13 20:54:24.759529 kernel: pnp: PnP ACPI: found 8 devices Jan 13 20:54:24.759535 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 13 20:54:24.759541 kernel: NET: Registered PF_INET protocol family Jan 13 20:54:24.759546 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 13 20:54:24.759552 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jan 13 20:54:24.759558 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 13 20:54:24.759564 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 13 20:54:24.759569 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jan 13 20:54:24.759576 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jan 13 20:54:24.759582 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:54:24.759588 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:54:24.759603 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 13 20:54:24.759610 kernel: NET: Registered PF_XDP protocol family Jan 13 20:54:24.759666 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jan 13 20:54:24.759734 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 13 20:54:24.759801 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 13 20:54:24.759870 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 13 20:54:24.759920 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 13 20:54:24.759970 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 13 20:54:24.760189 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 13 20:54:24.760375 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 13 20:54:24.760441 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 13 20:54:24.760497 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 13 20:54:24.760547 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 13 20:54:24.760596 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 13 20:54:24.760645 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 13 
20:54:24.760694 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 13 20:54:24.760746 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 13 20:54:24.760820 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 13 20:54:24.760885 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 13 20:54:24.760934 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 13 20:54:24.760985 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 13 20:54:24.761034 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 13 20:54:24.761085 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 13 20:54:24.761134 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 13 20:54:24.761183 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jan 13 20:54:24.761232 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:54:24.761291 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:54:24.761346 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.761395 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.761448 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.761497 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.761546 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.761594 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.761644 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.761692 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 
13 20:54:24.761741 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.761794 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.761846 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.761894 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.761943 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762017 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762069 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762119 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762168 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762216 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762269 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762396 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762445 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762493 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762541 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762589 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762638 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762686 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762738 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762787 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762836 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762884 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.762933 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.762982 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763031 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763084 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763166 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763221 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763271 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763358 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763407 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763456 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763504 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763552 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763604 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763653 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763711 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763759 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763808 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763889 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.763938 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.763989 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.764037 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.764095 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.764180 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Jan 13 20:54:24.764230 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.766328 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.766387 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.766439 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.766490 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.766540 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.766589 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.766638 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.766691 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.766740 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.766789 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.766855 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.766905 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.766955 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.767004 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.767053 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.767102 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.767152 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.767204 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.767254 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.769327 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.769387 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.769440 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.769491 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.769541 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.769591 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.769641 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.769695 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.769744 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.769794 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:54:24.769843 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:54:24.769909 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:54:24.769958 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jan 13 20:54:24.770007 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:54:24.770074 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:54:24.770124 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:54:24.770179 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Jan 13 20:54:24.770228 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:54:24.770348 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:54:24.770407 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:54:24.770457 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:54:24.770506 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:54:24.770555 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:54:24.770620 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:54:24.770688 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 
20:54:24.770754 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:54:24.770804 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:54:24.770854 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:54:24.770918 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:54:24.770966 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:54:24.771015 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:54:24.771064 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:54:24.771112 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:54:24.771161 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:54:24.771212 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:54:24.771262 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:54:24.771330 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:54:24.771378 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:54:24.771427 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:54:24.771475 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 13 20:54:24.771528 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:54:24.771576 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:54:24.771624 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:54:24.771672 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:54:24.771741 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 13 20:54:24.771810 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:54:24.771859 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:54:24.771908 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 13 20:54:24.771957 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:54:24.772027 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:54:24.772092 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:54:24.772140 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:54:24.772189 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:54:24.772256 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:54:24.772670 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:54:24.772741 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:54:24.772798 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:54:24.772850 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:54:24.772905 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:54:24.772954 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:54:24.773004 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:54:24.773054 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:54:24.773103 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:54:24.773153 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:54:24.773203 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:54:24.773252 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:54:24.773316 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:54:24.773367 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:54:24.773420 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:54:24.773471 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:54:24.773521 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:54:24.773586 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:54:24.773636 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:54:24.773685 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:54:24.773733 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:54:24.773782 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:54:24.773864 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:54:24.774030 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:54:24.774085 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:54:24.774137 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:54:24.774187 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:54:24.774237 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:54:24.774331 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:54:24.774383 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:54:24.774433 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:54:24.774483 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:54:24.774533 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:54:24.774586 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:54:24.774636 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:54:24.774686 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:54:24.774750 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:54:24.774804 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:54:24.774853 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
20:54:24.774903 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:54:24.774951 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:54:24.775000 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:54:24.775052 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:54:24.775100 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:54:24.775148 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:54:24.775198 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:54:24.775247 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:54:24.775302 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:54:24.775352 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:54:24.775403 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:54:24.775452 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:54:24.775501 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:54:24.775553 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:54:24.775602 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:54:24.775651 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:54:24.775700 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:54:24.775767 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:54:24.775816 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:54:24.775865 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:54:24.775916 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 20:54:24.775966 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:54:24.776018 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 20:54:24.776068 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:54:24.776119 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:54:24.776168 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:54:24.776219 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:54:24.776269 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:54:24.776400 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:54:24.776453 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:54:24.776503 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:54:24.776553 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:54:24.776605 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:54:24.776651 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:54:24.776696 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:54:24.776739 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:54:24.776788 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:54:24.776837 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 20:54:24.776883 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 20:54:24.776932 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:54:24.776977 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:54:24.777023 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:54:24.777069 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:54:24.777114 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:54:24.777159 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:54:24.777209 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 20:54:24.777255 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 20:54:24.777312 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:54:24.777363 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 20:54:24.777410 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 20:54:24.777456 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:54:24.777508 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 20:54:24.777555 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 20:54:24.777603 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:54:24.777652 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 20:54:24.777698 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:54:24.777748 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 20:54:24.777795 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:54:24.777844 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 20:54:24.777891 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:54:24.777943 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 20:54:24.777989 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:54:24.778041 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 20:54:24.778096 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:54:24.778149 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 20:54:24.778198 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 20:54:24.778245 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:54:24.778349 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 20:54:24.778398 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 20:54:24.778444 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:54:24.778495 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 20:54:24.778541 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 20:54:24.778593 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:54:24.778646 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 20:54:24.778692 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:54:24.778743 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 20:54:24.778796 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:54:24.778846 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 20:54:24.778897 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:54:24.778948 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 20:54:24.778995 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:54:24.779046 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 20:54:24.779092 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:54:24.779142 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 20:54:24.779188 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 20:54:24.779237 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:54:24.779295 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 20:54:24.779343 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 20:54:24.779389 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:54:24.779438 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 20:54:24.779485 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 20:54:24.779534 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:54:24.779587 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 20:54:24.779634 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:54:24.779730 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 20:54:24.779779 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:54:24.779830 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 20:54:24.779888 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:54:24.779943 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 20:54:24.779990 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:54:24.780041 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 20:54:24.780088 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:54:24.780142 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 20:54:24.780191 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 20:54:24.780238 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:54:24.781850 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 20:54:24.781911 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 20:54:24.781961 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:54:24.782013 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 20:54:24.782061 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:54:24.782118 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 20:54:24.782165 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:54:24.782216 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 20:54:24.782263 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:54:24.782630 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 20:54:24.782683 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:54:24.782738 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 20:54:24.782786 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:54:24.782838 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 20:54:24.782891 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:54:24.782948 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 20:54:24.782958 kernel: PCI: CLS 32 bytes, default 64 Jan 13 20:54:24.782966 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 20:54:24.782974 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:54:24.782981 kernel: clocksource: Switched to clocksource tsc Jan 13 20:54:24.782988 kernel: Initialise system trusted keyrings Jan 13 20:54:24.782994 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 20:54:24.783000 kernel: Key type asymmetric registered Jan 13 20:54:24.783007 kernel: Asymmetric key parser 'x509' registered Jan 13 20:54:24.783013 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 20:54:24.783019 kernel: io scheduler mq-deadline registered Jan 13 20:54:24.783026 kernel: io scheduler kyber registered Jan 13 20:54:24.783033 kernel: io scheduler bfq registered Jan 13 20:54:24.783087 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 20:54:24.783141 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.783196 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 20:54:24.783247 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.783981 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 20:54:24.784043 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.784102 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 20:54:24.784157 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.784210 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 20:54:24.784263 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.784418 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 20:54:24.785667 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.785729 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 20:54:24.785784 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.785839 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 20:54:24.785890 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.785943 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 20:54:24.785995 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.786050 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 20:54:24.786101 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.786155 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 20:54:24.786206 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.786258 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 20:54:24.787381 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787447 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 20:54:24.787502 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787555 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 20:54:24.787607 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787660 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 20:54:24.787715 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787769 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 20:54:24.787827 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787880 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 20:54:24.787932 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787985 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 20:54:24.788037 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.788092 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 20:54:24.788143 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789480 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 20:54:24.789548 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789606 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 20:54:24.789661 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789719 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 20:54:24.789772 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789824 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 20:54:24.789876 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789927 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 20:54:24.789980 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790033 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 20:54:24.790084 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790137 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 20:54:24.790189 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790241 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 20:54:24.790344 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790398 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 20:54:24.790450 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790502 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 20:54:24.790553 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790604 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 20:54:24.790659 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790711 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 20:54:24.790762 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790814 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 20:54:24.790866 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790877 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 20:54:24.790884 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 20:54:24.790891 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 20:54:24.790899 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 20:54:24.790905 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 20:54:24.790912 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 20:54:24.790965 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 20:54:24.791013 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T20:54:24 UTC (1736801664) Jan 13 20:54:24.791024 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 20:54:24.791070 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 20:54:24.791078 kernel: intel_pstate: CPU model not supported Jan 13 20:54:24.791085 kernel: NET: Registered PF_INET6 protocol family Jan 13 20:54:24.791091 kernel: Segment Routing with IPv6 Jan 13 20:54:24.791098 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 20:54:24.791104 kernel: NET: Registered PF_PACKET protocol family Jan 13 20:54:24.791111 kernel: Key type dns_resolver registered Jan 13 20:54:24.791117 kernel: IPI shorthand broadcast: enabled Jan 13 20:54:24.791125 kernel: sched_clock: Marking stable (872198871, 220655983)->(1150998113, -58143259) Jan 13 20:54:24.791131 kernel: registered taskstats version 1 Jan 13 20:54:24.791138 kernel: Loading compiled-in X.509 certificates Jan 13 20:54:24.791144 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 98739e9049f62881f4df7ffd1e39335f7f55b344' Jan 13 20:54:24.791150 kernel: Key type .fscrypt registered Jan 13 20:54:24.791156 kernel: Key type fscrypt-provisioning registered Jan 13 20:54:24.791162 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 20:54:24.791169 kernel: ima: Allocated hash algorithm: sha1 Jan 13 20:54:24.791176 kernel: ima: No architecture policies found Jan 13 20:54:24.791182 kernel: clk: Disabling unused clocks Jan 13 20:54:24.791188 kernel: Freeing unused kernel image (initmem) memory: 42976K Jan 13 20:54:24.791195 kernel: Write protecting the kernel read-only data: 36864k Jan 13 20:54:24.791201 kernel: Freeing unused kernel image (rodata/data gap) memory: 1840K Jan 13 20:54:24.791207 kernel: Run /init as init process Jan 13 20:54:24.791213 kernel: with arguments: Jan 13 20:54:24.791220 kernel: /init Jan 13 20:54:24.791226 kernel: with environment: Jan 13 20:54:24.791232 kernel: HOME=/ Jan 13 20:54:24.791240 kernel: TERM=linux Jan 13 20:54:24.791246 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 20:54:24.791254 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:54:24.791262 systemd[1]: Detected virtualization vmware. Jan 13 20:54:24.791269 systemd[1]: Detected architecture x86-64. Jan 13 20:54:24.791276 systemd[1]: Running in initrd. Jan 13 20:54:24.791502 systemd[1]: No hostname configured, using default hostname. Jan 13 20:54:24.791510 systemd[1]: Hostname set to . Jan 13 20:54:24.791518 systemd[1]: Initializing machine ID from random generator. Jan 13 20:54:24.791524 systemd[1]: Queued start job for default target initrd.target. Jan 13 20:54:24.791531 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:54:24.791537 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 20:54:24.791544 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 20:54:24.791551 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:54:24.791557 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 20:54:24.791565 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 20:54:24.791573 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 20:54:24.791580 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 20:54:24.791587 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:54:24.791593 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:54:24.791601 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:54:24.791607 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:54:24.791615 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:54:24.791621 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:54:24.791628 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:54:24.791634 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:54:24.791641 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 20:54:24.791647 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 20:54:24.791654 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:54:24.791660 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:54:24.791668 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 20:54:24.791675 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:54:24.791681 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 20:54:24.791688 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:54:24.791694 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 20:54:24.791701 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 20:54:24.791707 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:54:24.791714 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:54:24.791720 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:54:24.791728 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 20:54:24.791748 systemd-journald[216]: Collecting audit messages is disabled. Jan 13 20:54:24.791765 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:54:24.791772 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 20:54:24.791789 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:54:24.791796 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:54:24.791803 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 20:54:24.791810 kernel: Bridge firewalling registered Jan 13 20:54:24.791818 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:54:24.791824 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:54:24.791831 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:54:24.791838 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 13 20:54:24.791845 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:54:24.791851 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:54:24.791858 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:54:24.791865 systemd-journald[216]: Journal started Jan 13 20:54:24.791881 systemd-journald[216]: Runtime Journal (/run/log/journal/5c9b62d1b3e940c5b774b14501140017) is 4.8M, max 38.7M, 33.8M free. Jan 13 20:54:24.744308 systemd-modules-load[217]: Inserted module 'overlay' Jan 13 20:54:24.764068 systemd-modules-load[217]: Inserted module 'br_netfilter' Jan 13 20:54:24.794316 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:54:24.795430 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:54:24.800418 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 20:54:24.802840 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 20:54:24.806589 dracut-cmdline[246]: dracut-dracut-053 Jan 13 20:54:24.807880 dracut-cmdline[246]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 20:54:24.809350 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:54:24.814412 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 13 20:54:24.829530 systemd-resolved[267]: Positive Trust Anchors: Jan 13 20:54:24.829540 systemd-resolved[267]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 20:54:24.829561 systemd-resolved[267]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:54:24.831227 systemd-resolved[267]: Defaulting to hostname 'linux'. Jan 13 20:54:24.831813 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:54:24.831966 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:54:24.850295 kernel: SCSI subsystem initialized Jan 13 20:54:24.856289 kernel: Loading iSCSI transport class v2.0-870. Jan 13 20:54:24.863291 kernel: iscsi: registered transport (tcp) Jan 13 20:54:24.876296 kernel: iscsi: registered transport (qla4xxx) Jan 13 20:54:24.876341 kernel: QLogic iSCSI HBA Driver Jan 13 20:54:24.896233 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 20:54:24.900385 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 20:54:24.914337 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 13 20:54:24.914383 kernel: device-mapper: uevent: version 1.0.3 Jan 13 20:54:24.915945 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 13 20:54:24.947296 kernel: raid6: avx2x4 gen() 52448 MB/s Jan 13 20:54:24.964324 kernel: raid6: avx2x2 gen() 52728 MB/s Jan 13 20:54:24.981476 kernel: raid6: avx2x1 gen() 43904 MB/s Jan 13 20:54:24.981497 kernel: raid6: using algorithm avx2x2 gen() 52728 MB/s Jan 13 20:54:24.999493 kernel: raid6: .... xor() 31090 MB/s, rmw enabled Jan 13 20:54:24.999527 kernel: raid6: using avx2x2 recovery algorithm Jan 13 20:54:25.013293 kernel: xor: automatically using best checksumming function avx Jan 13 20:54:25.110298 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 20:54:25.115847 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:54:25.121357 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:54:25.128864 systemd-udevd[435]: Using default interface naming scheme 'v255'. Jan 13 20:54:25.131331 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:54:25.138403 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 20:54:25.145416 dracut-pre-trigger[440]: rd.md=0: removing MD RAID activation Jan 13 20:54:25.161850 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:54:25.166376 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 20:54:25.230982 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:54:25.234387 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 20:54:25.245159 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 20:54:25.245919 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 13 20:54:25.246416 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:54:25.246760 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 20:54:25.250503 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 13 20:54:25.260424 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:54:25.295289 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 13 20:54:25.297297 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 13 20:54:25.300350 kernel: vmw_pvscsi: using 64bit dma Jan 13 20:54:25.300366 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 13 20:54:25.311009 kernel: vmw_pvscsi: max_id: 16 Jan 13 20:54:25.311023 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 13 20:54:25.311031 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 13 20:54:25.311105 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 13 20:54:25.311115 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 13 20:54:25.311123 kernel: vmw_pvscsi: using MSI-X Jan 13 20:54:25.313291 kernel: cryptd: max_cpu_qlen set to 1000 Jan 13 20:54:25.317293 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 13 20:54:25.322615 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 13 20:54:25.322705 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 13 20:54:25.324109 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 13 20:54:25.326863 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:54:25.326935 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:54:25.327119 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:54:25.327221 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Jan 13 20:54:25.327299 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:54:25.327414 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:54:25.331286 kernel: libata version 3.00 loaded. Jan 13 20:54:25.334291 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 13 20:54:25.338917 kernel: AVX2 version of gcm_enc/dec engaged. Jan 13 20:54:25.338928 kernel: scsi host1: ata_piix Jan 13 20:54:25.339000 kernel: scsi host2: ata_piix Jan 13 20:54:25.339059 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 13 20:54:25.339068 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 13 20:54:25.334720 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:54:25.341716 kernel: AES CTR mode by8 optimization enabled Jan 13 20:54:25.350250 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:54:25.357369 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:54:25.365566 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 13 20:54:25.511343 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 13 20:54:25.518345 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 13 20:54:25.529517 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 13 20:54:25.532980 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 13 20:54:25.533176 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 13 20:54:25.533247 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 13 20:54:25.533328 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 13 20:54:25.533399 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:54:25.533408 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 13 20:54:25.549317 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 13 20:54:25.557232 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 13 20:54:25.557244 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 13 20:54:25.561665 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (493) Jan 13 20:54:25.563882 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 13 20:54:25.564288 kernel: BTRFS: device fsid 5e7921ba-229a-48a0-bc77-9b30aaa34aeb devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (495) Jan 13 20:54:25.566564 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 13 20:54:25.569203 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 20:54:25.571270 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 13 20:54:25.571396 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 13 20:54:25.580351 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 13 20:54:25.613315 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:54:26.622302 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:54:26.622950 disk-uuid[589]: The operation has completed successfully. Jan 13 20:54:26.658831 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 20:54:26.658900 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 20:54:26.662378 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 13 20:54:26.664179 sh[608]: Success Jan 13 20:54:26.673301 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 13 20:54:26.733179 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 20:54:26.734366 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 20:54:26.734669 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 20:54:26.751782 kernel: BTRFS info (device dm-0): first mount of filesystem 5e7921ba-229a-48a0-bc77-9b30aaa34aeb Jan 13 20:54:26.751820 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:54:26.751829 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 20:54:26.752881 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 20:54:26.754346 kernel: BTRFS info (device dm-0): using free space tree Jan 13 20:54:26.761298 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 13 20:54:26.763430 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 13 20:54:26.771381 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 13 20:54:26.772532 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 13 20:54:26.796158 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:54:26.796200 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:54:26.796214 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:54:26.815298 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:54:26.820723 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 20:54:26.822353 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:54:26.825985 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 13 20:54:26.829446 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 20:54:26.844381 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:54:26.851389 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Jan 13 20:54:26.910210 ignition[667]: Ignition 2.20.0 Jan 13 20:54:26.910217 ignition[667]: Stage: fetch-offline Jan 13 20:54:26.910236 ignition[667]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:54:26.910241 ignition[667]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:54:26.910298 ignition[667]: parsed url from cmdline: "" Jan 13 20:54:26.910300 ignition[667]: no config URL provided Jan 13 20:54:26.910303 ignition[667]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 20:54:26.910310 ignition[667]: no config at "/usr/lib/ignition/user.ign" Jan 13 20:54:26.910657 ignition[667]: config successfully fetched Jan 13 20:54:26.910673 ignition[667]: parsing config with SHA512: de55b6cdf32be21cf51cb6917001c7e1789efffaf777134b15c13b6502dfb24067e622287d497bc7f083a92acb9fc1b30c011c80d10e3bc750d2f98feafb6d8c Jan 13 20:54:26.914198 unknown[667]: fetched base config from "system" Jan 13 20:54:26.914204 unknown[667]: fetched user config from "vmware" Jan 13 20:54:26.914738 ignition[667]: fetch-offline: fetch-offline passed Jan 13 20:54:26.914888 ignition[667]: Ignition finished successfully Jan 13 20:54:26.915529 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:54:26.917160 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:54:26.923399 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 20:54:26.935070 systemd-networkd[801]: lo: Link UP Jan 13 20:54:26.935077 systemd-networkd[801]: lo: Gained carrier Jan 13 20:54:26.935772 systemd-networkd[801]: Enumeration completed Jan 13 20:54:26.936032 systemd-networkd[801]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jan 13 20:54:26.936035 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:54:26.936491 systemd[1]: Reached target network.target - Network. 
Jan 13 20:54:26.936589 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jan 13 20:54:26.939345 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 20:54:26.939458 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 20:54:26.939206 systemd-networkd[801]: ens192: Link UP Jan 13 20:54:26.939208 systemd-networkd[801]: ens192: Gained carrier Jan 13 20:54:26.940379 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 20:54:26.947995 ignition[803]: Ignition 2.20.0 Jan 13 20:54:26.948001 ignition[803]: Stage: kargs Jan 13 20:54:26.948094 ignition[803]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:54:26.948101 ignition[803]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:54:26.948619 ignition[803]: kargs: kargs passed Jan 13 20:54:26.948642 ignition[803]: Ignition finished successfully Jan 13 20:54:26.949695 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 20:54:26.954391 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 20:54:26.960988 ignition[810]: Ignition 2.20.0 Jan 13 20:54:26.960995 ignition[810]: Stage: disks Jan 13 20:54:26.961124 ignition[810]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:54:26.961135 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:54:26.961662 ignition[810]: disks: disks passed Jan 13 20:54:26.961689 ignition[810]: Ignition finished successfully Jan 13 20:54:26.962559 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 20:54:26.962793 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 20:54:26.962926 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 20:54:26.963119 systemd[1]: Reached target local-fs.target - Local File Systems. 
Jan 13 20:54:26.963309 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:54:26.963480 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:54:26.966392 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 20:54:26.977353 systemd-fsck[818]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 20:54:26.978222 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 20:54:26.982392 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 13 20:54:27.038340 kernel: EXT4-fs (sda9): mounted filesystem 84bcd1b2-5573-4e91-8fd5-f97782397085 r/w with ordered data mode. Quota mode: none. Jan 13 20:54:27.037959 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 20:54:27.038298 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 20:54:27.043336 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:54:27.044329 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 20:54:27.044602 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 20:54:27.044629 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 20:54:27.044643 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:54:27.048462 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 20:54:27.049310 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 13 20:54:27.055213 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (826) Jan 13 20:54:27.055244 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:54:27.055259 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:54:27.056836 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:54:27.060349 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:54:27.061082 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 20:54:27.079181 initrd-setup-root[850]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 20:54:27.081744 initrd-setup-root[857]: cut: /sysroot/etc/group: No such file or directory Jan 13 20:54:27.083920 initrd-setup-root[864]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 20:54:27.085904 initrd-setup-root[871]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 20:54:27.138752 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 20:54:27.143369 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 20:54:27.144794 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 20:54:27.150311 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:54:27.165384 ignition[939]: INFO : Ignition 2.20.0 Jan 13 20:54:27.165384 ignition[939]: INFO : Stage: mount Jan 13 20:54:27.165723 ignition[939]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:54:27.165723 ignition[939]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:54:27.166471 ignition[939]: INFO : mount: mount passed Jan 13 20:54:27.166471 ignition[939]: INFO : Ignition finished successfully Jan 13 20:54:27.166106 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 13 20:54:27.166586 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Jan 13 20:54:27.169398 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 13 20:54:27.749974 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 13 20:54:27.754417 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:54:27.761291 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (950) Jan 13 20:54:27.763639 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:54:27.763656 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:54:27.763668 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:54:27.767291 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:54:27.768169 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 20:54:27.783366 ignition[967]: INFO : Ignition 2.20.0 Jan 13 20:54:27.783366 ignition[967]: INFO : Stage: files Jan 13 20:54:27.783850 ignition[967]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:54:27.783850 ignition[967]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:54:27.784104 ignition[967]: DEBUG : files: compiled without relabeling support, skipping Jan 13 20:54:27.784423 ignition[967]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 13 20:54:27.784423 ignition[967]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 13 20:54:27.786456 ignition[967]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 13 20:54:27.786592 ignition[967]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 13 20:54:27.786727 ignition[967]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 13 20:54:27.786676 unknown[967]: wrote ssh authorized keys file for user: core Jan 13 20:54:27.788333 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] 
writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 13 20:54:27.788584 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jan 13 20:54:27.821363 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 13 20:54:27.934010 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file 
"/sysroot/etc/flatcar/update.conf" Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 13 20:54:27.936355 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 Jan 13 20:54:28.241387 systemd-networkd[801]: ens192: Gained IPv6LL Jan 13 20:54:28.393276 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 13 20:54:28.563576 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 13 20:54:28.563576 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 13 20:54:28.564376 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at 
"/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Jan 13 20:54:28.604180 ignition[967]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Jan 13 20:54:28.606707 ignition[967]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Jan 13 20:54:28.606874 ignition[967]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Jan 13 20:54:28.606874 ignition[967]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Jan 13 20:54:28.606874 ignition[967]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Jan 13 20:54:28.607880 ignition[967]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 13 20:54:28.607880 ignition[967]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 13 20:54:28.607880 ignition[967]: INFO : files: files passed Jan 13 20:54:28.607880 ignition[967]: INFO : Ignition finished successfully Jan 13 20:54:28.608543 systemd[1]: Finished 
ignition-files.service - Ignition (files). Jan 13 20:54:28.612479 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 13 20:54:28.613887 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 13 20:54:28.614311 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 13 20:54:28.614362 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 13 20:54:28.621449 initrd-setup-root-after-ignition[998]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:54:28.621449 initrd-setup-root-after-ignition[998]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:54:28.621885 initrd-setup-root-after-ignition[1002]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:54:28.622527 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 20:54:28.623056 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 13 20:54:28.627406 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 13 20:54:28.644520 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 13 20:54:28.644577 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 13 20:54:28.645210 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 13 20:54:28.645482 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 13 20:54:28.645713 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 13 20:54:28.646340 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 13 20:54:28.659103 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. 
Jan 13 20:54:28.663396 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 13 20:54:28.668562 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:54:28.668831 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:54:28.669124 systemd[1]: Stopped target timers.target - Timer Units. Jan 13 20:54:28.669394 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 13 20:54:28.669465 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 20:54:28.669962 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 13 20:54:28.670221 systemd[1]: Stopped target basic.target - Basic System. Jan 13 20:54:28.670449 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 13 20:54:28.670718 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:54:28.671007 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 13 20:54:28.671446 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 13 20:54:28.671582 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 13 20:54:28.671860 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 13 20:54:28.672245 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 13 20:54:28.672498 systemd[1]: Stopped target swap.target - Swaps. Jan 13 20:54:28.672704 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 13 20:54:28.672777 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:54:28.673242 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:54:28.673502 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:54:28.673641 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Jan 13 20:54:28.673688 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:54:28.673865 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 13 20:54:28.673934 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 13 20:54:28.674190 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 13 20:54:28.674252 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:54:28.674511 systemd[1]: Stopped target paths.target - Path Units. Jan 13 20:54:28.674654 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 13 20:54:28.678327 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 20:54:28.678512 systemd[1]: Stopped target slices.target - Slice Units. Jan 13 20:54:28.678716 systemd[1]: Stopped target sockets.target - Socket Units. Jan 13 20:54:28.678906 systemd[1]: iscsid.socket: Deactivated successfully. Jan 13 20:54:28.678974 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:54:28.679182 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 13 20:54:28.679229 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:54:28.679507 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 13 20:54:28.679591 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 20:54:28.679827 systemd[1]: ignition-files.service: Deactivated successfully. Jan 13 20:54:28.679903 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 13 20:54:28.688417 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 13 20:54:28.690409 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 13 20:54:28.690510 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Jan 13 20:54:28.690602 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:54:28.690878 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 13 20:54:28.690957 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:54:28.693960 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 13 20:54:28.694022 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 13 20:54:28.697984 ignition[1022]: INFO : Ignition 2.20.0 Jan 13 20:54:28.698398 ignition[1022]: INFO : Stage: umount Jan 13 20:54:28.698398 ignition[1022]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:54:28.698398 ignition[1022]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:54:28.699346 ignition[1022]: INFO : umount: umount passed Jan 13 20:54:28.699512 ignition[1022]: INFO : Ignition finished successfully Jan 13 20:54:28.700292 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 13 20:54:28.700464 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 13 20:54:28.700673 systemd[1]: Stopped target network.target - Network. Jan 13 20:54:28.700769 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 13 20:54:28.700794 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 13 20:54:28.700942 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 13 20:54:28.700963 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 13 20:54:28.701105 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 13 20:54:28.701125 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 13 20:54:28.701270 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 13 20:54:28.701299 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 13 20:54:28.701892 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Jan 13 20:54:28.702292 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 13 20:54:28.705068 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 13 20:54:28.705131 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 13 20:54:28.705903 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 13 20:54:28.705930 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:54:28.711377 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 13 20:54:28.711477 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 13 20:54:28.711509 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:54:28.711637 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Jan 13 20:54:28.711660 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:54:28.711816 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:54:28.712728 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 13 20:54:28.715212 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 13 20:54:28.715273 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 13 20:54:28.716204 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 13 20:54:28.716246 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:54:28.716502 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 13 20:54:28.716525 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 13 20:54:28.716668 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 13 20:54:28.716690 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 13 20:54:28.719922 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 13 20:54:28.719987 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 13 20:54:28.723719 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 13 20:54:28.723906 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:54:28.724309 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 13 20:54:28.724331 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 13 20:54:28.724710 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 13 20:54:28.724727 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 20:54:28.724957 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 13 20:54:28.724979 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:54:28.725256 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 13 20:54:28.725415 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 13 20:54:28.725684 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:54:28.725708 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:54:28.730371 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 13 20:54:28.730481 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 13 20:54:28.730509 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:54:28.730634 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 13 20:54:28.730655 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:54:28.730773 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Jan 13 20:54:28.730794 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:54:28.730911 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 20:54:28.730931 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:54:28.735400 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 13 20:54:28.735613 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 13 20:54:28.811857 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 13 20:54:28.811924 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 13 20:54:28.812295 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 13 20:54:28.812482 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 13 20:54:28.812520 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 13 20:54:28.819500 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 13 20:54:28.824464 systemd[1]: Switching root. 
Jan 13 20:54:28.858510 systemd-journald[216]: Journal stopped Jan 13 20:54:24.732405 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 19:01:45 -00 2025 Jan 13 20:54:24.732420 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 20:54:24.732427 kernel: Disabled fast string operations Jan 13 20:54:24.732431 kernel: BIOS-provided physical RAM map: Jan 13 20:54:24.732434 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Jan 13 20:54:24.732438 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Jan 13 20:54:24.732444 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Jan 13 20:54:24.732448 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Jan 13 20:54:24.732452 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Jan 13 20:54:24.732456 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Jan 13 20:54:24.732460 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Jan 13 20:54:24.732464 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Jan 13 20:54:24.732468 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Jan 13 20:54:24.732472 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Jan 13 20:54:24.732478 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Jan 13 20:54:24.732483 kernel: NX (Execute Disable) protection: active Jan 13 20:54:24.732487 kernel: 
APIC: Static calls initialized Jan 13 20:54:24.732492 kernel: SMBIOS 2.7 present. Jan 13 20:54:24.732497 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Jan 13 20:54:24.732502 kernel: vmware: hypercall mode: 0x00 Jan 13 20:54:24.732506 kernel: Hypervisor detected: VMware Jan 13 20:54:24.732511 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Jan 13 20:54:24.732516 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Jan 13 20:54:24.732521 kernel: vmware: using clock offset of 2442757049 ns Jan 13 20:54:24.732525 kernel: tsc: Detected 3408.000 MHz processor Jan 13 20:54:24.732530 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 13 20:54:24.732535 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 13 20:54:24.732540 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Jan 13 20:54:24.732545 kernel: total RAM covered: 3072M Jan 13 20:54:24.732549 kernel: Found optimal setting for mtrr clean up Jan 13 20:54:24.732554 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Jan 13 20:54:24.732560 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Jan 13 20:54:24.732564 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 13 20:54:24.732569 kernel: Using GB pages for direct mapping Jan 13 20:54:24.732574 kernel: ACPI: Early table checksum verification disabled Jan 13 20:54:24.732578 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Jan 13 20:54:24.732583 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Jan 13 20:54:24.732587 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Jan 13 20:54:24.732592 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Jan 13 20:54:24.732596 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Jan 13 20:54:24.732604 kernel: ACPI: FACS 0x000000007FEFFFC0 
000040 Jan 13 20:54:24.732608 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Jan 13 20:54:24.732614 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000) Jan 13 20:54:24.732619 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Jan 13 20:54:24.732624 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Jan 13 20:54:24.732629 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Jan 13 20:54:24.732634 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Jan 13 20:54:24.732639 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Jan 13 20:54:24.732644 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Jan 13 20:54:24.732649 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Jan 13 20:54:24.732654 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Jan 13 20:54:24.732659 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Jan 13 20:54:24.732664 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Jan 13 20:54:24.732668 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Jan 13 20:54:24.732673 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Jan 13 20:54:24.732679 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Jan 13 20:54:24.732684 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Jan 13 20:54:24.732689 kernel: system APIC only can use physical flat Jan 13 20:54:24.732693 kernel: APIC: Switched APIC routing to: physical flat Jan 13 20:54:24.732698 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Jan 13 20:54:24.732703 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0 Jan 13 20:54:24.732708 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0 Jan 13 
20:54:24.732712 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0 Jan 13 20:54:24.732717 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0 Jan 13 20:54:24.732723 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0 Jan 13 20:54:24.732728 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0 Jan 13 20:54:24.732732 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0 Jan 13 20:54:24.732737 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0 Jan 13 20:54:24.732742 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0 Jan 13 20:54:24.732746 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0 Jan 13 20:54:24.732751 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0 Jan 13 20:54:24.732756 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0 Jan 13 20:54:24.732760 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0 Jan 13 20:54:24.732765 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0 Jan 13 20:54:24.732770 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0 Jan 13 20:54:24.732778 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0 Jan 13 20:54:24.732801 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0 Jan 13 20:54:24.732806 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0 Jan 13 20:54:24.732810 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0 Jan 13 20:54:24.732830 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0 Jan 13 20:54:24.732835 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0 Jan 13 20:54:24.732840 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0 Jan 13 20:54:24.732844 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0 Jan 13 20:54:24.732849 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0 Jan 13 20:54:24.732854 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0 Jan 13 20:54:24.732860 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0 Jan 13 20:54:24.732864 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0 Jan 13 20:54:24.732869 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0 Jan 13 20:54:24.732874 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0 Jan 13 20:54:24.732879 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0 Jan 13 20:54:24.732884 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0 Jan 13 20:54:24.732888 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0 Jan 13 20:54:24.732893 
kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0 Jan 13 20:54:24.732898 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0 Jan 13 20:54:24.732903 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0 Jan 13 20:54:24.732908 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0 Jan 13 20:54:24.732913 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0 Jan 13 20:54:24.732918 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0 Jan 13 20:54:24.732923 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0 Jan 13 20:54:24.732927 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0 Jan 13 20:54:24.732932 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0 Jan 13 20:54:24.732937 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0 Jan 13 20:54:24.732942 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0 Jan 13 20:54:24.732947 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0 Jan 13 20:54:24.732951 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0 Jan 13 20:54:24.732957 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0 Jan 13 20:54:24.732962 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0 Jan 13 20:54:24.732966 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0 Jan 13 20:54:24.732971 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0 Jan 13 20:54:24.732976 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0 Jan 13 20:54:24.732981 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0 Jan 13 20:54:24.732985 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0 Jan 13 20:54:24.732990 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0 Jan 13 20:54:24.732995 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0 Jan 13 20:54:24.733000 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0 Jan 13 20:54:24.733005 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0 Jan 13 20:54:24.733010 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0 Jan 13 20:54:24.733015 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0 Jan 13 20:54:24.733023 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0 Jan 13 20:54:24.733029 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0 Jan 13 20:54:24.733034 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0 Jan 13 20:54:24.733039 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0 Jan 13 20:54:24.733044 kernel: SRAT: PXM 0 
-> APIC 0x7e -> Node 0 Jan 13 20:54:24.733049 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0 Jan 13 20:54:24.733055 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0 Jan 13 20:54:24.733060 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0 Jan 13 20:54:24.733066 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0 Jan 13 20:54:24.733071 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0 Jan 13 20:54:24.733076 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0 Jan 13 20:54:24.733081 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0 Jan 13 20:54:24.733086 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0 Jan 13 20:54:24.733091 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0 Jan 13 20:54:24.733096 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0 Jan 13 20:54:24.733101 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0 Jan 13 20:54:24.733107 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0 Jan 13 20:54:24.733112 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0 Jan 13 20:54:24.733117 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0 Jan 13 20:54:24.733122 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0 Jan 13 20:54:24.733127 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0 Jan 13 20:54:24.733132 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0 Jan 13 20:54:24.733137 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0 Jan 13 20:54:24.733142 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0 Jan 13 20:54:24.733147 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0 Jan 13 20:54:24.733152 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0 Jan 13 20:54:24.733158 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0 Jan 13 20:54:24.733164 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0 Jan 13 20:54:24.733169 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0 Jan 13 20:54:24.733174 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0 Jan 13 20:54:24.733179 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0 Jan 13 20:54:24.733184 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0 Jan 13 20:54:24.733189 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0 Jan 13 20:54:24.733194 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0 Jan 13 20:54:24.733199 kernel: SRAT: PXM 0 -> APIC 0xba -> 
Node 0 Jan 13 20:54:24.733204 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0 Jan 13 20:54:24.733209 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0 Jan 13 20:54:24.733215 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0 Jan 13 20:54:24.733220 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0 Jan 13 20:54:24.733225 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0 Jan 13 20:54:24.733230 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0 Jan 13 20:54:24.733235 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0 Jan 13 20:54:24.733240 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0 Jan 13 20:54:24.733245 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0 Jan 13 20:54:24.733250 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0 Jan 13 20:54:24.733255 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0 Jan 13 20:54:24.733260 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0 Jan 13 20:54:24.733266 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0 Jan 13 20:54:24.733272 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0 Jan 13 20:54:24.733277 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0 Jan 13 20:54:24.733443 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0 Jan 13 20:54:24.733449 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0 Jan 13 20:54:24.733454 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0 Jan 13 20:54:24.733459 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0 Jan 13 20:54:24.733464 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0 Jan 13 20:54:24.733469 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0 Jan 13 20:54:24.733474 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0 Jan 13 20:54:24.733481 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0 Jan 13 20:54:24.733486 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0 Jan 13 20:54:24.733492 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0 Jan 13 20:54:24.733497 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0 Jan 13 20:54:24.733502 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0 Jan 13 20:54:24.733507 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0 Jan 13 20:54:24.733512 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0 Jan 13 20:54:24.733517 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0 Jan 13 
20:54:24.733522 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0 Jan 13 20:54:24.733527 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0 Jan 13 20:54:24.733532 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0 Jan 13 20:54:24.733538 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0 Jan 13 20:54:24.733544 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Jan 13 20:54:24.733549 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Jan 13 20:54:24.733554 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Jan 13 20:54:24.733560 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] Jan 13 20:54:24.733565 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff] Jan 13 20:54:24.733571 kernel: Zone ranges: Jan 13 20:54:24.733576 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 13 20:54:24.733581 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Jan 13 20:54:24.733587 kernel: Normal empty Jan 13 20:54:24.733592 kernel: Movable zone start for each node Jan 13 20:54:24.733597 kernel: Early memory node ranges Jan 13 20:54:24.733603 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Jan 13 20:54:24.733608 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Jan 13 20:54:24.733613 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Jan 13 20:54:24.733618 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Jan 13 20:54:24.733623 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 13 20:54:24.733629 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Jan 13 20:54:24.733634 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Jan 13 20:54:24.733640 kernel: ACPI: PM-Timer IO Port: 0x1008 Jan 13 20:54:24.733645 kernel: system APIC only can use physical flat Jan 13 20:54:24.733650 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Jan 13 20:54:24.733656 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Jan 13 
20:54:24.733661 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Jan 13 20:54:24.733666 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Jan 13 20:54:24.733671 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Jan 13 20:54:24.733676 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Jan 13 20:54:24.733682 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Jan 13 20:54:24.733688 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Jan 13 20:54:24.733693 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Jan 13 20:54:24.733698 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Jan 13 20:54:24.733703 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Jan 13 20:54:24.733708 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Jan 13 20:54:24.733713 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Jan 13 20:54:24.733718 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Jan 13 20:54:24.733723 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Jan 13 20:54:24.733729 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Jan 13 20:54:24.733734 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Jan 13 20:54:24.733740 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Jan 13 20:54:24.733745 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Jan 13 20:54:24.733750 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Jan 13 20:54:24.733755 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Jan 13 20:54:24.733760 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Jan 13 20:54:24.733765 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Jan 13 20:54:24.733771 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Jan 13 20:54:24.733779 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Jan 13 20:54:24.733784 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Jan 13 
20:54:24.733790 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Jan 13 20:54:24.733796 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Jan 13 20:54:24.733801 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Jan 13 20:54:24.733806 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Jan 13 20:54:24.733811 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Jan 13 20:54:24.733817 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Jan 13 20:54:24.733822 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Jan 13 20:54:24.733827 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Jan 13 20:54:24.733832 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Jan 13 20:54:24.733837 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Jan 13 20:54:24.733843 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Jan 13 20:54:24.733849 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Jan 13 20:54:24.733854 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Jan 13 20:54:24.733859 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Jan 13 20:54:24.733864 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Jan 13 20:54:24.733869 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Jan 13 20:54:24.733875 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Jan 13 20:54:24.733880 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Jan 13 20:54:24.733885 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Jan 13 20:54:24.733890 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Jan 13 20:54:24.733896 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Jan 13 20:54:24.733901 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Jan 13 20:54:24.733906 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Jan 13 20:54:24.733912 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Jan 13 
20:54:24.733917 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Jan 13 20:54:24.733922 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Jan 13 20:54:24.733927 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Jan 13 20:54:24.733932 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Jan 13 20:54:24.733937 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Jan 13 20:54:24.733943 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Jan 13 20:54:24.733949 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Jan 13 20:54:24.733954 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Jan 13 20:54:24.733959 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Jan 13 20:54:24.733964 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Jan 13 20:54:24.733969 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Jan 13 20:54:24.733974 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Jan 13 20:54:24.733979 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Jan 13 20:54:24.733984 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Jan 13 20:54:24.733990 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Jan 13 20:54:24.733996 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Jan 13 20:54:24.734001 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Jan 13 20:54:24.734006 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Jan 13 20:54:24.734011 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Jan 13 20:54:24.734016 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Jan 13 20:54:24.734225 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Jan 13 20:54:24.734231 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Jan 13 20:54:24.734237 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Jan 13 20:54:24.734242 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Jan 13 
20:54:24.734247 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Jan 13 20:54:24.734254 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Jan 13 20:54:24.734259 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Jan 13 20:54:24.734264 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Jan 13 20:54:24.734269 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Jan 13 20:54:24.734275 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Jan 13 20:54:24.734288 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Jan 13 20:54:24.734294 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Jan 13 20:54:24.734299 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Jan 13 20:54:24.734304 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Jan 13 20:54:24.734309 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Jan 13 20:54:24.734316 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Jan 13 20:54:24.734321 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Jan 13 20:54:24.734326 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Jan 13 20:54:24.734331 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Jan 13 20:54:24.734336 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Jan 13 20:54:24.734341 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Jan 13 20:54:24.734347 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Jan 13 20:54:24.734352 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Jan 13 20:54:24.734357 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Jan 13 20:54:24.734363 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Jan 13 20:54:24.734368 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Jan 13 20:54:24.734374 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Jan 13 20:54:24.734379 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Jan 13 
20:54:24.734384 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Jan 13 20:54:24.734389 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Jan 13 20:54:24.734394 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Jan 13 20:54:24.734399 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Jan 13 20:54:24.734404 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Jan 13 20:54:24.734409 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Jan 13 20:54:24.734415 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Jan 13 20:54:24.734421 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Jan 13 20:54:24.734426 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Jan 13 20:54:24.734431 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Jan 13 20:54:24.734436 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Jan 13 20:54:24.734441 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Jan 13 20:54:24.734446 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Jan 13 20:54:24.734451 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Jan 13 20:54:24.734456 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Jan 13 20:54:24.734461 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Jan 13 20:54:24.734468 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Jan 13 20:54:24.734473 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Jan 13 20:54:24.734478 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Jan 13 20:54:24.734483 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Jan 13 20:54:24.734488 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Jan 13 20:54:24.734493 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Jan 13 20:54:24.734498 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Jan 13 20:54:24.734503 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Jan 13 
20:54:24.734509 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Jan 13 20:54:24.734514 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Jan 13 20:54:24.734520 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Jan 13 20:54:24.734525 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Jan 13 20:54:24.734530 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Jan 13 20:54:24.734535 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Jan 13 20:54:24.734540 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Jan 13 20:54:24.734546 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Jan 13 20:54:24.734551 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 13 20:54:24.734556 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Jan 13 20:54:24.734561 kernel: TSC deadline timer available Jan 13 20:54:24.734568 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Jan 13 20:54:24.734573 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Jan 13 20:54:24.734578 kernel: Booting paravirtualized kernel on VMware hypervisor Jan 13 20:54:24.734584 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 13 20:54:24.734589 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Jan 13 20:54:24.734594 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Jan 13 20:54:24.734600 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Jan 13 20:54:24.734605 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Jan 13 20:54:24.734610 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Jan 13 20:54:24.734616 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Jan 13 20:54:24.734621 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Jan 13 20:54:24.734627 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Jan 13 20:54:24.734638 kernel: pcpu-alloc: 
[0] 040 041 042 043 044 045 046 047 Jan 13 20:54:24.734645 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Jan 13 20:54:24.734650 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Jan 13 20:54:24.734655 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Jan 13 20:54:24.734661 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Jan 13 20:54:24.734668 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Jan 13 20:54:24.734673 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Jan 13 20:54:24.734679 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Jan 13 20:54:24.734684 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Jan 13 20:54:24.734689 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Jan 13 20:54:24.734695 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Jan 13 20:54:24.734701 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 20:54:24.734707 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Jan 13 20:54:24.734714 kernel: random: crng init done Jan 13 20:54:24.734719 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Jan 13 20:54:24.734724 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Jan 13 20:54:24.734730 kernel: printk: log_buf_len min size: 262144 bytes Jan 13 20:54:24.734735 kernel: printk: log_buf_len: 1048576 bytes Jan 13 20:54:24.734741 kernel: printk: early log buf free: 239648(91%) Jan 13 20:54:24.734746 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 13 20:54:24.734752 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 13 20:54:24.734757 kernel: Fallback order for Node 0: 0 Jan 13 20:54:24.734763 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Jan 13 20:54:24.734769 kernel: Policy zone: DMA32 Jan 13 20:54:24.734775 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 13 20:54:24.734781 kernel: Memory: 1936376K/2096628K available (12288K kernel code, 2299K rwdata, 22736K rodata, 42976K init, 2216K bss, 159992K reserved, 0K cma-reserved) Jan 13 20:54:24.734788 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Jan 13 20:54:24.734793 kernel: ftrace: allocating 37920 entries in 149 pages Jan 13 20:54:24.734800 kernel: ftrace: allocated 149 pages with 4 groups Jan 13 20:54:24.734806 kernel: Dynamic Preempt: voluntary Jan 13 20:54:24.734812 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 13 20:54:24.734818 kernel: rcu: RCU event tracing is enabled. Jan 13 20:54:24.734824 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Jan 13 20:54:24.734829 kernel: Trampoline variant of Tasks RCU enabled. Jan 13 20:54:24.734835 kernel: Rude variant of Tasks RCU enabled. Jan 13 20:54:24.734840 kernel: Tracing variant of Tasks RCU enabled. Jan 13 20:54:24.734846 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Jan 13 20:54:24.734852 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Jan 13 20:54:24.734858 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Jan 13 20:54:24.734863 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Jan 13 20:54:24.734869 kernel: Console: colour VGA+ 80x25 Jan 13 20:54:24.734874 kernel: printk: console [tty0] enabled Jan 13 20:54:24.734880 kernel: printk: console [ttyS0] enabled Jan 13 20:54:24.734885 kernel: ACPI: Core revision 20230628 Jan 13 20:54:24.734891 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Jan 13 20:54:24.734897 kernel: APIC: Switch to symmetric I/O mode setup Jan 13 20:54:24.734902 kernel: x2apic enabled Jan 13 20:54:24.734909 kernel: APIC: Switched APIC routing to: physical x2apic Jan 13 20:54:24.734915 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 13 20:54:24.734921 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:54:24.734926 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Jan 13 20:54:24.734932 kernel: Disabled fast string operations Jan 13 20:54:24.734937 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Jan 13 20:54:24.734943 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Jan 13 20:54:24.734949 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 13 20:54:24.734954 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Jan 13 20:54:24.734961 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Jan 13 20:54:24.734966 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 13 20:54:24.734972 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Jan 13 20:54:24.734977 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 13 20:54:24.734983 kernel: RETBleed: Mitigation: Enhanced IBRS Jan 13 20:54:24.734989 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 13 20:54:24.734994 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 13 20:54:24.735000 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Jan 13 20:54:24.735005 kernel: SRBDS: Unknown: Dependent on hypervisor status Jan 13 20:54:24.735012 kernel: GDS: Unknown: Dependent on hypervisor status Jan 13 20:54:24.735017 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 13 20:54:24.735023 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 13 20:54:24.735029 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 13 20:54:24.735034 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 13 20:54:24.735040 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
Jan 13 20:54:24.735045 kernel: Freeing SMP alternatives memory: 32K Jan 13 20:54:24.735051 kernel: pid_max: default: 131072 minimum: 1024 Jan 13 20:54:24.735056 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Jan 13 20:54:24.735063 kernel: landlock: Up and running. Jan 13 20:54:24.735068 kernel: SELinux: Initializing. Jan 13 20:54:24.735074 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:54:24.735079 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:54:24.735085 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Jan 13 20:54:24.735091 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:54:24.735096 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:54:24.735102 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:54:24.735108 kernel: Performance Events: Skylake events, core PMU driver. Jan 13 20:54:24.735114 kernel: core: CPUID marked event: 'cpu cycles' unavailable Jan 13 20:54:24.735120 kernel: core: CPUID marked event: 'instructions' unavailable Jan 13 20:54:24.735125 kernel: core: CPUID marked event: 'bus cycles' unavailable Jan 13 20:54:24.735131 kernel: core: CPUID marked event: 'cache references' unavailable Jan 13 20:54:24.735136 kernel: core: CPUID marked event: 'cache misses' unavailable Jan 13 20:54:24.735141 kernel: core: CPUID marked event: 'branch instructions' unavailable Jan 13 20:54:24.735147 kernel: core: CPUID marked event: 'branch misses' unavailable Jan 13 20:54:24.735152 kernel: ... version: 1 Jan 13 20:54:24.735159 kernel: ... bit width: 48 Jan 13 20:54:24.735164 kernel: ... generic registers: 4 Jan 13 20:54:24.735170 kernel: ... value mask: 0000ffffffffffff Jan 13 20:54:24.735175 kernel: ... 
max period: 000000007fffffff Jan 13 20:54:24.735181 kernel: ... fixed-purpose events: 0 Jan 13 20:54:24.735188 kernel: ... event mask: 000000000000000f Jan 13 20:54:24.735193 kernel: signal: max sigframe size: 1776 Jan 13 20:54:24.735199 kernel: rcu: Hierarchical SRCU implementation. Jan 13 20:54:24.735204 kernel: rcu: Max phase no-delay instances is 400. Jan 13 20:54:24.735211 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 20:54:24.735216 kernel: smp: Bringing up secondary CPUs ... Jan 13 20:54:24.735222 kernel: smpboot: x86: Booting SMP configuration: Jan 13 20:54:24.735228 kernel: .... node #0, CPUs: #1 Jan 13 20:54:24.735233 kernel: Disabled fast string operations Jan 13 20:54:24.735239 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 20:54:24.735244 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 20:54:24.735249 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 20:54:24.735255 kernel: smpboot: Max logical packages: 128 Jan 13 20:54:24.735260 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 20:54:24.735267 kernel: devtmpfs: initialized Jan 13 20:54:24.735273 kernel: x86/mm: Memory block size: 128MB Jan 13 20:54:24.735288 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 20:54:24.735294 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 20:54:24.735300 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 20:54:24.735305 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 20:54:24.735311 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 20:54:24.735316 kernel: audit: initializing netlink subsys (disabled) Jan 13 20:54:24.735322 kernel: audit: type=2000 audit(1736801663.066:1): state=initialized audit_enabled=0 res=1 Jan 13 20:54:24.735329 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 20:54:24.735334 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 20:54:24.735340 kernel: cpuidle: using governor menu Jan 13 20:54:24.735345 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 20:54:24.735351 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 20:54:24.735356 kernel: dca service started, version 1.12.1 Jan 13 20:54:24.735362 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 20:54:24.735368 kernel: PCI: Using configuration type 1 for base access Jan 13 20:54:24.735373 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 20:54:24.735380 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 20:54:24.735386 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 20:54:24.735392 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 20:54:24.735397 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 20:54:24.735403 kernel: ACPI: Added _OSI(Module Device) Jan 13 20:54:24.735408 kernel: ACPI: Added _OSI(Processor Device) Jan 13 20:54:24.735414 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 20:54:24.735419 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 20:54:24.735425 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 20:54:24.735431 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 20:54:24.735437 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 20:54:24.735442 kernel: ACPI: Interpreter enabled Jan 13 20:54:24.735448 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 20:54:24.735454 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 20:54:24.735459 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 20:54:24.735465 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 20:54:24.735470 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 20:54:24.735476 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 20:54:24.735550 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 20:54:24.737272 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 20:54:24.739352 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 20:54:24.739362 kernel: PCI host bridge to bus 0000:00 Jan 13 20:54:24.739414 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 20:54:24.739460 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 20:54:24.739508 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 20:54:24.739551 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 20:54:24.739594 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 20:54:24.739640 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 20:54:24.739698 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 20:54:24.739753 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 20:54:24.739812 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 20:54:24.739865 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 20:54:24.739915 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 20:54:24.739964 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 20:54:24.740013 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 20:54:24.740061 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 20:54:24.740109 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 20:54:24.740163 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 20:54:24.740211 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 20:54:24.740259 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 20:54:24.740327 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 20:54:24.740377 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 20:54:24.740425 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 20:54:24.740480 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 20:54:24.740529 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 20:54:24.740577 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 20:54:24.740625 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 20:54:24.740673 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 20:54:24.740720 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 20:54:24.740772 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 20:54:24.740869 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.740932 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.740986 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.741034 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.741086 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.741134 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.741191 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.741241 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.742346 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.742404 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 20:54:24.742459 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:54:24.742508 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold
Jan 13 20:54:24.742561 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.742613 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.742666 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.742714 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.742782 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.742846 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.742901 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.742949 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743002 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743051 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743104 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743154 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743247 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743314 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743367 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743432 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743498 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743547 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743602 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743651 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743703 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743752 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743804 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743853 kernel: pci 0000:00:17.1:
PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.743907 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.743956 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.744008 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.746342 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.746405 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.746455 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.746506 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.746557 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.746609 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.746656 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.746707 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.746755 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.746807 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.746858 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.746909 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.746957 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.747008 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.747056 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.747108 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.747159 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.747212 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.747261 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.748326 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400
Jan 13
20:54:24.748379 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.748432 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.748484 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.748537 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400
Jan 13 20:54:24.748587 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold
Jan 13 20:54:24.748636 kernel: pci_bus 0000:01: extended config space not accessible
Jan 13 20:54:24.748686 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 13 20:54:24.748735 kernel: pci_bus 0000:02: extended config space not accessible
Jan 13 20:54:24.748744 kernel: acpiphp: Slot [32] registered
Jan 13 20:54:24.748752 kernel: acpiphp: Slot [33] registered
Jan 13 20:54:24.748758 kernel: acpiphp: Slot [34] registered
Jan 13 20:54:24.748763 kernel: acpiphp: Slot [35] registered
Jan 13 20:54:24.748769 kernel: acpiphp: Slot [36] registered
Jan 13 20:54:24.748799 kernel: acpiphp: Slot [37] registered
Jan 13 20:54:24.748806 kernel: acpiphp: Slot [38] registered
Jan 13 20:54:24.748812 kernel: acpiphp: Slot [39] registered
Jan 13 20:54:24.748818 kernel: acpiphp: Slot [40] registered
Jan 13 20:54:24.748824 kernel: acpiphp: Slot [41] registered
Jan 13 20:54:24.748831 kernel: acpiphp: Slot [42] registered
Jan 13 20:54:24.748852 kernel: acpiphp: Slot [43] registered
Jan 13 20:54:24.748872 kernel: acpiphp: Slot [44] registered
Jan 13 20:54:24.748878 kernel: acpiphp: Slot [45] registered
Jan 13 20:54:24.748884 kernel: acpiphp: Slot [46] registered
Jan 13 20:54:24.748889 kernel: acpiphp: Slot [47] registered
Jan 13 20:54:24.748911 kernel: acpiphp: Slot [48] registered
Jan 13 20:54:24.748917 kernel: acpiphp: Slot [49] registered
Jan 13 20:54:24.748923 kernel: acpiphp: Slot [50] registered
Jan 13 20:54:24.748928 kernel: acpiphp: Slot [51] registered
Jan 13 20:54:24.748951 kernel: acpiphp: Slot [52] registered
Jan 13 20:54:24.748957 kernel: acpiphp: Slot [53] registered
Jan 13 20:54:24.748962 kernel: acpiphp: Slot [54] registered
Jan 13 20:54:24.748968 kernel: acpiphp: Slot [55] registered
Jan 13 20:54:24.748974 kernel: acpiphp: Slot [56] registered
Jan 13 20:54:24.748979 kernel: acpiphp: Slot [57] registered
Jan 13 20:54:24.748984 kernel: acpiphp: Slot [58] registered
Jan 13 20:54:24.748990 kernel: acpiphp: Slot [59] registered
Jan 13 20:54:24.748995 kernel: acpiphp: Slot [60] registered
Jan 13 20:54:24.749002 kernel: acpiphp: Slot [61] registered
Jan 13 20:54:24.749008 kernel: acpiphp: Slot [62] registered
Jan 13 20:54:24.749014 kernel: acpiphp: Slot [63] registered
Jan 13 20:54:24.749064 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Jan 13 20:54:24.749113 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Jan 13 20:54:24.749160 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Jan 13 20:54:24.749208 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jan 13 20:54:24.749256 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode)
Jan 13 20:54:24.750336 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode)
Jan 13 20:54:24.750386 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode)
Jan 13 20:54:24.750434 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode)
Jan 13 20:54:24.750482 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode)
Jan 13 20:54:24.750535 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700
Jan 13 20:54:24.750585 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007]
Jan 13 20:54:24.750634 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit]
Jan 13 20:54:24.750687 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
Jan 13 20:54:24.750735 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Jan 13
20:54:24.750802 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
Jan 13 20:54:24.750866 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Jan 13 20:54:24.750914 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Jan 13 20:54:24.750962 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Jan 13 20:54:24.751011 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Jan 13 20:54:24.751060 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Jan 13 20:54:24.751112 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Jan 13 20:54:24.751159 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Jan 13 20:54:24.751208 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Jan 13 20:54:24.751256 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Jan 13 20:54:24.752954 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Jan 13 20:54:24.753007 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Jan 13 20:54:24.753057 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Jan 13 20:54:24.753109 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Jan 13 20:54:24.753158 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Jan 13 20:54:24.753207 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Jan 13 20:54:24.753255 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Jan 13 20:54:24.753327 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Jan 13 20:54:24.753382 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Jan 13 20:54:24.753430 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Jan 13 20:54:24.753478 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Jan 13 20:54:24.753526 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Jan 13 20:54:24.753573 kernel: pci 0000:00:15.6: bridge window [mem
0xfbd00000-0xfbdfffff]
Jan 13 20:54:24.753621 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Jan 13 20:54:24.753669 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Jan 13 20:54:24.753716 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Jan 13 20:54:24.753767 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Jan 13 20:54:24.753820 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000
Jan 13 20:54:24.753896 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff]
Jan 13 20:54:24.753945 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff]
Jan 13 20:54:24.753993 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff]
Jan 13 20:54:24.754041 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f]
Jan 13 20:54:24.754089 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
Jan 13 20:54:24.754141 kernel: pci 0000:0b:00.0: supports D1 D2
Jan 13 20:54:24.754189 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Jan 13 20:54:24.754238 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device.
You can enable it with 'pcie_aspm=force'
Jan 13 20:54:24.754584 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Jan 13 20:54:24.754643 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Jan 13 20:54:24.754692 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Jan 13 20:54:24.754741 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Jan 13 20:54:24.754789 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Jan 13 20:54:24.754839 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Jan 13 20:54:24.754919 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Jan 13 20:54:24.754967 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Jan 13 20:54:24.755016 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Jan 13 20:54:24.755063 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Jan 13 20:54:24.755110 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Jan 13 20:54:24.755158 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Jan 13 20:54:24.755208 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Jan 13 20:54:24.755255 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Jan 13 20:54:24.755626 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Jan 13 20:54:24.755682 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Jan 13 20:54:24.755733 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Jan 13 20:54:24.755787 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Jan 13 20:54:24.755837 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Jan 13 20:54:24.755885 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Jan 13 20:54:24.755936 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Jan 13 20:54:24.755988 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Jan 13 20:54:24.756036 kernel: pci 0000:00:16.6: bridge
window [mem 0xe6300000-0xe63fffff 64bit pref]
Jan 13 20:54:24.756084 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Jan 13 20:54:24.756133 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Jan 13 20:54:24.756182 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Jan 13 20:54:24.756231 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Jan 13 20:54:24.756295 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Jan 13 20:54:24.756350 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Jan 13 20:54:24.756401 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Jan 13 20:54:24.756451 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Jan 13 20:54:24.756499 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Jan 13 20:54:24.756547 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Jan 13 20:54:24.756594 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Jan 13 20:54:24.756643 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Jan 13 20:54:24.756692 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Jan 13 20:54:24.756742 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Jan 13 20:54:24.756795 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Jan 13 20:54:24.756845 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Jan 13 20:54:24.756892 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Jan 13 20:54:24.756940 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Jan 13 20:54:24.756988 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Jan 13 20:54:24.757037 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Jan 13 20:54:24.757084 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Jan 13 20:54:24.757135 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Jan 13 20:54:24.757184 kernel: pci
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Jan 13 20:54:24.757232 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Jan 13 20:54:24.757288 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Jan 13 20:54:24.757342 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Jan 13 20:54:24.757390 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Jan 13 20:54:24.757439 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Jan 13 20:54:24.757487 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Jan 13 20:54:24.757539 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Jan 13 20:54:24.757587 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Jan 13 20:54:24.757635 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Jan 13 20:54:24.757683 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Jan 13 20:54:24.757731 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Jan 13 20:54:24.757779 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Jan 13 20:54:24.757827 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Jan 13 20:54:24.757904 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Jan 13 20:54:24.757969 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Jan 13 20:54:24.758018 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Jan 13 20:54:24.758067 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Jan 13 20:54:24.758115 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Jan 13 20:54:24.758164 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Jan 13 20:54:24.758211 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Jan 13 20:54:24.758259 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Jan 13 20:54:24.758323 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Jan 13
20:54:24.758373 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Jan 13 20:54:24.758421 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Jan 13 20:54:24.758470 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Jan 13 20:54:24.758518 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Jan 13 20:54:24.758566 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Jan 13 20:54:24.758616 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Jan 13 20:54:24.758665 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Jan 13 20:54:24.758716 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Jan 13 20:54:24.758765 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Jan 13 20:54:24.758819 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Jan 13 20:54:24.758867 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Jan 13 20:54:24.758876 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Jan 13 20:54:24.758882 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Jan 13 20:54:24.758888 kernel: ACPI: PCI: Interrupt link LNKB disabled
Jan 13 20:54:24.758894 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 13 20:54:24.758899 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Jan 13 20:54:24.758907 kernel: iommu: Default domain type: Translated
Jan 13 20:54:24.758913 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 13 20:54:24.758918 kernel: PCI: Using ACPI for IRQ routing
Jan 13 20:54:24.758924 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 13 20:54:24.758930 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Jan 13 20:54:24.758935 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Jan 13 20:54:24.758983 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Jan 13 20:54:24.759031 kernel: pci 0000:00:0f.0: vgaarb: bridge control
possible
Jan 13 20:54:24.759078 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 13 20:54:24.759088 kernel: vgaarb: loaded
Jan 13 20:54:24.759095 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Jan 13 20:54:24.759100 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Jan 13 20:54:24.759106 kernel: clocksource: Switched to clocksource tsc-early
Jan 13 20:54:24.759111 kernel: VFS: Disk quotas dquot_6.6.0
Jan 13 20:54:24.759117 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 13 20:54:24.759123 kernel: pnp: PnP ACPI init
Jan 13 20:54:24.759176 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Jan 13 20:54:24.759224 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Jan 13 20:54:24.759268 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Jan 13 20:54:24.759333 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Jan 13 20:54:24.759381 kernel: pnp 00:06: [dma 2]
Jan 13 20:54:24.759430 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Jan 13 20:54:24.759475 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Jan 13 20:54:24.759518 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Jan 13 20:54:24.759529 kernel: pnp: PnP ACPI: found 8 devices
Jan 13 20:54:24.759535 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 13 20:54:24.759541 kernel: NET: Registered PF_INET protocol family
Jan 13 20:54:24.759546 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 13 20:54:24.759552 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 13 20:54:24.759558 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 13 20:54:24.759564 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 13 20:54:24.759569 kernel:
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 13 20:54:24.759576 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 13 20:54:24.759582 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 13 20:54:24.759588 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 13 20:54:24.759603 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 13 20:54:24.759610 kernel: NET: Registered PF_XDP protocol family
Jan 13 20:54:24.759666 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Jan 13 20:54:24.759734 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 13 20:54:24.759801 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 13 20:54:24.759870 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 13 20:54:24.759920 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 13 20:54:24.759970 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 13 20:54:24.760189 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 13 20:54:24.760375 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 13 20:54:24.760441 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 13 20:54:24.760497 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Jan 13 20:54:24.760547 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Jan 13 20:54:24.760596 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Jan 13 20:54:24.760645 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Jan 13
20:54:24.760694 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Jan 13 20:54:24.760746 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Jan 13 20:54:24.760820 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Jan 13 20:54:24.760885 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Jan 13 20:54:24.760934 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Jan 13 20:54:24.760985 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Jan 13 20:54:24.761034 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Jan 13 20:54:24.761085 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Jan 13 20:54:24.761134 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Jan 13 20:54:24.761183 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Jan 13 20:54:24.761232 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 13 20:54:24.761291 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 13 20:54:24.761346 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.761395 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.761448 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.761497 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.761546 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.761594 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.761644 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.761692 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Jan
13 20:54:24.761741 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.761794 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.761846 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.761894 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.761943 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762017 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762069 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762119 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762168 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762216 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762269 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762396 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762445 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762493 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762541 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762589 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762638 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762686 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762738 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762787 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762836 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762884 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.762933 kernel: pci
0000:00:18.2: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.762982 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763031 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763084 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763166 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763221 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763271 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763358 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763407 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763456 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763504 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763552 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763604 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763653 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763711 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763759 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763808 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763889 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.763938 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.763989 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.764037 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.764095 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.764180 kernel: pci 0000:00:18.2: BAR 13: no space for [io
size 0x1000]
Jan 13 20:54:24.764230 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.766328 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.766387 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.766439 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.766490 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.766540 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.766589 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.766638 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.766691 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.766740 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.766789 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.766855 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.766905 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.766955 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.767004 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.767053 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.767102 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.767152 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.767204 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.767254 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.769327 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.769387 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.769440
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.769491 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.769541 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.769591 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.769641 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.769695 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.769744 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.769794 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Jan 13 20:54:24.769843 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 20:54:24.769909 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 13 20:54:24.769958 kernel: pci 0000:00:11.0: PCI bridge to [bus 02]
Jan 13 20:54:24.770007 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Jan 13 20:54:24.770074 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Jan 13 20:54:24.770124 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jan 13 20:54:24.770179 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref]
Jan 13 20:54:24.770228 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Jan 13 20:54:24.770348 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Jan 13 20:54:24.770407 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Jan 13 20:54:24.770457 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 13 20:54:24.770506 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Jan 13 20:54:24.770555 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Jan 13 20:54:24.770620 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Jan 13 20:54:24.770688 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Jan 13
20:54:24.770754 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:54:24.770804 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:54:24.770854 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:54:24.770918 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:54:24.770966 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:54:24.771015 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:54:24.771064 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:54:24.771112 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:54:24.771161 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:54:24.771212 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:54:24.771262 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:54:24.771330 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:54:24.771378 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:54:24.771427 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:54:24.771475 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 13 20:54:24.771528 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:54:24.771576 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:54:24.771624 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:54:24.771672 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:54:24.771741 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 13 20:54:24.771810 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:54:24.771859 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:54:24.771908 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 13 20:54:24.771957 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:54:24.772027 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:54:24.772092 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:54:24.772140 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:54:24.772189 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:54:24.772256 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:54:24.772670 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:54:24.772741 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:54:24.772798 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:54:24.772850 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:54:24.772905 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:54:24.772954 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:54:24.773004 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:54:24.773054 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:54:24.773103 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:54:24.773153 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:54:24.773203 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:54:24.773252 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:54:24.773316 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:54:24.773367 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:54:24.773420 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:54:24.773471 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:54:24.773521 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:54:24.773586 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:54:24.773636 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:54:24.773685 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:54:24.773733 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:54:24.773782 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:54:24.773864 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:54:24.774030 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:54:24.774085 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:54:24.774137 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:54:24.774187 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:54:24.774237 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:54:24.774331 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:54:24.774383 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:54:24.774433 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:54:24.774483 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:54:24.774533 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:54:24.774586 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:54:24.774636 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:54:24.774686 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:54:24.774750 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:54:24.774804 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:54:24.774853 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
20:54:24.774903 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:54:24.774951 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:54:24.775000 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:54:24.775052 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:54:24.775100 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:54:24.775148 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:54:24.775198 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:54:24.775247 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:54:24.775302 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:54:24.775352 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:54:24.775403 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:54:24.775452 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:54:24.775501 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:54:24.775553 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:54:24.775602 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:54:24.775651 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:54:24.775700 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:54:24.775767 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:54:24.775816 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:54:24.775865 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:54:24.775916 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 20:54:24.775966 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:54:24.776018 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 20:54:24.776068 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:54:24.776119 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:54:24.776168 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:54:24.776219 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:54:24.776269 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:54:24.776400 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:54:24.776453 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:54:24.776503 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:54:24.776553 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:54:24.776605 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:54:24.776651 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:54:24.776696 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:54:24.776739 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:54:24.776788 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:54:24.776837 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 20:54:24.776883 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 20:54:24.776932 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:54:24.776977 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:54:24.777023 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:54:24.777069 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:54:24.777114 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:54:24.777159 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:54:24.777209 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 20:54:24.777255 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 20:54:24.777312 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:54:24.777363 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 20:54:24.777410 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 20:54:24.777456 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:54:24.777508 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 20:54:24.777555 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 20:54:24.777603 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:54:24.777652 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 20:54:24.777698 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:54:24.777748 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 20:54:24.777795 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:54:24.777844 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 20:54:24.777891 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:54:24.777943 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 20:54:24.777989 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:54:24.778041 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 20:54:24.778096 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:54:24.778149 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 20:54:24.778198 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 20:54:24.778245 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:54:24.778349 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 20:54:24.778398 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 20:54:24.778444 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:54:24.778495 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 20:54:24.778541 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 20:54:24.778593 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:54:24.778646 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 20:54:24.778692 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:54:24.778743 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 20:54:24.778796 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:54:24.778846 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 20:54:24.778897 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:54:24.778948 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 20:54:24.778995 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:54:24.779046 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 20:54:24.779092 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:54:24.779142 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 20:54:24.779188 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 20:54:24.779237 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:54:24.779295 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 20:54:24.779343 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 20:54:24.779389 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:54:24.779438 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 20:54:24.779485 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 20:54:24.779534 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:54:24.779587 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 20:54:24.779634 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:54:24.779730 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 20:54:24.779779 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:54:24.779830 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 20:54:24.779888 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:54:24.779943 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 20:54:24.779990 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:54:24.780041 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 20:54:24.780088 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:54:24.780142 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 20:54:24.780191 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 20:54:24.780238 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:54:24.781850 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 20:54:24.781911 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 20:54:24.781961 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:54:24.782013 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 20:54:24.782061 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:54:24.782118 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 20:54:24.782165 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:54:24.782216 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 20:54:24.782263 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:54:24.782630 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 20:54:24.782683 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:54:24.782738 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 20:54:24.782786 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:54:24.782838 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 20:54:24.782891 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:54:24.782948 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 20:54:24.782958 kernel: PCI: CLS 32 bytes, default 64 Jan 13 20:54:24.782966 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 20:54:24.782974 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:54:24.782981 kernel: clocksource: Switched to clocksource tsc Jan 13 20:54:24.782988 kernel: Initialise system trusted keyrings Jan 13 20:54:24.782994 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 20:54:24.783000 kernel: Key type asymmetric registered Jan 13 20:54:24.783007 kernel: Asymmetric key parser 'x509' registered Jan 13 20:54:24.783013 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 20:54:24.783019 kernel: io scheduler mq-deadline registered Jan 13 20:54:24.783026 kernel: io scheduler kyber registered Jan 13 20:54:24.783033 kernel: io scheduler bfq registered Jan 13 20:54:24.783087 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 20:54:24.783141 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.783196 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 20:54:24.783247 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.783981 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 20:54:24.784043 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.784102 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 20:54:24.784157 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.784210 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 20:54:24.784263 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.784418 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 20:54:24.785667 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.785729 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 20:54:24.785784 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.785839 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 20:54:24.785890 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.785943 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 20:54:24.785995 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.786050 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 20:54:24.786101 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.786155 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 20:54:24.786206 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.786258 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 20:54:24.787381 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787447 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 20:54:24.787502 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787555 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 20:54:24.787607 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787660 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 20:54:24.787715 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787769 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 20:54:24.787827 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787880 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 20:54:24.787932 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.787985 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 20:54:24.788037 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.788092 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 20:54:24.788143 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789480 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 20:54:24.789548 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789606 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 20:54:24.789661 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789719 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 20:54:24.789772 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789824 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 20:54:24.789876 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.789927 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 20:54:24.789980 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790033 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 20:54:24.790084 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790137 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 20:54:24.790189 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790241 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 20:54:24.790344 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790398 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 20:54:24.790450 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790502 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 20:54:24.790553 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790604 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 20:54:24.790659 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790711 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 20:54:24.790762 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790814 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 20:54:24.790866 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:54:24.790877 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 20:54:24.790884 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 20:54:24.790891 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 20:54:24.790899 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 20:54:24.790905 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 20:54:24.790912 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 20:54:24.790965 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 20:54:24.791013 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T20:54:24 UTC (1736801664) Jan 13 20:54:24.791024 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 20:54:24.791070 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 20:54:24.791078 kernel: intel_pstate: CPU model not supported Jan 13 20:54:24.791085 kernel: NET: Registered PF_INET6 protocol family Jan 13 20:54:24.791091 kernel: Segment Routing with IPv6 Jan 13 20:54:24.791098 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 20:54:24.791104 kernel: NET: Registered PF_PACKET protocol family Jan 13 20:54:24.791111 kernel: Key type dns_resolver registered Jan 13 20:54:24.791117 kernel: IPI shorthand broadcast: enabled Jan 13 20:54:24.791125 kernel: sched_clock: Marking stable (872198871, 220655983)->(1150998113, -58143259) Jan 13 20:54:24.791131 kernel: registered taskstats version 1 Jan 13 20:54:24.791138 kernel: Loading compiled-in X.509 certificates Jan 13 20:54:24.791144 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 98739e9049f62881f4df7ffd1e39335f7f55b344' Jan 13 20:54:24.791150 kernel: Key type .fscrypt registered Jan 13 20:54:24.791156 kernel: Key type fscrypt-provisioning registered Jan 13 20:54:24.791162 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 20:54:24.791169 kernel: ima: Allocated hash algorithm: sha1 Jan 13 20:54:24.791176 kernel: ima: No architecture policies found Jan 13 20:54:24.791182 kernel: clk: Disabling unused clocks Jan 13 20:54:24.791188 kernel: Freeing unused kernel image (initmem) memory: 42976K Jan 13 20:54:24.791195 kernel: Write protecting the kernel read-only data: 36864k Jan 13 20:54:24.791201 kernel: Freeing unused kernel image (rodata/data gap) memory: 1840K Jan 13 20:54:24.791207 kernel: Run /init as init process Jan 13 20:54:24.791213 kernel: with arguments: Jan 13 20:54:24.791220 kernel: /init Jan 13 20:54:24.791226 kernel: with environment: Jan 13 20:54:24.791232 kernel: HOME=/ Jan 13 20:54:24.791240 kernel: TERM=linux Jan 13 20:54:24.791246 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 20:54:24.791254 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:54:24.791262 systemd[1]: Detected virtualization vmware. Jan 13 20:54:24.791269 systemd[1]: Detected architecture x86-64. Jan 13 20:54:24.791276 systemd[1]: Running in initrd. Jan 13 20:54:24.791502 systemd[1]: No hostname configured, using default hostname. Jan 13 20:54:24.791510 systemd[1]: Hostname set to . Jan 13 20:54:24.791518 systemd[1]: Initializing machine ID from random generator. Jan 13 20:54:24.791524 systemd[1]: Queued start job for default target initrd.target. Jan 13 20:54:24.791531 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:54:24.791537 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 20:54:24.791544 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 20:54:24.791551 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:54:24.791557 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 20:54:24.791565 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 20:54:24.791573 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 20:54:24.791580 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 20:54:24.791587 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:54:24.791593 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:54:24.791601 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:54:24.791607 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:54:24.791615 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:54:24.791621 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:54:24.791628 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:54:24.791634 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:54:24.791641 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 20:54:24.791647 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 20:54:24.791654 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:54:24.791660 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:54:24.791668 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 20:54:24.791675 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:54:24.791681 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 20:54:24.791688 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:54:24.791694 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 20:54:24.791701 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 20:54:24.791707 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:54:24.791714 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:54:24.791720 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:54:24.791728 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 20:54:24.791748 systemd-journald[216]: Collecting audit messages is disabled. Jan 13 20:54:24.791765 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:54:24.791772 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 20:54:24.791789 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:54:24.791796 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:54:24.791803 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 20:54:24.791810 kernel: Bridge firewalling registered Jan 13 20:54:24.791818 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:54:24.791824 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:54:24.791831 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:54:24.791838 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 13 20:54:24.791845 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 13 20:54:24.791851 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:54:24.791858 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:54:24.791865 systemd-journald[216]: Journal started
Jan 13 20:54:24.791881 systemd-journald[216]: Runtime Journal (/run/log/journal/5c9b62d1b3e940c5b774b14501140017) is 4.8M, max 38.7M, 33.8M free.
Jan 13 20:54:24.744308 systemd-modules-load[217]: Inserted module 'overlay'
Jan 13 20:54:24.764068 systemd-modules-load[217]: Inserted module 'br_netfilter'
Jan 13 20:54:24.794316 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 13 20:54:24.795430 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:54:24.800418 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 13 20:54:24.802840 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 13 20:54:24.806589 dracut-cmdline[246]: dracut-dracut-053
Jan 13 20:54:24.807880 dracut-cmdline[246]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07
Jan 13 20:54:24.809350 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:54:24.814412 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 13 20:54:24.829530 systemd-resolved[267]: Positive Trust Anchors:
Jan 13 20:54:24.829540 systemd-resolved[267]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 20:54:24.829561 systemd-resolved[267]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 13 20:54:24.831227 systemd-resolved[267]: Defaulting to hostname 'linux'.
Jan 13 20:54:24.831813 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 13 20:54:24.831966 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:54:24.850295 kernel: SCSI subsystem initialized
Jan 13 20:54:24.856289 kernel: Loading iSCSI transport class v2.0-870.
Jan 13 20:54:24.863291 kernel: iscsi: registered transport (tcp)
Jan 13 20:54:24.876296 kernel: iscsi: registered transport (qla4xxx)
Jan 13 20:54:24.876341 kernel: QLogic iSCSI HBA Driver
Jan 13 20:54:24.896233 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:54:24.900385 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 13 20:54:24.914337 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 13 20:54:24.914383 kernel: device-mapper: uevent: version 1.0.3
Jan 13 20:54:24.915945 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 13 20:54:24.947296 kernel: raid6: avx2x4 gen() 52448 MB/s
Jan 13 20:54:24.964324 kernel: raid6: avx2x2 gen() 52728 MB/s
Jan 13 20:54:24.981476 kernel: raid6: avx2x1 gen() 43904 MB/s
Jan 13 20:54:24.981497 kernel: raid6: using algorithm avx2x2 gen() 52728 MB/s
Jan 13 20:54:24.999493 kernel: raid6: .... xor() 31090 MB/s, rmw enabled
Jan 13 20:54:24.999527 kernel: raid6: using avx2x2 recovery algorithm
Jan 13 20:54:25.013293 kernel: xor: automatically using best checksumming function   avx
Jan 13 20:54:25.110298 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 13 20:54:25.115847 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:54:25.121357 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:54:25.128864 systemd-udevd[435]: Using default interface naming scheme 'v255'.
Jan 13 20:54:25.131331 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:54:25.138403 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 13 20:54:25.145416 dracut-pre-trigger[440]: rd.md=0: removing MD RAID activation
Jan 13 20:54:25.161850 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:54:25.166376 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 13 20:54:25.230982 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:54:25.234387 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 13 20:54:25.245159 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:54:25.245919 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:54:25.246416 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:54:25.246760 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 13 20:54:25.250503 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 13 20:54:25.260424 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:54:25.295289 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Jan 13 20:54:25.297297 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI
Jan 13 20:54:25.300350 kernel: vmw_pvscsi: using 64bit dma
Jan 13 20:54:25.300366 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Jan 13 20:54:25.311009 kernel: vmw_pvscsi: max_id: 16
Jan 13 20:54:25.311023 kernel: vmw_pvscsi: setting ring_pages to 8
Jan 13 20:54:25.311031 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Jan 13 20:54:25.311105 kernel: vmw_pvscsi: enabling reqCallThreshold
Jan 13 20:54:25.311115 kernel: vmw_pvscsi: driver-based request coalescing enabled
Jan 13 20:54:25.311123 kernel: vmw_pvscsi: using MSI-X
Jan 13 20:54:25.313291 kernel: cryptd: max_cpu_qlen set to 1000
Jan 13 20:54:25.317293 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Jan 13 20:54:25.322615 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Jan 13 20:54:25.322705 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Jan 13 20:54:25.324109 kernel: scsi 0:0:0:0: Direct-Access     VMware   Virtual disk     2.0  PQ: 0 ANSI: 6
Jan 13 20:54:25.326863 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:54:25.326935 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:54:25.327119 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 20:54:25.327221 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:54:25.327299 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:54:25.327414 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:54:25.331286 kernel: libata version 3.00 loaded.
Jan 13 20:54:25.334291 kernel: ata_piix 0000:00:07.1: version 2.13
Jan 13 20:54:25.338917 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 13 20:54:25.338928 kernel: scsi host1: ata_piix
Jan 13 20:54:25.339000 kernel: scsi host2: ata_piix
Jan 13 20:54:25.339059 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14
Jan 13 20:54:25.339068 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15
Jan 13 20:54:25.334720 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:54:25.341716 kernel: AES CTR mode by8 optimization enabled
Jan 13 20:54:25.350250 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:54:25.357369 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 20:54:25.365566 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:54:25.511343 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Jan 13 20:54:25.518345 kernel: scsi 2:0:0:0: CD-ROM          NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Jan 13 20:54:25.529517 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Jan 13 20:54:25.532980 kernel: sd 0:0:0:0: [sda] Write Protect is off
Jan 13 20:54:25.533176 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Jan 13 20:54:25.533247 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Jan 13 20:54:25.533328 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Jan 13 20:54:25.533399 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 13 20:54:25.533408 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Jan 13 20:54:25.549317 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Jan 13 20:54:25.557232 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 13 20:54:25.557244 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Jan 13 20:54:25.561665 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (493)
Jan 13 20:54:25.563882 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Jan 13 20:54:25.564288 kernel: BTRFS: device fsid 5e7921ba-229a-48a0-bc77-9b30aaa34aeb devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (495)
Jan 13 20:54:25.566564 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Jan 13 20:54:25.569203 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Jan 13 20:54:25.571270 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Jan 13 20:54:25.571396 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Jan 13 20:54:25.580351 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 13 20:54:25.613315 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 13 20:54:26.622302 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 13 20:54:26.622950 disk-uuid[589]: The operation has completed successfully.
Jan 13 20:54:26.658831 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 13 20:54:26.658900 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 13 20:54:26.662378 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jan 13 20:54:26.664179 sh[608]: Success
Jan 13 20:54:26.673301 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Jan 13 20:54:26.733179 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jan 13 20:54:26.734366 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jan 13 20:54:26.734669 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jan 13 20:54:26.751782 kernel: BTRFS info (device dm-0): first mount of filesystem 5e7921ba-229a-48a0-bc77-9b30aaa34aeb
Jan 13 20:54:26.751820 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:54:26.751829 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Jan 13 20:54:26.752881 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 13 20:54:26.754346 kernel: BTRFS info (device dm-0): using free space tree
Jan 13 20:54:26.761298 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Jan 13 20:54:26.763430 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jan 13 20:54:26.771381 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Jan 13 20:54:26.772532 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 13 20:54:26.796158 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e
Jan 13 20:54:26.796200 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:54:26.796214 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:54:26.815298 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:54:26.820723 systemd[1]: mnt-oem.mount: Deactivated successfully.
Jan 13 20:54:26.822353 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e
Jan 13 20:54:26.825985 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 13 20:54:26.829446 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 13 20:54:26.844381 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 20:54:26.851389 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 13 20:54:26.910210 ignition[667]: Ignition 2.20.0
Jan 13 20:54:26.910217 ignition[667]: Stage: fetch-offline
Jan 13 20:54:26.910236 ignition[667]: no configs at "/usr/lib/ignition/base.d"
Jan 13 20:54:26.910241 ignition[667]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:54:26.910298 ignition[667]: parsed url from cmdline: ""
Jan 13 20:54:26.910300 ignition[667]: no config URL provided
Jan 13 20:54:26.910303 ignition[667]: reading system config file "/usr/lib/ignition/user.ign"
Jan 13 20:54:26.910310 ignition[667]: no config at "/usr/lib/ignition/user.ign"
Jan 13 20:54:26.910657 ignition[667]: config successfully fetched
Jan 13 20:54:26.910673 ignition[667]: parsing config with SHA512: de55b6cdf32be21cf51cb6917001c7e1789efffaf777134b15c13b6502dfb24067e622287d497bc7f083a92acb9fc1b30c011c80d10e3bc750d2f98feafb6d8c
Jan 13 20:54:26.914198 unknown[667]: fetched base config from "system"
Jan 13 20:54:26.914204 unknown[667]: fetched user config from "vmware"
Jan 13 20:54:26.914738 ignition[667]: fetch-offline: fetch-offline passed
Jan 13 20:54:26.914888 ignition[667]: Ignition finished successfully
Jan 13 20:54:26.915529 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:54:26.917160 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:54:26.923399 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 13 20:54:26.935070 systemd-networkd[801]: lo: Link UP
Jan 13 20:54:26.935077 systemd-networkd[801]: lo: Gained carrier
Jan 13 20:54:26.935772 systemd-networkd[801]: Enumeration completed
Jan 13 20:54:26.936032 systemd-networkd[801]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Jan 13 20:54:26.936035 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 13 20:54:26.936491 systemd[1]: Reached target network.target - Network.
Jan 13 20:54:26.936589 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Jan 13 20:54:26.939345 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Jan 13 20:54:26.939458 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Jan 13 20:54:26.939206 systemd-networkd[801]: ens192: Link UP
Jan 13 20:54:26.939208 systemd-networkd[801]: ens192: Gained carrier
Jan 13 20:54:26.940379 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jan 13 20:54:26.947995 ignition[803]: Ignition 2.20.0
Jan 13 20:54:26.948001 ignition[803]: Stage: kargs
Jan 13 20:54:26.948094 ignition[803]: no configs at "/usr/lib/ignition/base.d"
Jan 13 20:54:26.948101 ignition[803]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:54:26.948619 ignition[803]: kargs: kargs passed
Jan 13 20:54:26.948642 ignition[803]: Ignition finished successfully
Jan 13 20:54:26.949695 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jan 13 20:54:26.954391 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jan 13 20:54:26.960988 ignition[810]: Ignition 2.20.0
Jan 13 20:54:26.960995 ignition[810]: Stage: disks
Jan 13 20:54:26.961124 ignition[810]: no configs at "/usr/lib/ignition/base.d"
Jan 13 20:54:26.961135 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:54:26.961662 ignition[810]: disks: disks passed
Jan 13 20:54:26.961689 ignition[810]: Ignition finished successfully
Jan 13 20:54:26.962559 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jan 13 20:54:26.962793 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jan 13 20:54:26.962926 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jan 13 20:54:26.963119 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 13 20:54:26.963309 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 13 20:54:26.963480 systemd[1]: Reached target basic.target - Basic System.
Jan 13 20:54:26.966392 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jan 13 20:54:26.977353 systemd-fsck[818]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Jan 13 20:54:26.978222 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jan 13 20:54:26.982392 systemd[1]: Mounting sysroot.mount - /sysroot...
Jan 13 20:54:27.038340 kernel: EXT4-fs (sda9): mounted filesystem 84bcd1b2-5573-4e91-8fd5-f97782397085 r/w with ordered data mode. Quota mode: none.
Jan 13 20:54:27.037959 systemd[1]: Mounted sysroot.mount - /sysroot.
Jan 13 20:54:27.038298 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jan 13 20:54:27.043336 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 20:54:27.044329 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jan 13 20:54:27.044602 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Jan 13 20:54:27.044629 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jan 13 20:54:27.044643 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:54:27.048462 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jan 13 20:54:27.049310 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jan 13 20:54:27.055213 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (826)
Jan 13 20:54:27.055244 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e
Jan 13 20:54:27.055259 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:54:27.056836 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:54:27.060349 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:54:27.061082 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:54:27.079181 initrd-setup-root[850]: cut: /sysroot/etc/passwd: No such file or directory
Jan 13 20:54:27.081744 initrd-setup-root[857]: cut: /sysroot/etc/group: No such file or directory
Jan 13 20:54:27.083920 initrd-setup-root[864]: cut: /sysroot/etc/shadow: No such file or directory
Jan 13 20:54:27.085904 initrd-setup-root[871]: cut: /sysroot/etc/gshadow: No such file or directory
Jan 13 20:54:27.138752 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 13 20:54:27.143369 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 13 20:54:27.144794 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 13 20:54:27.150311 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e
Jan 13 20:54:27.165384 ignition[939]: INFO : Ignition 2.20.0
Jan 13 20:54:27.165384 ignition[939]: INFO : Stage: mount
Jan 13 20:54:27.165723 ignition[939]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:54:27.165723 ignition[939]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:54:27.166471 ignition[939]: INFO : mount: mount passed
Jan 13 20:54:27.166471 ignition[939]: INFO : Ignition finished successfully
Jan 13 20:54:27.166106 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 13 20:54:27.166586 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 13 20:54:27.169398 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 13 20:54:27.749974 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 13 20:54:27.754417 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 20:54:27.761291 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (950)
Jan 13 20:54:27.763639 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e
Jan 13 20:54:27.763656 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:54:27.763668 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:54:27.767291 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:54:27.768169 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:54:27.783366 ignition[967]: INFO : Ignition 2.20.0
Jan 13 20:54:27.783366 ignition[967]: INFO : Stage: files
Jan 13 20:54:27.783850 ignition[967]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:54:27.783850 ignition[967]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:54:27.784104 ignition[967]: DEBUG : files: compiled without relabeling support, skipping
Jan 13 20:54:27.784423 ignition[967]: INFO : files: ensureUsers: op(1): [started]  creating or modifying user "core"
Jan 13 20:54:27.784423 ignition[967]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 13 20:54:27.786456 ignition[967]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 13 20:54:27.786592 ignition[967]: INFO : files: ensureUsers: op(2): [started]  adding ssh keys to user "core"
Jan 13 20:54:27.786727 ignition[967]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 13 20:54:27.786676 unknown[967]: wrote ssh authorized keys file for user: core
Jan 13 20:54:27.788333 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started]  writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:54:27.788584 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jan 13 20:54:27.821363 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 13 20:54:27.934010 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started]  writing file "/sysroot/home/core/install.sh"
Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started]  writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:54:27.934260 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started]  writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started]  writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started]  writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started]  writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Jan 13 20:54:27.935060 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started]  writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Jan 13 20:54:27.936355 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1
Jan 13 20:54:28.241387 systemd-networkd[801]: ens192: Gained IPv6LL
Jan 13 20:54:28.393276 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 13 20:54:28.563576 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Jan 13 20:54:28.563576 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started]  writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(c): [started]  processing unit "prepare-helm.service"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(c): op(d): [started]  writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(e): [started]  processing unit "coreos-metadata.service"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(e): op(f): [started]  writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Jan 13 20:54:28.564376 ignition[967]: INFO : files: op(10): [started]  setting preset to disabled for "coreos-metadata.service"
Jan 13 20:54:28.604180 ignition[967]: INFO : files: op(10): op(11): [started]  removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:54:28.606707 ignition[967]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:54:28.606874 ignition[967]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:54:28.606874 ignition[967]: INFO : files: op(12): [started]  setting preset to enabled for "prepare-helm.service"
Jan 13 20:54:28.606874 ignition[967]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Jan 13 20:54:28.607880 ignition[967]: INFO : files: createResultFile: createFiles: op(13): [started]  writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:54:28.607880 ignition[967]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:54:28.607880 ignition[967]: INFO : files: files passed
Jan 13 20:54:28.607880 ignition[967]: INFO : Ignition finished successfully
Jan 13 20:54:28.608543 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 13 20:54:28.612479 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 13 20:54:28.613887 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 13 20:54:28.614311 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 13 20:54:28.614362 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 13 20:54:28.621449 initrd-setup-root-after-ignition[998]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:54:28.621449 initrd-setup-root-after-ignition[998]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:54:28.621885 initrd-setup-root-after-ignition[1002]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:54:28.622527 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:54:28.623056 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 13 20:54:28.627406 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 13 20:54:28.644520 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 13 20:54:28.644577 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 13 20:54:28.645210 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 13 20:54:28.645482 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 13 20:54:28.645713 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 13 20:54:28.646340 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 13 20:54:28.659103 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:54:28.663396 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 13 20:54:28.668562 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:54:28.668831 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:54:28.669124 systemd[1]: Stopped target timers.target - Timer Units.
Jan 13 20:54:28.669394 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 13 20:54:28.669465 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:54:28.669962 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 13 20:54:28.670221 systemd[1]: Stopped target basic.target - Basic System.
Jan 13 20:54:28.670449 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 13 20:54:28.670718 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:54:28.671007 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 13 20:54:28.671446 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 13 20:54:28.671582 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:54:28.671860 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 13 20:54:28.672245 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 13 20:54:28.672498 systemd[1]: Stopped target swap.target - Swaps.
Jan 13 20:54:28.672704 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 13 20:54:28.672777 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:54:28.673242 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:54:28.673502 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:54:28.673641 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 13 20:54:28.673688 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:54:28.673865 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 13 20:54:28.673934 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:54:28.674190 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 13 20:54:28.674252 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:54:28.674511 systemd[1]: Stopped target paths.target - Path Units.
Jan 13 20:54:28.674654 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 13 20:54:28.678327 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:54:28.678512 systemd[1]: Stopped target slices.target - Slice Units.
Jan 13 20:54:28.678716 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 13 20:54:28.678906 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 13 20:54:28.678974 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 20:54:28.679182 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 13 20:54:28.679229 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 20:54:28.679507 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 13 20:54:28.679591 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:54:28.679827 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 13 20:54:28.679903 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 13 20:54:28.688417 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 13 20:54:28.690409 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 13 20:54:28.690510 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 13 20:54:28.690602 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:54:28.690878 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 13 20:54:28.690957 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:54:28.693960 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 13 20:54:28.694022 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 13 20:54:28.697984 ignition[1022]: INFO : Ignition 2.20.0
Jan 13 20:54:28.698398 ignition[1022]: INFO : Stage: umount
Jan 13 20:54:28.698398 ignition[1022]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:54:28.698398 ignition[1022]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:54:28.699346 ignition[1022]: INFO : umount: umount passed
Jan 13 20:54:28.699512 ignition[1022]: INFO : Ignition finished successfully
Jan 13 20:54:28.700292 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 13 20:54:28.700464 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 13 20:54:28.700673 systemd[1]: Stopped target network.target - Network.
Jan 13 20:54:28.700769 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 13 20:54:28.700794 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 13 20:54:28.700942 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 13 20:54:28.700963 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 13 20:54:28.701105 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 13 20:54:28.701125 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 13 20:54:28.701270 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 13 20:54:28.701299 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 13 20:54:28.701892 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 13 20:54:28.702292 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 13 20:54:28.705068 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 13 20:54:28.705131 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 13 20:54:28.705903 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 13 20:54:28.705930 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:54:28.711377 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 13 20:54:28.711477 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 13 20:54:28.711509 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:54:28.711637 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jan 13 20:54:28.711660 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 20:54:28.711816 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:54:28.712728 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 13 20:54:28.715212 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 13 20:54:28.715273 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 13 20:54:28.716204 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 13 20:54:28.716246 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:54:28.716502 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 13 20:54:28.716525 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:54:28.716668 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 13 20:54:28.716690 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:54:28.719922 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 13 20:54:28.719987 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 13 20:54:28.723719 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 13 20:54:28.723906 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:54:28.724309 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 13 20:54:28.724331 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:54:28.724710 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 13 20:54:28.724727 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:54:28.724957 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 13 20:54:28.724979 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:54:28.725256 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 13 20:54:28.725415 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:54:28.725684 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:54:28.725708 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:54:28.730371 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 13 20:54:28.730481 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 13 20:54:28.730509 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:54:28.730634 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 13 20:54:28.730655 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 20:54:28.730773 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 13 20:54:28.730794 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:54:28.730911 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:54:28.730931 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:54:28.735400 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 13 20:54:28.735613 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 13 20:54:28.811857 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 13 20:54:28.811924 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 13 20:54:28.812295 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 13 20:54:28.812482 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 13 20:54:28.812520 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 13 20:54:28.819500 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 13 20:54:28.824464 systemd[1]: Switching root.
Jan 13 20:54:28.858510 systemd-journald[216]: Journal stopped
Jan 13 20:54:29.896432 systemd-journald[216]: Received SIGTERM from PID 1 (systemd).
Jan 13 20:54:29.896453 kernel: SELinux: policy capability network_peer_controls=1
Jan 13 20:54:29.896461 kernel: SELinux: policy capability open_perms=1
Jan 13 20:54:29.896467 kernel: SELinux: policy capability extended_socket_class=1
Jan 13 20:54:29.896472 kernel: SELinux: policy capability always_check_network=0
Jan 13 20:54:29.896477 kernel: SELinux: policy capability cgroup_seclabel=1
Jan 13 20:54:29.896485 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jan 13 20:54:29.896491 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jan 13 20:54:29.896496 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jan 13 20:54:29.896502 systemd[1]: Successfully loaded SELinux policy in 37.005ms.
Jan 13 20:54:29.896509 kernel: audit: type=1403 audit(1736801669.381:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 13 20:54:29.896515 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 6.842ms.
Jan 13 20:54:29.896522 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 13 20:54:29.896530 systemd[1]: Detected virtualization vmware.
Jan 13 20:54:29.896537 systemd[1]: Detected architecture x86-64.
Jan 13 20:54:29.896543 systemd[1]: Detected first boot.
Jan 13 20:54:29.896550 systemd[1]: Initializing machine ID from random generator.
Jan 13 20:54:29.896558 zram_generator::config[1065]: No configuration found.
Jan 13 20:54:29.896566 systemd[1]: Populated /etc with preset unit settings.
Jan 13 20:54:29.896573 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 13 20:54:29.896580 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Jan 13 20:54:29.896586 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 13 20:54:29.896592 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 13 20:54:29.896599 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 13 20:54:29.896607 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 13 20:54:29.896614 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 13 20:54:29.896620 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 13 20:54:29.896627 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 13 20:54:29.896633 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 13 20:54:29.896640 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 13 20:54:29.896646 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 13 20:54:29.896654 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 13 20:54:29.896661 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:54:29.896668 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:54:29.896674 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 13 20:54:29.896681 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 13 20:54:29.896687 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 13 20:54:29.896694 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 13 20:54:29.896701 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jan 13 20:54:29.896709 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:54:29.896716 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jan 13 20:54:29.896724 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jan 13 20:54:29.896731 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jan 13 20:54:29.896737 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 13 20:54:29.896744 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:54:29.896751 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 13 20:54:29.896758 systemd[1]: Reached target slices.target - Slice Units.
Jan 13 20:54:29.896766 systemd[1]: Reached target swap.target - Swaps.
Jan 13 20:54:29.897506 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 13 20:54:29.897516 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 13 20:54:29.897524 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:54:29.897531 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:54:29.897541 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:54:29.897548 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 13 20:54:29.897555 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 13 20:54:29.897562 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 13 20:54:29.897569 systemd[1]: Mounting media.mount - External Media Directory...
Jan 13 20:54:29.897578 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:54:29.897585 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 13 20:54:29.897592 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 13 20:54:29.897600 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 13 20:54:29.897608 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 13 20:54:29.897615 systemd[1]: Reached target machines.target - Containers.
Jan 13 20:54:29.897622 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 13 20:54:29.897629 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Jan 13 20:54:29.897636 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 13 20:54:29.897643 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 13 20:54:29.897649 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 20:54:29.897658 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 20:54:29.897665 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 20:54:29.897672 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 13 20:54:29.897679 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 20:54:29.897686 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 13 20:54:29.897693 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 13 20:54:29.897700 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jan 13 20:54:29.897707 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jan 13 20:54:29.897714 systemd[1]: Stopped systemd-fsck-usr.service.
Jan 13 20:54:29.897722 kernel: fuse: init (API version 7.39)
Jan 13 20:54:29.897729 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 13 20:54:29.897736 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 13 20:54:29.897743 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 13 20:54:29.897750 kernel: loop: module loaded
Jan 13 20:54:29.897756 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 13 20:54:29.897763 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 13 20:54:29.897770 systemd[1]: verity-setup.service: Deactivated successfully.
Jan 13 20:54:29.897777 systemd[1]: Stopped verity-setup.service.
Jan 13 20:54:29.897785 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:54:29.897792 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 13 20:54:29.897799 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 13 20:54:29.897806 systemd[1]: Mounted media.mount - External Media Directory.
Jan 13 20:54:29.897813 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 13 20:54:29.897833 systemd-journald[1166]: Collecting audit messages is disabled.
Jan 13 20:54:29.897850 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 13 20:54:29.897858 systemd-journald[1166]: Journal started
Jan 13 20:54:29.897873 systemd-journald[1166]: Runtime Journal (/run/log/journal/04cabe9981e6475699431324cb7ce93e) is 4.8M, max 38.7M, 33.8M free.
Jan 13 20:54:29.727562 systemd[1]: Queued start job for default target multi-user.target.
Jan 13 20:54:29.746455 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jan 13 20:54:29.746694 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 13 20:54:29.898911 jq[1132]: true
Jan 13 20:54:29.903196 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 13 20:54:29.903221 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 13 20:54:29.900990 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jan 13 20:54:29.902483 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:54:29.902715 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 13 20:54:29.902804 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 13 20:54:29.903035 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 20:54:29.903104 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 20:54:29.903827 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 20:54:29.903908 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 20:54:29.904130 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 13 20:54:29.904201 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 13 20:54:29.904433 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 20:54:29.904513 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 20:54:29.904740 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:54:29.904961 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 13 20:54:29.906437 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 13 20:54:29.912451 kernel: ACPI: bus type drm_connector registered
Jan 13 20:54:29.915111 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 13 20:54:29.915457 jq[1174]: true
Jan 13 20:54:29.915594 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 13 20:54:29.917042 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 13 20:54:29.921457 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jan 13 20:54:29.923217 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jan 13 20:54:29.923576 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jan 13 20:54:29.924342 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 13 20:54:29.925192 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Jan 13 20:54:29.929437 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jan 13 20:54:29.933394 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jan 13 20:54:29.933587 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 20:54:29.937179 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jan 13 20:54:29.941422 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jan 13 20:54:29.941575 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 13 20:54:29.949973 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jan 13 20:54:29.950120 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 13 20:54:29.951470 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 13 20:54:29.954997 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jan 13 20:54:29.965414 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 13 20:54:29.966686 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jan 13 20:54:29.967474 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jan 13 20:54:29.967752 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jan 13 20:54:29.986413 kernel: loop0: detected capacity change from 0 to 2944
Jan 13 20:54:29.978070 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jan 13 20:54:29.980559 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jan 13 20:54:29.986645 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Jan 13 20:54:29.999669 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jan 13 20:54:30.001023 systemd-journald[1166]: Time spent on flushing to /var/log/journal/04cabe9981e6475699431324cb7ce93e is 52.749ms for 1839 entries.
Jan 13 20:54:30.001023 systemd-journald[1166]: System Journal (/var/log/journal/04cabe9981e6475699431324cb7ce93e) is 8.0M, max 584.8M, 576.8M free.
Jan 13 20:54:30.103445 systemd-journald[1166]: Received client request to flush runtime journal.
Jan 13 20:54:30.103484 kernel: loop1: detected capacity change from 0 to 140992
Jan 13 20:54:30.050551 ignition[1190]: Ignition 2.20.0
Jan 13 20:54:30.050725 ignition[1190]: deleting config from guestinfo properties
Jan 13 20:54:30.116739 ignition[1190]: Successfully deleted config
Jan 13 20:54:30.113510 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:54:30.115443 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jan 13 20:54:30.130768 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Jan 13 20:54:30.131667 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Jan 13 20:54:30.132018 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:54:30.140953 udevadm[1223]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Jan 13 20:54:30.141867 systemd-tmpfiles[1204]: ACLs are not supported, ignoring.
Jan 13 20:54:30.141882 systemd-tmpfiles[1204]: ACLs are not supported, ignoring.
Jan 13 20:54:30.155395 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 20:54:30.161409 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jan 13 20:54:30.162011 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jan 13 20:54:30.164487 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Jan 13 20:54:30.176630 kernel: loop2: detected capacity change from 0 to 138184
Jan 13 20:54:30.201980 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jan 13 20:54:30.209521 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 13 20:54:30.219298 kernel: loop3: detected capacity change from 0 to 205544
Jan 13 20:54:30.220507 systemd-tmpfiles[1234]: ACLs are not supported, ignoring.
Jan 13 20:54:30.220518 systemd-tmpfiles[1234]: ACLs are not supported, ignoring.
Jan 13 20:54:30.223251 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:54:30.291296 kernel: loop4: detected capacity change from 0 to 2944
Jan 13 20:54:30.304335 kernel: loop5: detected capacity change from 0 to 140992
Jan 13 20:54:30.369295 kernel: loop6: detected capacity change from 0 to 138184
Jan 13 20:54:30.410522 kernel: loop7: detected capacity change from 0 to 205544
Jan 13 20:54:30.439867 (sd-merge)[1238]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Jan 13 20:54:30.440186 (sd-merge)[1238]: Merged extensions into '/usr'.
Jan 13 20:54:30.445066 systemd[1]: Reloading requested from client PID 1203 ('systemd-sysext') (unit systemd-sysext.service)...
Jan 13 20:54:30.445142 systemd[1]: Reloading...
Jan 13 20:54:30.493299 zram_generator::config[1260]: No configuration found.
Jan 13 20:54:30.575657 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10."
| grep -Po "inet \K[\d.]+")
Jan 13 20:54:30.603259 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 20:54:30.636596 systemd[1]: Reloading finished in 190 ms.
Jan 13 20:54:30.655758 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jan 13 20:54:30.664431 systemd[1]: Starting ensure-sysext.service...
Jan 13 20:54:30.665732 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 13 20:54:30.682493 systemd-tmpfiles[1320]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jan 13 20:54:30.682722 systemd-tmpfiles[1320]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jan 13 20:54:30.682927 systemd[1]: Reloading requested from client PID 1319 ('systemctl') (unit ensure-sysext.service)...
Jan 13 20:54:30.682988 systemd[1]: Reloading...
Jan 13 20:54:30.683275 systemd-tmpfiles[1320]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jan 13 20:54:30.683479 systemd-tmpfiles[1320]: ACLs are not supported, ignoring.
Jan 13 20:54:30.683518 systemd-tmpfiles[1320]: ACLs are not supported, ignoring.
Jan 13 20:54:30.701379 systemd-tmpfiles[1320]: Detected autofs mount point /boot during canonicalization of boot.
Jan 13 20:54:30.701386 systemd-tmpfiles[1320]: Skipping /boot
Jan 13 20:54:30.708674 systemd-tmpfiles[1320]: Detected autofs mount point /boot during canonicalization of boot.
Jan 13 20:54:30.708685 systemd-tmpfiles[1320]: Skipping /boot
Jan 13 20:54:30.735912 zram_generator::config[1348]: No configuration found.
Jan 13 20:54:30.824006 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10."
| grep -Po "inet \K[\d.]+")
Jan 13 20:54:30.844019 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 20:54:30.873095 ldconfig[1198]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jan 13 20:54:30.880170 systemd[1]: Reloading finished in 196 ms.
Jan 13 20:54:30.893768 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jan 13 20:54:30.951618 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:54:30.957792 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 13 20:54:30.982458 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jan 13 20:54:30.984478 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jan 13 20:54:30.986867 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 13 20:54:30.990836 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:54:30.993482 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jan 13 20:54:30.994332 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jan 13 20:54:30.998043 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:54:31.008031 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 20:54:31.008784 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 20:54:31.009517 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 20:54:31.009792 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 20:54:31.011243 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jan 13 20:54:31.011396 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:54:31.014024 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:54:31.014217 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 20:54:31.014316 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:54:31.016667 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:54:31.022954 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 20:54:31.023136 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 20:54:31.023225 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:54:31.024552 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jan 13 20:54:31.024968 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 20:54:31.025070 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 20:54:31.028568 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 20:54:31.028690 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 20:54:31.029267 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:54:31.029915 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:54:31.030621 systemd-udevd[1411]: Using default interface naming scheme 'v255'. Jan 13 20:54:31.030828 systemd[1]: Finished ensure-sysext.service. Jan 13 20:54:31.033327 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 20:54:31.033483 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 20:54:31.033984 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 20:54:31.034231 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 20:54:31.041471 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 13 20:54:31.057040 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 13 20:54:31.115140 augenrules[1450]: No rules Jan 13 20:54:31.115104 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 20:54:31.115227 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 20:54:31.115404 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 13 20:54:31.115831 systemd[1]: Reached target time-set.target - System Time Set. Jan 13 20:54:31.117993 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 13 20:54:31.122502 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 13 20:54:31.135689 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 13 20:54:31.140955 systemd-resolved[1410]: Positive Trust Anchors: Jan 13 20:54:31.140965 systemd-resolved[1410]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 20:54:31.140987 systemd-resolved[1410]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:54:31.168859 systemd-resolved[1410]: Defaulting to hostname 'linux'. Jan 13 20:54:31.169937 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:54:31.170147 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:54:31.282133 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:54:31.290775 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 20:54:31.312377 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 13 20:54:31.362290 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 Jan 13 20:54:31.364299 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1462) Jan 13 20:54:31.367331 kernel: ACPI: button: Power Button [PWRF] Jan 13 20:54:31.391026 systemd-networkd[1464]: lo: Link UP Jan 13 20:54:31.391032 systemd-networkd[1464]: lo: Gained carrier Jan 13 20:54:31.392505 systemd-networkd[1464]: Enumeration completed Jan 13 20:54:31.392562 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:54:31.392715 systemd[1]: Reached target network.target - Network. 
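The "Positive Trust Anchors" entry above is systemd-resolved logging its built-in DNSSEC root trust anchor (the DS record for `.`), followed by the default negative trust anchors for private and special-use zones. Per the dnssec-trust-anchors.d(5) scheme, the same anchor could be supplied explicitly in a `.positive` file; a sketch (the file path is an assumption following that scheme, the DS record itself is the one logged at boot):

```
# /etc/dnssec-trust-anchors.d/root.positive  (hypothetical drop-in; path
# and suffix follow dnssec-trust-anchors.d(5))
# DS record copied verbatim from the systemd-resolved log line above.
. IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
```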
Jan 13 20:54:31.393968 systemd-networkd[1464]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Jan 13 20:54:31.395337 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 20:54:31.395458 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 20:54:31.401212 systemd-networkd[1464]: ens192: Link UP Jan 13 20:54:31.401684 systemd-networkd[1464]: ens192: Gained carrier Jan 13 20:54:31.403130 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 13 20:54:31.405110 systemd-timesyncd[1431]: Network configuration changed, trying to establish connection. Jan 13 20:54:31.425294 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Jan 13 20:54:31.440343 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 Jan 13 20:54:31.458438 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:54:31.466021 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 20:54:31.470430 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Jan 13 20:54:31.471476 kernel: mousedev: PS/2 mouse device common for all mice Jan 13 20:54:31.471489 kernel: Guest personality initialized and is active Jan 13 20:54:31.472396 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Jan 13 20:54:31.472419 kernel: Initialized host personality Jan 13 20:54:31.473428 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 13 20:54:31.476063 (udev-worker)[1476]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Jan 13 20:54:31.520477 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 13 20:54:31.532625 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. 
Jan 13 20:54:31.537473 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Jan 13 20:54:31.579481 lvm[1501]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 13 20:54:31.621987 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Jan 13 20:54:31.622630 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:54:31.625446 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 13 20:54:31.630504 lvm[1503]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 13 20:54:31.669324 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 13 20:54:31.748206 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 13 20:54:31.748465 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 13 20:54:31.762602 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:54:31.763227 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:54:31.763456 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 13 20:54:31.763607 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 13 20:54:31.763853 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 13 20:54:31.764053 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 13 20:54:31.764187 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
Jan 13 20:54:31.764336 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 13 20:54:31.764360 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:54:31.764450 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:54:31.764998 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 13 20:54:31.766185 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 13 20:54:31.770646 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 13 20:54:31.771207 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 13 20:54:31.771367 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:54:31.771459 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:54:31.771577 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 13 20:54:31.771596 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 13 20:54:31.772516 systemd[1]: Starting containerd.service - containerd container runtime... Jan 13 20:54:31.775517 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 13 20:54:31.778253 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 13 20:54:31.779427 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 13 20:54:31.779549 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 13 20:54:31.782232 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 13 20:54:31.784387 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... 
Jan 13 20:54:31.786420 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 13 20:54:31.789883 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 13 20:54:31.794407 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 13 20:54:31.794791 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 13 20:54:31.795359 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 13 20:54:31.797441 systemd[1]: Starting update-engine.service - Update Engine... Jan 13 20:54:31.797862 jq[1513]: false Jan 13 20:54:31.798781 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 13 20:54:31.800374 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Jan 13 20:54:31.801813 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 13 20:54:31.801997 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 13 20:54:31.812616 dbus-daemon[1512]: [system] SELinux support is enabled Jan 13 20:54:31.816503 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 13 20:54:31.821649 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 13 20:54:31.821813 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 13 20:54:31.826143 jq[1521]: true Jan 13 20:54:31.828207 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 13 20:54:31.828253 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Jan 13 20:54:31.828480 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 13 20:54:31.828497 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 13 20:54:31.841452 update_engine[1520]: I20250113 20:54:31.840437 1520 main.cc:92] Flatcar Update Engine starting Jan 13 20:54:31.838725 systemd[1]: motdgen.service: Deactivated successfully. Jan 13 20:54:31.840619 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 13 20:54:31.844214 (ntainerd)[1537]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 13 20:54:31.847233 jq[1534]: true Jan 13 20:54:31.847700 systemd[1]: Started update-engine.service - Update Engine. Jan 13 20:54:31.847758 update_engine[1520]: I20250113 20:54:31.847715 1520 update_check_scheduler.cc:74] Next update check in 4m1s Jan 13 20:54:31.851182 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Jan 13 20:54:31.855384 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... 
Jan 13 20:54:31.856489 extend-filesystems[1514]: Found loop4 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found loop5 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found loop6 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found loop7 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found sda Jan 13 20:54:31.856489 extend-filesystems[1514]: Found sda1 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found sda2 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found sda3 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found usr Jan 13 20:54:31.856489 extend-filesystems[1514]: Found sda4 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found sda6 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found sda7 Jan 13 20:54:31.856489 extend-filesystems[1514]: Found sda9 Jan 13 20:54:31.856489 extend-filesystems[1514]: Checking size of /dev/sda9 Jan 13 20:54:31.858673 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 13 20:54:31.883472 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Jan 13 20:54:31.885400 tar[1529]: linux-amd64/helm Jan 13 20:54:31.893081 extend-filesystems[1514]: Old size kept for /dev/sda9 Jan 13 20:54:31.893240 extend-filesystems[1514]: Found sr0 Jan 13 20:54:31.894199 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 13 20:54:31.894331 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 13 20:54:31.899271 bash[1568]: Updated "/home/core/.ssh/authorized_keys" Jan 13 20:54:31.900700 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 13 20:54:31.901134 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
Jan 13 20:54:31.912634 kernel: NET: Registered PF_VSOCK protocol family Jan 13 20:54:31.917435 unknown[1545]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Jan 13 20:54:31.920424 unknown[1545]: Core dump limit set to -1 Jan 13 20:54:31.949295 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1473) Jan 13 20:54:31.953252 systemd-logind[1519]: Watching system buttons on /dev/input/event1 (Power Button) Jan 13 20:54:31.955402 systemd-logind[1519]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 13 20:54:31.955515 systemd-logind[1519]: New seat seat0. Jan 13 20:54:31.963997 systemd[1]: Started systemd-logind.service - User Login Management. Jan 13 20:54:32.035387 locksmithd[1549]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 13 20:54:32.216710 containerd[1537]: time="2025-01-13T20:54:32.216639407Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Jan 13 20:54:32.229002 sshd_keygen[1551]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 13 20:54:32.259792 containerd[1537]: time="2025-01-13T20:54:32.258299831Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:54:32.262367 containerd[1537]: time="2025-01-13T20:54:32.262342408Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262430080Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." 
type=io.containerd.event.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262446428Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262537137Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262546775Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262582036Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262590193Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262675003Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262683607Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262690603Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262695891Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." 
type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262744431Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263018 containerd[1537]: time="2025-01-13T20:54:32.262855870Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263264 containerd[1537]: time="2025-01-13T20:54:32.262908283Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:54:32.263264 containerd[1537]: time="2025-01-13T20:54:32.262915772Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Jan 13 20:54:32.263264 containerd[1537]: time="2025-01-13T20:54:32.262956131Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Jan 13 20:54:32.263264 containerd[1537]: time="2025-01-13T20:54:32.262981239Z" level=info msg="metadata content store policy set" policy=shared Jan 13 20:54:32.269027 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270345346Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270391412Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270405914Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." 
type=io.containerd.lease.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270421989Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270431024Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270522491Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270670338Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270728909Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270739274Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270747257Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270755233Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270763254Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270770370Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." 
type=io.containerd.service.v1 Jan 13 20:54:32.271298 containerd[1537]: time="2025-01-13T20:54:32.270777914Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270785891Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270808030Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270817989Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270825707Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270840120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270852198Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270864040Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270875260Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270882797Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." 
type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270890749Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270901198Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270910018Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270923920Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271531 containerd[1537]: time="2025-01-13T20:54:32.270934744Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.270943131Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.270950029Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.270957356Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.270967372Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.270983328Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.270994684Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." 
type=io.containerd.grpc.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.271001632Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.271026215Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.271038223Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.271045610Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.271052727Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.271057594Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.271064718Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Jan 13 20:54:32.271714 containerd[1537]: time="2025-01-13T20:54:32.271070334Z" level=info msg="NRI interface is disabled by configuration." Jan 13 20:54:32.271923 containerd[1537]: time="2025-01-13T20:54:32.271076894Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Jan 13 20:54:32.271945 containerd[1537]: time="2025-01-13T20:54:32.271292499Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 13 20:54:32.271945 containerd[1537]: time="2025-01-13T20:54:32.271322336Z" level=info msg="Connect containerd service" Jan 13 20:54:32.271945 containerd[1537]: time="2025-01-13T20:54:32.271348899Z" level=info msg="using legacy CRI server" Jan 13 20:54:32.271945 containerd[1537]: time="2025-01-13T20:54:32.271353409Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 13 20:54:32.271945 containerd[1537]: time="2025-01-13T20:54:32.271425936Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 13 20:54:32.271945 containerd[1537]: time="2025-01-13T20:54:32.271769673Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272262578Z" level=info msg="Start subscribing containerd event" Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272400959Z" level=info msg="Start recovering state" Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272441161Z" level=info msg="Start event monitor" Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272451531Z" level=info msg="Start 
snapshots syncer" Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272460316Z" level=info msg="Start cni network conf syncer for default" Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272466221Z" level=info msg="Start streaming server" Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272636110Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272662672Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 13 20:54:32.275010 containerd[1537]: time="2025-01-13T20:54:32.272743649Z" level=info msg="containerd successfully booted in 0.057779s" Jan 13 20:54:32.274479 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 13 20:54:32.274674 systemd[1]: Started containerd.service - containerd container runtime. Jan 13 20:54:32.283113 systemd[1]: issuegen.service: Deactivated successfully. Jan 13 20:54:32.283274 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 13 20:54:32.288506 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 13 20:54:32.300365 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 13 20:54:32.311484 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 13 20:54:32.314483 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 13 20:54:32.315395 systemd[1]: Reached target getty.target - Login Prompts. Jan 13 20:54:32.387913 tar[1529]: linux-amd64/LICENSE Jan 13 20:54:32.387984 tar[1529]: linux-amd64/README.md Jan 13 20:54:32.395468 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 13 20:54:33.233455 systemd-networkd[1464]: ens192: Gained IPv6LL Jan 13 20:54:33.233737 systemd-timesyncd[1431]: Network configuration changed, trying to establish connection. 
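The CRI config that containerd dumps above includes `Options:map[SystemdCgroup:true]` for the `runc` runtime and `SandboxImage:registry.k8s.io/pause:3.8`. A sketch of the `config.toml` stanza (containerd 1.7 layout) that yields those values; the values mirror the log, but the file itself is not shown there, so treat this as a reconstruction:

```
# Sketch of the containerd CRI stanza behind the dumped config above.
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.8"
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
    runtime_type = "io.containerd.runc.v2"
    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
      SystemdCgroup = true
```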
Jan 13 20:54:33.234577 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 13 20:54:33.235366 systemd[1]: Reached target network-online.target - Network is Online. Jan 13 20:54:33.241541 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Jan 13 20:54:33.247526 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:54:33.249449 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 13 20:54:33.282964 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 13 20:54:33.323253 systemd[1]: coreos-metadata.service: Deactivated successfully. Jan 13 20:54:33.323519 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Jan 13 20:54:33.324614 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 13 20:54:34.868027 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:54:34.868578 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 13 20:54:34.868794 systemd[1]: Startup finished in 955ms (kernel) + 4.748s (initrd) + 5.523s (userspace) = 11.227s. Jan 13 20:54:34.871397 (kubelet)[1689]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:54:34.935545 login[1654]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 20:54:34.936899 login[1655]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 20:54:34.942669 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 13 20:54:34.949510 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 13 20:54:34.952285 systemd-logind[1519]: New session 2 of user core. Jan 13 20:54:34.956879 systemd-logind[1519]: New session 1 of user core. 
Jan 13 20:54:34.960014 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 13 20:54:34.965556 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 13 20:54:34.970780 (systemd)[1696]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 13 20:54:35.047809 systemd[1696]: Queued start job for default target default.target. Jan 13 20:54:35.056455 systemd[1696]: Created slice app.slice - User Application Slice. Jan 13 20:54:35.056479 systemd[1696]: Reached target paths.target - Paths. Jan 13 20:54:35.056492 systemd[1696]: Reached target timers.target - Timers. Jan 13 20:54:35.057350 systemd[1696]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 13 20:54:35.064889 systemd[1696]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 13 20:54:35.064927 systemd[1696]: Reached target sockets.target - Sockets. Jan 13 20:54:35.064937 systemd[1696]: Reached target basic.target - Basic System. Jan 13 20:54:35.064962 systemd[1696]: Reached target default.target - Main User Target. Jan 13 20:54:35.064980 systemd[1696]: Startup finished in 90ms. Jan 13 20:54:35.065196 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 13 20:54:35.066632 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 13 20:54:35.068134 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 13 20:54:35.828091 kubelet[1689]: E0113 20:54:35.828045 1689 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:54:35.829383 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:54:35.829488 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 13 20:54:45.916707 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 13 20:54:45.925465 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:54:46.245745 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:54:46.249446 (kubelet)[1740]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:54:46.295711 kubelet[1740]: E0113 20:54:46.295678 1740 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:54:46.298026 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:54:46.298106 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:54:56.416648 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 13 20:54:56.427447 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:54:56.656352 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:54:56.658713 (kubelet)[1756]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:54:56.680098 kubelet[1756]: E0113 20:54:56.680028 1756 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:54:56.681647 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:54:56.681777 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:56:16.247380 systemd-resolved[1410]: Clock change detected. Flushing caches. Jan 13 20:56:16.247443 systemd-timesyncd[1431]: Contacted time server 129.146.193.200:123 (2.flatcar.pool.ntp.org). Jan 13 20:56:16.247475 systemd-timesyncd[1431]: Initial clock synchronization to Mon 2025-01-13 20:56:16.247337 UTC. Jan 13 20:56:19.662268 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 13 20:56:19.671958 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:56:19.831228 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
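The kubelet crash loop recorded above has a single cause repeated each cycle: /var/lib/kubelet/config.yaml does not exist yet. On a kubeadm-managed node that file is written by `kubeadm init` or `kubeadm join`, so the exit-code-1 / scheduled-restart pattern is expected until the node joins a cluster. A minimal sketch of the check behind the failure (it only inspects the path, it does not start the kubelet; the default path is taken from the log):

```shell
# Reproduce the failure mode from the log: report whether the kubelet
# config file is readable, returning non-zero when it is missing, which
# is what makes kubelet.service exit with status=1/FAILURE above.
check_kubelet_config() {
  if [ -r "$1" ]; then
    echo "present: $1"
  else
    echo "missing: $1"
    return 1
  fi
}

# Default path from the log; "|| true" because a missing file is the
# expected state on a node that has not run kubeadm init/join yet.
check_kubelet_config "${KUBELET_CONFIG:-/var/lib/kubelet/config.yaml}" || true
```

Once kubeadm writes the file, the next scheduled restart succeeds without intervention.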
Jan 13 20:56:19.833735 (kubelet)[1771]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:56:19.861500 kubelet[1771]: E0113 20:56:19.861466 1771 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:56:19.862596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:56:19.862678 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:56:24.772817 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 13 20:56:24.774009 systemd[1]: Started sshd@0-139.178.70.103:22-147.75.109.163:51716.service - OpenSSH per-connection server daemon (147.75.109.163:51716). Jan 13 20:56:24.815369 sshd[1778]: Accepted publickey for core from 147.75.109.163 port 51716 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:56:24.816191 sshd-session[1778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:56:24.819989 systemd-logind[1519]: New session 3 of user core. Jan 13 20:56:24.829002 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 13 20:56:24.887990 systemd[1]: Started sshd@1-139.178.70.103:22-147.75.109.163:51732.service - OpenSSH per-connection server daemon (147.75.109.163:51732). Jan 13 20:56:24.920899 sshd[1783]: Accepted publickey for core from 147.75.109.163 port 51732 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:56:24.921565 sshd-session[1783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:56:24.923840 systemd-logind[1519]: New session 4 of user core. 
Jan 13 20:56:24.927902 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 13 20:56:24.976303 sshd[1785]: Connection closed by 147.75.109.163 port 51732 Jan 13 20:56:24.976719 sshd-session[1783]: pam_unix(sshd:session): session closed for user core Jan 13 20:56:24.990550 systemd[1]: sshd@1-139.178.70.103:22-147.75.109.163:51732.service: Deactivated successfully. Jan 13 20:56:24.991474 systemd[1]: session-4.scope: Deactivated successfully. Jan 13 20:56:24.991969 systemd-logind[1519]: Session 4 logged out. Waiting for processes to exit. Jan 13 20:56:24.993223 systemd[1]: Started sshd@2-139.178.70.103:22-147.75.109.163:51748.service - OpenSSH per-connection server daemon (147.75.109.163:51748). Jan 13 20:56:24.994666 systemd-logind[1519]: Removed session 4. Jan 13 20:56:25.036419 sshd[1790]: Accepted publickey for core from 147.75.109.163 port 51748 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:56:25.037291 sshd-session[1790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:56:25.039850 systemd-logind[1519]: New session 5 of user core. Jan 13 20:56:25.051058 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 13 20:56:25.097428 sshd[1792]: Connection closed by 147.75.109.163 port 51748 Jan 13 20:56:25.097730 sshd-session[1790]: pam_unix(sshd:session): session closed for user core Jan 13 20:56:25.106548 systemd[1]: sshd@2-139.178.70.103:22-147.75.109.163:51748.service: Deactivated successfully. Jan 13 20:56:25.107515 systemd[1]: session-5.scope: Deactivated successfully. Jan 13 20:56:25.108555 systemd-logind[1519]: Session 5 logged out. Waiting for processes to exit. Jan 13 20:56:25.109425 systemd[1]: Started sshd@3-139.178.70.103:22-147.75.109.163:51760.service - OpenSSH per-connection server daemon (147.75.109.163:51760). Jan 13 20:56:25.110370 systemd-logind[1519]: Removed session 5. 
Jan 13 20:56:25.153293 sshd[1797]: Accepted publickey for core from 147.75.109.163 port 51760 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:56:25.154012 sshd-session[1797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:56:25.156634 systemd-logind[1519]: New session 6 of user core. Jan 13 20:56:25.167979 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 13 20:56:25.216786 sshd[1799]: Connection closed by 147.75.109.163 port 51760 Jan 13 20:56:25.217247 sshd-session[1797]: pam_unix(sshd:session): session closed for user core Jan 13 20:56:25.230510 systemd[1]: sshd@3-139.178.70.103:22-147.75.109.163:51760.service: Deactivated successfully. Jan 13 20:56:25.231397 systemd[1]: session-6.scope: Deactivated successfully. Jan 13 20:56:25.231843 systemd-logind[1519]: Session 6 logged out. Waiting for processes to exit. Jan 13 20:56:25.237099 systemd[1]: Started sshd@4-139.178.70.103:22-147.75.109.163:51770.service - OpenSSH per-connection server daemon (147.75.109.163:51770). Jan 13 20:56:25.238980 systemd-logind[1519]: Removed session 6. Jan 13 20:56:25.272668 sshd[1804]: Accepted publickey for core from 147.75.109.163 port 51770 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:56:25.273558 sshd-session[1804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:56:25.276423 systemd-logind[1519]: New session 7 of user core. Jan 13 20:56:25.293070 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 13 20:56:25.351634 sudo[1807]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 13 20:56:25.351859 sudo[1807]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:56:25.370795 sudo[1807]: pam_unix(sudo:session): session closed for user root Jan 13 20:56:25.371620 sshd[1806]: Connection closed by 147.75.109.163 port 51770 Jan 13 20:56:25.372011 sshd-session[1804]: pam_unix(sshd:session): session closed for user core Jan 13 20:56:25.377563 systemd[1]: sshd@4-139.178.70.103:22-147.75.109.163:51770.service: Deactivated successfully. Jan 13 20:56:25.378471 systemd[1]: session-7.scope: Deactivated successfully. Jan 13 20:56:25.379419 systemd-logind[1519]: Session 7 logged out. Waiting for processes to exit. Jan 13 20:56:25.380336 systemd[1]: Started sshd@5-139.178.70.103:22-147.75.109.163:51774.service - OpenSSH per-connection server daemon (147.75.109.163:51774). Jan 13 20:56:25.382005 systemd-logind[1519]: Removed session 7. Jan 13 20:56:25.418259 sshd[1812]: Accepted publickey for core from 147.75.109.163 port 51774 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:56:25.419145 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:56:25.422366 systemd-logind[1519]: New session 8 of user core. Jan 13 20:56:25.441071 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 13 20:56:25.490204 sudo[1816]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 13 20:56:25.490413 sudo[1816]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:56:25.493028 sudo[1816]: pam_unix(sudo:session): session closed for user root Jan 13 20:56:25.496651 sudo[1815]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 13 20:56:25.496886 sudo[1815]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:56:25.512115 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 20:56:25.530744 augenrules[1838]: No rules Jan 13 20:56:25.531078 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 20:56:25.531215 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 20:56:25.531997 sudo[1815]: pam_unix(sudo:session): session closed for user root Jan 13 20:56:25.532786 sshd[1814]: Connection closed by 147.75.109.163 port 51774 Jan 13 20:56:25.533040 sshd-session[1812]: pam_unix(sshd:session): session closed for user core Jan 13 20:56:25.538261 systemd[1]: sshd@5-139.178.70.103:22-147.75.109.163:51774.service: Deactivated successfully. Jan 13 20:56:25.539169 systemd[1]: session-8.scope: Deactivated successfully. Jan 13 20:56:25.539917 systemd-logind[1519]: Session 8 logged out. Waiting for processes to exit. Jan 13 20:56:25.544105 systemd[1]: Started sshd@6-139.178.70.103:22-147.75.109.163:51790.service - OpenSSH per-connection server daemon (147.75.109.163:51790). Jan 13 20:56:25.545052 systemd-logind[1519]: Removed session 8. 
Jan 13 20:56:25.575630 sshd[1846]: Accepted publickey for core from 147.75.109.163 port 51790 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:56:25.576472 sshd-session[1846]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:56:25.579324 systemd-logind[1519]: New session 9 of user core. Jan 13 20:56:25.588952 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 13 20:56:25.638093 sudo[1849]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 13 20:56:25.638305 sudo[1849]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 20:56:25.903102 (dockerd)[1867]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 13 20:56:25.903332 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 13 20:56:26.148608 dockerd[1867]: time="2025-01-13T20:56:26.148575697Z" level=info msg="Starting up" Jan 13 20:56:26.216488 dockerd[1867]: time="2025-01-13T20:56:26.216362199Z" level=info msg="Loading containers: start." Jan 13 20:56:26.312846 kernel: Initializing XFRM netlink socket Jan 13 20:56:26.359552 systemd-networkd[1464]: docker0: Link UP Jan 13 20:56:26.380837 dockerd[1867]: time="2025-01-13T20:56:26.380806548Z" level=info msg="Loading containers: done." Jan 13 20:56:26.391816 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3350418956-merged.mount: Deactivated successfully. 
Jan 13 20:56:26.392873 dockerd[1867]: time="2025-01-13T20:56:26.392741974Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 13 20:56:26.392873 dockerd[1867]: time="2025-01-13T20:56:26.392810297Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1 Jan 13 20:56:26.392959 dockerd[1867]: time="2025-01-13T20:56:26.392909235Z" level=info msg="Daemon has completed initialization" Jan 13 20:56:26.407682 dockerd[1867]: time="2025-01-13T20:56:26.407418132Z" level=info msg="API listen on /run/docker.sock" Jan 13 20:56:26.407512 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 13 20:56:27.288761 containerd[1537]: time="2025-01-13T20:56:27.288677875Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.4\"" Jan 13 20:56:27.870000 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3932508105.mount: Deactivated successfully. 
Jan 13 20:56:28.915374 containerd[1537]: time="2025-01-13T20:56:28.915350741Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:28.916404 containerd[1537]: time="2025-01-13T20:56:28.916363824Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.4: active requests=0, bytes read=27975483" Jan 13 20:56:28.917489 containerd[1537]: time="2025-01-13T20:56:28.916734310Z" level=info msg="ImageCreate event name:\"sha256:bdc2eadbf366279693097982a31da61cc2f1d90f07ada3f4b3b91251a18f665e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:28.918043 containerd[1537]: time="2025-01-13T20:56:28.918031910Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:ace6a943b058439bd6daeb74f152e7c36e6fc0b5e481cdff9364cd6ca0473e5e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:28.918672 containerd[1537]: time="2025-01-13T20:56:28.918655225Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.4\" with image id \"sha256:bdc2eadbf366279693097982a31da61cc2f1d90f07ada3f4b3b91251a18f665e\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:ace6a943b058439bd6daeb74f152e7c36e6fc0b5e481cdff9364cd6ca0473e5e\", size \"27972283\" in 1.629956467s" Jan 13 20:56:28.918704 containerd[1537]: time="2025-01-13T20:56:28.918675129Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.4\" returns image reference \"sha256:bdc2eadbf366279693097982a31da61cc2f1d90f07ada3f4b3b91251a18f665e\"" Jan 13 20:56:28.920151 containerd[1537]: time="2025-01-13T20:56:28.920136499Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.4\"" Jan 13 20:56:29.390941 update_engine[1520]: I20250113 20:56:29.390848 1520 update_attempter.cc:509] Updating boot flags... 
Jan 13 20:56:29.416844 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2120) Jan 13 20:56:29.451514 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2116) Jan 13 20:56:29.912233 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 13 20:56:29.920013 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:56:30.396817 containerd[1537]: time="2025-01-13T20:56:30.396784963Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:30.404452 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:56:30.405951 containerd[1537]: time="2025-01-13T20:56:30.405920128Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.4: active requests=0, bytes read=24702157" Jan 13 20:56:30.407652 (kubelet)[2141]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:56:30.412064 containerd[1537]: time="2025-01-13T20:56:30.412041791Z" level=info msg="ImageCreate event name:\"sha256:359b9f2307326a4c66172318ca63ee9792c3146ca57d53329239bd123ea70079\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:30.421503 containerd[1537]: time="2025-01-13T20:56:30.421350696Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:4bd1d4a449e7a1a4f375bd7c71abf48a95f8949b38f725ded255077329f21f7b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:30.422214 containerd[1537]: time="2025-01-13T20:56:30.421965958Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.4\" with image id \"sha256:359b9f2307326a4c66172318ca63ee9792c3146ca57d53329239bd123ea70079\", repo tag 
\"registry.k8s.io/kube-controller-manager:v1.31.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:4bd1d4a449e7a1a4f375bd7c71abf48a95f8949b38f725ded255077329f21f7b\", size \"26147269\" in 1.501812337s" Jan 13 20:56:30.422214 containerd[1537]: time="2025-01-13T20:56:30.421982623Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.4\" returns image reference \"sha256:359b9f2307326a4c66172318ca63ee9792c3146ca57d53329239bd123ea70079\"" Jan 13 20:56:30.422505 containerd[1537]: time="2025-01-13T20:56:30.422475725Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.4\"" Jan 13 20:56:30.434294 kubelet[2141]: E0113 20:56:30.434274 2141 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:56:30.435837 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:56:30.435971 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
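The recurring `(kubelet)[…]: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS` notices mean the unit's ExecStart expands those variables before any EnvironmentFile has defined them; the expansion to an empty string is harmless. A kubeadm-style drop-in of the kind that would define them looks roughly like the fragment below (path and contents are a typical illustration, not read from this host):

```ini
# Hypothetical kubelet.service drop-in. The "-" prefix on EnvironmentFile
# tolerates a missing file, which is why the variables expand empty in the
# log until kubeadm writes /var/lib/kubelet/kubeadm-flags.env.
[Service]
EnvironmentFile=-/var/lib/kubelet/kubeadm-flags.env
EnvironmentFile=-/etc/default/kubelet
ExecStart=
ExecStart=/usr/bin/kubelet $KUBELET_KUBEADM_ARGS $KUBELET_EXTRA_ARGS
```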
Jan 13 20:56:31.656851 containerd[1537]: time="2025-01-13T20:56:31.656418384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:31.657271 containerd[1537]: time="2025-01-13T20:56:31.657231266Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.4: active requests=0, bytes read=18652067" Jan 13 20:56:31.657639 containerd[1537]: time="2025-01-13T20:56:31.657621157Z" level=info msg="ImageCreate event name:\"sha256:3a66234066fe10fa299c0a52265f90a107450f0372652867118cd9007940d674\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:31.658988 containerd[1537]: time="2025-01-13T20:56:31.658962207Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:1a3081cb7d21763d22eb2c0781cc462d89f501ed523ad558dea1226f128fbfdd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:31.659637 containerd[1537]: time="2025-01-13T20:56:31.659549396Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.4\" with image id \"sha256:3a66234066fe10fa299c0a52265f90a107450f0372652867118cd9007940d674\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:1a3081cb7d21763d22eb2c0781cc462d89f501ed523ad558dea1226f128fbfdd\", size \"20097197\" in 1.237041435s" Jan 13 20:56:31.659637 containerd[1537]: time="2025-01-13T20:56:31.659566953Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.4\" returns image reference \"sha256:3a66234066fe10fa299c0a52265f90a107450f0372652867118cd9007940d674\"" Jan 13 20:56:31.659802 containerd[1537]: time="2025-01-13T20:56:31.659789518Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.4\"" Jan 13 20:56:33.773648 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1295502509.mount: Deactivated successfully. 
Jan 13 20:56:34.158985 containerd[1537]: time="2025-01-13T20:56:34.158907150Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:34.172375 containerd[1537]: time="2025-01-13T20:56:34.172326662Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.4: active requests=0, bytes read=30230243" Jan 13 20:56:34.179649 containerd[1537]: time="2025-01-13T20:56:34.179608226Z" level=info msg="ImageCreate event name:\"sha256:ebf80573666f86f115452db568feb34f6f771c3bdc7bfed14b9577f992cfa300\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:34.189562 containerd[1537]: time="2025-01-13T20:56:34.189485351Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:1739b3febca392035bf6edfe31efdfa55226be7b57389b2001ae357f7dcb99cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:34.190527 containerd[1537]: time="2025-01-13T20:56:34.190324655Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.4\" with image id \"sha256:ebf80573666f86f115452db568feb34f6f771c3bdc7bfed14b9577f992cfa300\", repo tag \"registry.k8s.io/kube-proxy:v1.31.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:1739b3febca392035bf6edfe31efdfa55226be7b57389b2001ae357f7dcb99cf\", size \"30229262\" in 2.530516899s" Jan 13 20:56:34.190527 containerd[1537]: time="2025-01-13T20:56:34.190369370Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.4\" returns image reference \"sha256:ebf80573666f86f115452db568feb34f6f771c3bdc7bfed14b9577f992cfa300\"" Jan 13 20:56:34.190953 containerd[1537]: time="2025-01-13T20:56:34.190934160Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Jan 13 20:56:35.616756 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2350833598.mount: Deactivated successfully. 
Jan 13 20:56:36.755683 containerd[1537]: time="2025-01-13T20:56:36.755543491Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:36.756230 containerd[1537]: time="2025-01-13T20:56:36.756199964Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" Jan 13 20:56:36.756844 containerd[1537]: time="2025-01-13T20:56:36.756347445Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:36.758224 containerd[1537]: time="2025-01-13T20:56:36.758168302Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:36.760074 containerd[1537]: time="2025-01-13T20:56:36.760012502Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.569055887s" Jan 13 20:56:36.760074 containerd[1537]: time="2025-01-13T20:56:36.760039362Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Jan 13 20:56:36.760441 containerd[1537]: time="2025-01-13T20:56:36.760420844Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 13 20:56:37.527876 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1257318414.mount: Deactivated successfully. 
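The `Pulled image … in <duration>` entries above embed per-image pull timings. A quick way to tabulate them from saved journal text is a sketch like the following; the field layout is assumed from the lines in this log (containerd escapes quotes inside `msg="…"` as `\"`), not from any containerd format guarantee.

```shell
# Print "<image-ref> <duration>" for each containerd line of the form:
#   msg="Pulled image \"<ref>\" with image id ... size \"...\" in <dur>"
# The sed pattern keys on the escaped \" quoting seen in the journal text.
pull_times() {
  sed -n 's/.*Pulled image \\"\([^\\]*\)\\".* in \([0-9.]*m\{0,1\}s\)".*/\1 \2/p' "$1"
}
```

For example, run against this journal it would emit lines such as `registry.k8s.io/pause:3.10 837.493569ms`, one per completed pull.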
Jan 13 20:56:37.580629 containerd[1537]: time="2025-01-13T20:56:37.580591535Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:37.589265 containerd[1537]: time="2025-01-13T20:56:37.589228622Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Jan 13 20:56:37.592340 containerd[1537]: time="2025-01-13T20:56:37.592313595Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:37.597529 containerd[1537]: time="2025-01-13T20:56:37.597493712Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:37.598210 containerd[1537]: time="2025-01-13T20:56:37.597935991Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 837.493569ms" Jan 13 20:56:37.598210 containerd[1537]: time="2025-01-13T20:56:37.597956499Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 13 20:56:37.598309 containerd[1537]: time="2025-01-13T20:56:37.598260119Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Jan 13 20:56:38.347832 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3665819249.mount: Deactivated successfully. Jan 13 20:56:40.662217 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. 
Jan 13 20:56:40.670892 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:56:41.982718 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:56:41.986805 (kubelet)[2268]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:56:42.173054 kubelet[2268]: E0113 20:56:42.173011 2268 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:56:42.174894 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:56:42.175005 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:56:42.656792 containerd[1537]: time="2025-01-13T20:56:42.656071263Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:42.660652 containerd[1537]: time="2025-01-13T20:56:42.660608061Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56779973" Jan 13 20:56:42.664099 containerd[1537]: time="2025-01-13T20:56:42.664068372Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:42.666138 containerd[1537]: time="2025-01-13T20:56:42.666111379Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:56:42.668804 containerd[1537]: time="2025-01-13T20:56:42.668025952Z" level=info msg="Pulled image 
\"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 5.069745282s" Jan 13 20:56:42.668804 containerd[1537]: time="2025-01-13T20:56:42.668052404Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Jan 13 20:56:44.626318 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:56:44.636096 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:56:44.661599 systemd[1]: Reloading requested from client PID 2306 ('systemctl') (unit session-9.scope)... Jan 13 20:56:44.661620 systemd[1]: Reloading... Jan 13 20:56:44.726985 zram_generator::config[2343]: No configuration found. Jan 13 20:56:44.796260 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:56:44.813648 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:56:44.860982 systemd[1]: Reloading finished in 199 ms. Jan 13 20:56:44.887552 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 13 20:56:44.887594 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 13 20:56:44.887722 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:56:44.891018 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:56:45.325407 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:56:45.329439 (kubelet)[2411]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 20:56:45.356441 kubelet[2411]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:56:45.356441 kubelet[2411]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 20:56:45.356441 kubelet[2411]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:56:45.363670 kubelet[2411]: I0113 20:56:45.363624 2411 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 20:56:45.722856 kubelet[2411]: I0113 20:56:45.722773 2411 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Jan 13 20:56:45.722856 kubelet[2411]: I0113 20:56:45.722804 2411 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 20:56:45.723009 kubelet[2411]: I0113 20:56:45.722994 2411 server.go:929] "Client rotation is on, will bootstrap in background" Jan 13 20:56:45.752527 kubelet[2411]: I0113 20:56:45.752186 2411 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:56:45.753890 kubelet[2411]: E0113 20:56:45.753860 2411 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://139.178.70.103:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:45.768128 kubelet[2411]: E0113 20:56:45.768095 2411 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Jan 13 20:56:45.768128 kubelet[2411]: I0113 20:56:45.768126 2411 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Jan 13 20:56:45.771464 kubelet[2411]: I0113 20:56:45.771437 2411 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 13 20:56:45.772474 kubelet[2411]: I0113 20:56:45.772453 2411 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 13 20:56:45.772601 kubelet[2411]: I0113 20:56:45.772572 2411 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 20:56:45.772736 kubelet[2411]: I0113 20:56:45.772599 2411 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 13 20:56:45.772867 kubelet[2411]: I0113 20:56:45.772739 2411 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 20:56:45.772867 kubelet[2411]: I0113 20:56:45.772748 2411 container_manager_linux.go:300] "Creating device plugin manager" Jan 13 20:56:45.772867 kubelet[2411]: I0113 20:56:45.772852 2411 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:56:45.774373 kubelet[2411]: I0113 20:56:45.774355 2411 kubelet.go:408] "Attempting 
to sync node with API server" Jan 13 20:56:45.774373 kubelet[2411]: I0113 20:56:45.774372 2411 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 20:56:45.774450 kubelet[2411]: I0113 20:56:45.774397 2411 kubelet.go:314] "Adding apiserver pod source" Jan 13 20:56:45.774450 kubelet[2411]: I0113 20:56:45.774412 2411 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 20:56:45.781757 kubelet[2411]: W0113 20:56:45.781486 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:45.781757 kubelet[2411]: E0113 20:56:45.781559 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:45.781757 kubelet[2411]: I0113 20:56:45.781645 2411 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 20:56:45.783557 kubelet[2411]: I0113 20:56:45.783465 2411 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 20:56:45.784387 kubelet[2411]: W0113 20:56:45.784173 2411 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Jan 13 20:56:45.785926 kubelet[2411]: I0113 20:56:45.785770 2411 server.go:1269] "Started kubelet" Jan 13 20:56:45.786917 kubelet[2411]: W0113 20:56:45.786441 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:45.786917 kubelet[2411]: E0113 20:56:45.786482 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:45.786917 kubelet[2411]: I0113 20:56:45.786513 2411 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 20:56:45.787672 kubelet[2411]: I0113 20:56:45.787347 2411 server.go:460] "Adding debug handlers to kubelet server" Jan 13 20:56:45.788942 kubelet[2411]: I0113 20:56:45.788927 2411 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 20:56:45.790879 kubelet[2411]: I0113 20:56:45.790639 2411 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 20:56:45.790879 kubelet[2411]: I0113 20:56:45.790790 2411 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 20:56:45.797177 kubelet[2411]: E0113 20:56:45.794444 2411 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.103:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.103:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181a5c0a571dea62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 20:56:45.78575421 +0000 UTC m=+0.453629168,LastTimestamp:2025-01-13 20:56:45.78575421 +0000 UTC m=+0.453629168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 20:56:45.800848 kubelet[2411]: I0113 20:56:45.797335 2411 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 13 20:56:45.802495 kubelet[2411]: I0113 20:56:45.802132 2411 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 13 20:56:45.806717 kubelet[2411]: E0113 20:56:45.806691 2411 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:56:45.807039 kubelet[2411]: E0113 20:56:45.807021 2411 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="200ms" Jan 13 20:56:45.813483 kubelet[2411]: I0113 20:56:45.813454 2411 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 13 20:56:45.813772 kubelet[2411]: I0113 20:56:45.813629 2411 reconciler.go:26] "Reconciler: start to sync state" Jan 13 20:56:45.814122 kubelet[2411]: W0113 20:56:45.814087 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:45.814219 kubelet[2411]: E0113 20:56:45.814207 2411 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:45.814512 kubelet[2411]: I0113 20:56:45.814501 2411 factory.go:221] Registration of the systemd container factory successfully Jan 13 20:56:45.814757 kubelet[2411]: I0113 20:56:45.814745 2411 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 20:56:45.815846 kubelet[2411]: E0113 20:56:45.815830 2411 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 20:56:45.816141 kubelet[2411]: I0113 20:56:45.816131 2411 factory.go:221] Registration of the containerd container factory successfully Jan 13 20:56:45.856078 kubelet[2411]: I0113 20:56:45.856062 2411 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 20:56:45.856078 kubelet[2411]: I0113 20:56:45.856072 2411 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 20:56:45.856078 kubelet[2411]: I0113 20:56:45.856082 2411 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:56:45.856881 kubelet[2411]: I0113 20:56:45.856862 2411 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 20:56:45.857864 kubelet[2411]: I0113 20:56:45.857683 2411 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 13 20:56:45.857864 kubelet[2411]: I0113 20:56:45.857705 2411 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 20:56:45.857864 kubelet[2411]: I0113 20:56:45.857718 2411 kubelet.go:2321] "Starting kubelet main sync loop" Jan 13 20:56:45.857864 kubelet[2411]: E0113 20:56:45.857739 2411 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 20:56:45.861810 kubelet[2411]: W0113 20:56:45.861782 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:45.861922 kubelet[2411]: E0113 20:56:45.861818 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:45.896189 kubelet[2411]: I0113 20:56:45.896172 2411 policy_none.go:49] "None policy: Start" Jan 13 20:56:45.896894 kubelet[2411]: I0113 20:56:45.896881 2411 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 20:56:45.896945 kubelet[2411]: I0113 20:56:45.896897 2411 state_mem.go:35] "Initializing new in-memory state store" Jan 13 20:56:45.907035 kubelet[2411]: E0113 20:56:45.907011 2411 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:56:45.958407 kubelet[2411]: E0113 20:56:45.958359 2411 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 13 20:56:46.008053 kubelet[2411]: E0113 20:56:46.007976 2411 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="400ms" Jan 13 20:56:46.008053 kubelet[2411]: E0113 20:56:46.008012 2411 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:56:46.033295 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 13 20:56:46.042283 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 13 20:56:46.044325 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 13 20:56:46.059702 kubelet[2411]: I0113 20:56:46.059594 2411 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 20:56:46.059888 kubelet[2411]: I0113 20:56:46.059833 2411 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 13 20:56:46.059888 kubelet[2411]: I0113 20:56:46.059842 2411 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 13 20:56:46.060111 kubelet[2411]: I0113 20:56:46.060053 2411 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 20:56:46.060963 kubelet[2411]: E0113 20:56:46.060922 2411 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jan 13 20:56:46.161782 kubelet[2411]: I0113 20:56:46.161729 2411 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 13 20:56:46.162144 kubelet[2411]: E0113 20:56:46.161962 2411 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Jan 13 
20:56:46.166252 systemd[1]: Created slice kubepods-burstable-pod0e8bc992702ca2e3191346c577e08654.slice - libcontainer container kubepods-burstable-pod0e8bc992702ca2e3191346c577e08654.slice. Jan 13 20:56:46.179512 systemd[1]: Created slice kubepods-burstable-pod50a9ae38ddb3bec3278d8dc73a6a7009.slice - libcontainer container kubepods-burstable-pod50a9ae38ddb3bec3278d8dc73a6a7009.slice. Jan 13 20:56:46.188030 systemd[1]: Created slice kubepods-burstable-poda52b86ce975f496e6002ba953fa9b888.slice - libcontainer container kubepods-burstable-poda52b86ce975f496e6002ba953fa9b888.slice. Jan 13 20:56:46.216960 kubelet[2411]: I0113 20:56:46.216714 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:46.216960 kubelet[2411]: I0113 20:56:46.216783 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:46.216960 kubelet[2411]: I0113 20:56:46.216796 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0e8bc992702ca2e3191346c577e08654-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0e8bc992702ca2e3191346c577e08654\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:56:46.216960 kubelet[2411]: I0113 20:56:46.216808 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/0e8bc992702ca2e3191346c577e08654-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0e8bc992702ca2e3191346c577e08654\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:56:46.216960 kubelet[2411]: I0113 20:56:46.216845 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:46.217211 kubelet[2411]: I0113 20:56:46.216859 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:46.217211 kubelet[2411]: I0113 20:56:46.216869 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a52b86ce975f496e6002ba953fa9b888-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a52b86ce975f496e6002ba953fa9b888\") " pod="kube-system/kube-scheduler-localhost" Jan 13 20:56:46.217211 kubelet[2411]: I0113 20:56:46.216881 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0e8bc992702ca2e3191346c577e08654-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0e8bc992702ca2e3191346c577e08654\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:56:46.217211 kubelet[2411]: I0113 20:56:46.216911 2411 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:46.363502 kubelet[2411]: I0113 20:56:46.363430 2411 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 13 20:56:46.363894 kubelet[2411]: E0113 20:56:46.363873 2411 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Jan 13 20:56:46.409137 kubelet[2411]: E0113 20:56:46.409108 2411 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="800ms" Jan 13 20:56:46.478541 containerd[1537]: time="2025-01-13T20:56:46.478471623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0e8bc992702ca2e3191346c577e08654,Namespace:kube-system,Attempt:0,}" Jan 13 20:56:46.487313 containerd[1537]: time="2025-01-13T20:56:46.487284236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:50a9ae38ddb3bec3278d8dc73a6a7009,Namespace:kube-system,Attempt:0,}" Jan 13 20:56:46.498988 containerd[1537]: time="2025-01-13T20:56:46.498966763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a52b86ce975f496e6002ba953fa9b888,Namespace:kube-system,Attempt:0,}" Jan 13 20:56:46.685414 kubelet[2411]: W0113 20:56:46.685355 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 
20:56:46.685414 kubelet[2411]: E0113 20:56:46.685393 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:46.764925 kubelet[2411]: I0113 20:56:46.764897 2411 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 13 20:56:46.765199 kubelet[2411]: E0113 20:56:46.765170 2411 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Jan 13 20:56:46.896353 kubelet[2411]: W0113 20:56:46.896288 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:46.896353 kubelet[2411]: E0113 20:56:46.896342 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:47.147660 kubelet[2411]: W0113 20:56:47.147575 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:47.147660 kubelet[2411]: E0113 20:56:47.147630 2411 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:47.210102 kubelet[2411]: E0113 20:56:47.210068 2411 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="1.6s" Jan 13 20:56:47.269797 kubelet[2411]: W0113 20:56:47.269755 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:47.269797 kubelet[2411]: E0113 20:56:47.269800 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:47.315445 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1414208421.mount: Deactivated successfully. 
Jan 13 20:56:47.318113 containerd[1537]: time="2025-01-13T20:56:47.318091944Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:56:47.319105 containerd[1537]: time="2025-01-13T20:56:47.318979997Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:56:47.319105 containerd[1537]: time="2025-01-13T20:56:47.319047562Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:56:47.319906 containerd[1537]: time="2025-01-13T20:56:47.319879437Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Jan 13 20:56:47.320697 containerd[1537]: time="2025-01-13T20:56:47.320093993Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:56:47.320697 containerd[1537]: time="2025-01-13T20:56:47.320663092Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:56:47.322645 containerd[1537]: time="2025-01-13T20:56:47.322622606Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:56:47.323761 containerd[1537]: time="2025-01-13T20:56:47.323737255Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest 
\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 824.727625ms" Jan 13 20:56:47.324383 containerd[1537]: time="2025-01-13T20:56:47.324364347Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 825.506864ms" Jan 13 20:56:47.324867 containerd[1537]: time="2025-01-13T20:56:47.324851564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:56:47.325984 containerd[1537]: time="2025-01-13T20:56:47.325966985Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 827.247262ms" Jan 13 20:56:47.566191 kubelet[2411]: I0113 20:56:47.566172 2411 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 13 20:56:47.566406 kubelet[2411]: E0113 20:56:47.566362 2411 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Jan 13 20:56:47.731174 containerd[1537]: time="2025-01-13T20:56:47.730967935Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:56:47.731831 containerd[1537]: time="2025-01-13T20:56:47.731471051Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:56:47.731962 containerd[1537]: time="2025-01-13T20:56:47.731883160Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:56:47.731962 containerd[1537]: time="2025-01-13T20:56:47.731930668Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:56:47.732988 containerd[1537]: time="2025-01-13T20:56:47.732956882Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:56:47.733125 containerd[1537]: time="2025-01-13T20:56:47.733041200Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:56:47.733125 containerd[1537]: time="2025-01-13T20:56:47.733054176Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:56:47.733125 containerd[1537]: time="2025-01-13T20:56:47.733096497Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:56:47.737228 containerd[1537]: time="2025-01-13T20:56:47.729706810Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:56:47.737960 containerd[1537]: time="2025-01-13T20:56:47.737898554Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:56:47.737960 containerd[1537]: time="2025-01-13T20:56:47.737911328Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:56:47.738120 containerd[1537]: time="2025-01-13T20:56:47.738070520Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:56:47.762015 systemd[1]: Started cri-containerd-666b331e08136ba7389dba1b982525ca57408cf48d4c617f10990ea1752cabd6.scope - libcontainer container 666b331e08136ba7389dba1b982525ca57408cf48d4c617f10990ea1752cabd6. Jan 13 20:56:47.766104 kubelet[2411]: E0113 20:56:47.765986 2411 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.103:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:47.767720 systemd[1]: Started cri-containerd-3040c12019600df9866725a5ebefe7e65b2e6db117cd48b63630df6eb52121e3.scope - libcontainer container 3040c12019600df9866725a5ebefe7e65b2e6db117cd48b63630df6eb52121e3. Jan 13 20:56:47.769383 systemd[1]: Started cri-containerd-c62cac835c5238fabf3478ec37683a27ef6cdd3a4ac4db5c0baac085230c68ea.scope - libcontainer container c62cac835c5238fabf3478ec37683a27ef6cdd3a4ac4db5c0baac085230c68ea. 
Jan 13 20:56:47.798973 containerd[1537]: time="2025-01-13T20:56:47.798946588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0e8bc992702ca2e3191346c577e08654,Namespace:kube-system,Attempt:0,} returns sandbox id \"666b331e08136ba7389dba1b982525ca57408cf48d4c617f10990ea1752cabd6\"" Jan 13 20:56:47.800916 containerd[1537]: time="2025-01-13T20:56:47.800898089Z" level=info msg="CreateContainer within sandbox \"666b331e08136ba7389dba1b982525ca57408cf48d4c617f10990ea1752cabd6\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 13 20:56:47.810258 containerd[1537]: time="2025-01-13T20:56:47.810209171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:50a9ae38ddb3bec3278d8dc73a6a7009,Namespace:kube-system,Attempt:0,} returns sandbox id \"3040c12019600df9866725a5ebefe7e65b2e6db117cd48b63630df6eb52121e3\"" Jan 13 20:56:47.812079 containerd[1537]: time="2025-01-13T20:56:47.811973293Z" level=info msg="CreateContainer within sandbox \"3040c12019600df9866725a5ebefe7e65b2e6db117cd48b63630df6eb52121e3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 13 20:56:47.819291 containerd[1537]: time="2025-01-13T20:56:47.819192282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a52b86ce975f496e6002ba953fa9b888,Namespace:kube-system,Attempt:0,} returns sandbox id \"c62cac835c5238fabf3478ec37683a27ef6cdd3a4ac4db5c0baac085230c68ea\"" Jan 13 20:56:47.821227 containerd[1537]: time="2025-01-13T20:56:47.821208886Z" level=info msg="CreateContainer within sandbox \"c62cac835c5238fabf3478ec37683a27ef6cdd3a4ac4db5c0baac085230c68ea\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 13 20:56:48.022250 containerd[1537]: time="2025-01-13T20:56:48.022198044Z" level=info msg="CreateContainer within sandbox \"3040c12019600df9866725a5ebefe7e65b2e6db117cd48b63630df6eb52121e3\" for 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"c83c1a1d213e1b793a0e5f4e80852fc91232d5af7a84656b6aee3e1bcf71ac44\"" Jan 13 20:56:48.023028 containerd[1537]: time="2025-01-13T20:56:48.022753962Z" level=info msg="StartContainer for \"c83c1a1d213e1b793a0e5f4e80852fc91232d5af7a84656b6aee3e1bcf71ac44\"" Jan 13 20:56:48.046946 systemd[1]: Started cri-containerd-c83c1a1d213e1b793a0e5f4e80852fc91232d5af7a84656b6aee3e1bcf71ac44.scope - libcontainer container c83c1a1d213e1b793a0e5f4e80852fc91232d5af7a84656b6aee3e1bcf71ac44. Jan 13 20:56:48.585916 containerd[1537]: time="2025-01-13T20:56:48.585804900Z" level=info msg="StartContainer for \"c83c1a1d213e1b793a0e5f4e80852fc91232d5af7a84656b6aee3e1bcf71ac44\" returns successfully" Jan 13 20:56:48.586355 containerd[1537]: time="2025-01-13T20:56:48.586249900Z" level=info msg="CreateContainer within sandbox \"666b331e08136ba7389dba1b982525ca57408cf48d4c617f10990ea1752cabd6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"14f94434d941e5d9fd283d52c4d3c8a60178bb6bad5e1da179996c677f08f921\"" Jan 13 20:56:48.586355 containerd[1537]: time="2025-01-13T20:56:48.586328762Z" level=info msg="CreateContainer within sandbox \"c62cac835c5238fabf3478ec37683a27ef6cdd3a4ac4db5c0baac085230c68ea\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0a9f0b68bc10d72c1203955eaf7467cfec0a80af60e46399ea4820067bad779c\"" Jan 13 20:56:48.587387 containerd[1537]: time="2025-01-13T20:56:48.587297511Z" level=info msg="StartContainer for \"0a9f0b68bc10d72c1203955eaf7467cfec0a80af60e46399ea4820067bad779c\"" Jan 13 20:56:48.588377 containerd[1537]: time="2025-01-13T20:56:48.587322095Z" level=info msg="StartContainer for \"14f94434d941e5d9fd283d52c4d3c8a60178bb6bad5e1da179996c677f08f921\"" Jan 13 20:56:48.626922 systemd[1]: Started cri-containerd-0a9f0b68bc10d72c1203955eaf7467cfec0a80af60e46399ea4820067bad779c.scope - libcontainer container 
0a9f0b68bc10d72c1203955eaf7467cfec0a80af60e46399ea4820067bad779c. Jan 13 20:56:48.628340 systemd[1]: Started cri-containerd-14f94434d941e5d9fd283d52c4d3c8a60178bb6bad5e1da179996c677f08f921.scope - libcontainer container 14f94434d941e5d9fd283d52c4d3c8a60178bb6bad5e1da179996c677f08f921. Jan 13 20:56:48.679929 kubelet[2411]: W0113 20:56:48.679893 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:48.683921 kubelet[2411]: E0113 20:56:48.683821 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:48.692535 containerd[1537]: time="2025-01-13T20:56:48.692423636Z" level=info msg="StartContainer for \"0a9f0b68bc10d72c1203955eaf7467cfec0a80af60e46399ea4820067bad779c\" returns successfully" Jan 13 20:56:48.692535 containerd[1537]: time="2025-01-13T20:56:48.692423611Z" level=info msg="StartContainer for \"14f94434d941e5d9fd283d52c4d3c8a60178bb6bad5e1da179996c677f08f921\" returns successfully" Jan 13 20:56:48.810667 kubelet[2411]: E0113 20:56:48.810636 2411 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="3.2s" Jan 13 20:56:49.010326 kubelet[2411]: W0113 20:56:49.010303 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:49.010414 kubelet[2411]: E0113 20:56:49.010333 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:49.169399 kubelet[2411]: I0113 20:56:49.169377 2411 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 13 20:56:49.169715 kubelet[2411]: E0113 20:56:49.169701 2411 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Jan 13 20:56:49.306490 systemd[1]: run-containerd-runc-k8s.io-0a9f0b68bc10d72c1203955eaf7467cfec0a80af60e46399ea4820067bad779c-runc.QUSvEM.mount: Deactivated successfully. Jan 13 20:56:49.306565 systemd[1]: run-containerd-runc-k8s.io-14f94434d941e5d9fd283d52c4d3c8a60178bb6bad5e1da179996c677f08f921-runc.LTcrCz.mount: Deactivated successfully. 
Jan 13 20:56:49.823189 kubelet[2411]: W0113 20:56:49.823161 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:49.823189 kubelet[2411]: E0113 20:56:49.823192 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:49.867987 kubelet[2411]: W0113 20:56:49.867964 2411 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused Jan 13 20:56:49.868114 kubelet[2411]: E0113 20:56:49.868094 2411 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" Jan 13 20:56:50.882485 kubelet[2411]: E0113 20:56:50.882422 2411 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{localhost.181a5c0a571dea62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 20:56:45.78575421 +0000 UTC 
m=+0.453629168,LastTimestamp:2025-01-13 20:56:45.78575421 +0000 UTC m=+0.453629168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 20:56:50.936387 kubelet[2411]: E0113 20:56:50.936289 2411 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{localhost.181a5c0a58e8a29e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 20:56:45.815816862 +0000 UTC m=+0.483691818,LastTimestamp:2025-01-13 20:56:45.815816862 +0000 UTC m=+0.483691818,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 20:56:50.988752 kubelet[2411]: E0113 20:56:50.988642 2411 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{localhost.181a5c0a5b45e41f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node localhost status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 20:56:45.855482911 +0000 UTC m=+0.523357859,LastTimestamp:2025-01-13 20:56:45.855482911 +0000 UTC m=+0.523357859,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 20:56:51.131997 kubelet[2411]: E0113 20:56:51.131973 2411 csi_plugin.go:305] Failed to initialize CSINode: 
error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Jan 13 20:56:51.484465 kubelet[2411]: E0113 20:56:51.484436 2411 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Jan 13 20:56:51.925000 kubelet[2411]: E0113 20:56:51.924978 2411 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Jan 13 20:56:52.012891 kubelet[2411]: E0113 20:56:52.012867 2411 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Jan 13 20:56:52.371234 kubelet[2411]: I0113 20:56:52.370919 2411 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 13 20:56:52.374348 kubelet[2411]: I0113 20:56:52.374335 2411 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Jan 13 20:56:52.533888 systemd[1]: Reloading requested from client PID 2682 ('systemctl') (unit session-9.scope)... Jan 13 20:56:52.533900 systemd[1]: Reloading... Jan 13 20:56:52.585842 zram_generator::config[2721]: No configuration found. Jan 13 20:56:52.646551 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:56:52.661570 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:56:52.712630 systemd[1]: Reloading finished in 178 ms. Jan 13 20:56:52.736138 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:56:52.748648 systemd[1]: kubelet.service: Deactivated successfully. 
Jan 13 20:56:52.748890 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:56:52.753011 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:56:53.653638 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:56:53.656961 (kubelet)[2787]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 20:56:53.875854 kubelet[2787]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:56:53.875854 kubelet[2787]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 20:56:53.875854 kubelet[2787]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:56:53.877635 kubelet[2787]: I0113 20:56:53.877088 2787 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 20:56:53.882434 kubelet[2787]: I0113 20:56:53.882411 2787 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Jan 13 20:56:53.882434 kubelet[2787]: I0113 20:56:53.882430 2787 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 20:56:53.883191 kubelet[2787]: I0113 20:56:53.882621 2787 server.go:929] "Client rotation is on, will bootstrap in background" Jan 13 20:56:53.883536 kubelet[2787]: I0113 20:56:53.883519 2787 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 13 20:56:53.887944 kubelet[2787]: I0113 20:56:53.887857 2787 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:56:53.890723 kubelet[2787]: E0113 20:56:53.890702 2787 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Jan 13 20:56:53.890723 kubelet[2787]: I0113 20:56:53.890719 2787 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Jan 13 20:56:53.893416 kubelet[2787]: I0113 20:56:53.893400 2787 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 13 20:56:53.893535 kubelet[2787]: I0113 20:56:53.893494 2787 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 13 20:56:53.893588 kubelet[2787]: I0113 20:56:53.893558 2787 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 20:56:53.893746 kubelet[2787]: I0113 20:56:53.893589 2787 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 13 20:56:53.893800 kubelet[2787]: I0113 20:56:53.893755 2787 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 20:56:53.893800 kubelet[2787]: I0113 20:56:53.893762 2787 container_manager_linux.go:300] "Creating device plugin manager" Jan 13 20:56:53.898441 kubelet[2787]: I0113 20:56:53.898403 2787 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:56:53.901084 kubelet[2787]: I0113 20:56:53.900877 2787 kubelet.go:408] "Attempting 
to sync node with API server" Jan 13 20:56:53.901084 kubelet[2787]: I0113 20:56:53.900890 2787 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 20:56:53.901084 kubelet[2787]: I0113 20:56:53.900909 2787 kubelet.go:314] "Adding apiserver pod source" Jan 13 20:56:53.901084 kubelet[2787]: I0113 20:56:53.900918 2787 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 20:56:53.906892 kubelet[2787]: I0113 20:56:53.905913 2787 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 20:56:53.906892 kubelet[2787]: I0113 20:56:53.906154 2787 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 20:56:53.916192 kubelet[2787]: I0113 20:56:53.916178 2787 server.go:1269] "Started kubelet" Jan 13 20:56:53.918839 kubelet[2787]: I0113 20:56:53.917585 2787 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 20:56:53.919178 kubelet[2787]: I0113 20:56:53.919162 2787 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 13 20:56:53.919254 kubelet[2787]: I0113 20:56:53.919234 2787 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 20:56:53.919930 kubelet[2787]: I0113 20:56:53.919917 2787 server.go:460] "Adding debug handlers to kubelet server" Jan 13 20:56:53.921513 kubelet[2787]: I0113 20:56:53.921363 2787 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 20:56:53.921513 kubelet[2787]: I0113 20:56:53.921513 2787 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 20:56:53.926744 kubelet[2787]: I0113 20:56:53.926478 2787 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 13 20:56:53.930986 kubelet[2787]: I0113 20:56:53.930967 2787 
desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 13 20:56:53.931064 kubelet[2787]: I0113 20:56:53.931049 2787 reconciler.go:26] "Reconciler: start to sync state" Jan 13 20:56:53.933127 kubelet[2787]: I0113 20:56:53.933052 2787 factory.go:221] Registration of the systemd container factory successfully Jan 13 20:56:53.933127 kubelet[2787]: I0113 20:56:53.933068 2787 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 20:56:53.933127 kubelet[2787]: I0113 20:56:53.933111 2787 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 20:56:53.933501 kubelet[2787]: E0113 20:56:53.933482 2787 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 20:56:53.935079 kubelet[2787]: I0113 20:56:53.934898 2787 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 13 20:56:53.935079 kubelet[2787]: I0113 20:56:53.934920 2787 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 20:56:53.935079 kubelet[2787]: I0113 20:56:53.934931 2787 kubelet.go:2321] "Starting kubelet main sync loop" Jan 13 20:56:53.935079 kubelet[2787]: E0113 20:56:53.934953 2787 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 20:56:53.939449 kubelet[2787]: I0113 20:56:53.938419 2787 factory.go:221] Registration of the containerd container factory successfully Jan 13 20:56:53.967226 kubelet[2787]: I0113 20:56:53.967205 2787 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 20:56:53.967226 kubelet[2787]: I0113 20:56:53.967218 2787 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 20:56:53.967226 kubelet[2787]: I0113 20:56:53.967230 2787 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:56:53.967352 kubelet[2787]: I0113 20:56:53.967341 2787 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 13 20:56:53.967376 kubelet[2787]: I0113 20:56:53.967350 2787 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 13 20:56:53.967376 kubelet[2787]: I0113 20:56:53.967362 2787 policy_none.go:49] "None policy: Start" Jan 13 20:56:53.967770 kubelet[2787]: I0113 20:56:53.967755 2787 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 20:56:53.967770 kubelet[2787]: I0113 20:56:53.967769 2787 state_mem.go:35] "Initializing new in-memory state store" Jan 13 20:56:53.967890 kubelet[2787]: I0113 20:56:53.967876 2787 state_mem.go:75] "Updated machine memory state" Jan 13 20:56:53.970677 kubelet[2787]: I0113 20:56:53.970312 2787 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 20:56:53.970677 kubelet[2787]: I0113 20:56:53.970408 2787 eviction_manager.go:189] 
"Eviction manager: starting control loop" Jan 13 20:56:53.970677 kubelet[2787]: I0113 20:56:53.970414 2787 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 13 20:56:53.970764 kubelet[2787]: I0113 20:56:53.970682 2787 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 20:56:54.075419 kubelet[2787]: I0113 20:56:54.075402 2787 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 13 20:56:54.080230 kubelet[2787]: I0113 20:56:54.079656 2787 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Jan 13 20:56:54.080230 kubelet[2787]: I0113 20:56:54.079713 2787 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Jan 13 20:56:54.132366 kubelet[2787]: I0113 20:56:54.132199 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:54.132366 kubelet[2787]: I0113 20:56:54.132219 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:54.132366 kubelet[2787]: I0113 20:56:54.132270 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 
13 20:56:54.132366 kubelet[2787]: I0113 20:56:54.132283 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a52b86ce975f496e6002ba953fa9b888-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a52b86ce975f496e6002ba953fa9b888\") " pod="kube-system/kube-scheduler-localhost" Jan 13 20:56:54.132366 kubelet[2787]: I0113 20:56:54.132292 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0e8bc992702ca2e3191346c577e08654-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0e8bc992702ca2e3191346c577e08654\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:56:54.132514 kubelet[2787]: I0113 20:56:54.132301 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:54.132514 kubelet[2787]: I0113 20:56:54.132315 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0e8bc992702ca2e3191346c577e08654-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0e8bc992702ca2e3191346c577e08654\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:56:54.132514 kubelet[2787]: I0113 20:56:54.132331 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0e8bc992702ca2e3191346c577e08654-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0e8bc992702ca2e3191346c577e08654\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:56:54.132514 kubelet[2787]: I0113 
20:56:54.132340 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/50a9ae38ddb3bec3278d8dc73a6a7009-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"50a9ae38ddb3bec3278d8dc73a6a7009\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:56:54.904334 kubelet[2787]: I0113 20:56:54.904308 2787 apiserver.go:52] "Watching apiserver" Jan 13 20:56:54.932053 kubelet[2787]: I0113 20:56:54.932013 2787 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 13 20:56:54.964981 kubelet[2787]: E0113 20:56:54.964949 2787 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Jan 13 20:56:54.974415 kubelet[2787]: I0113 20:56:54.974373 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=0.974360959 podStartE2EDuration="974.360959ms" podCreationTimestamp="2025-01-13 20:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:56:54.974225975 +0000 UTC m=+1.133563947" watchObservedRunningTime="2025-01-13 20:56:54.974360959 +0000 UTC m=+1.133698939" Jan 13 20:56:54.988180 kubelet[2787]: I0113 20:56:54.988139 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=0.988124919 podStartE2EDuration="988.124919ms" podCreationTimestamp="2025-01-13 20:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:56:54.981201058 +0000 UTC m=+1.140539037" watchObservedRunningTime="2025-01-13 20:56:54.988124919 +0000 UTC m=+1.147462889" Jan 13 20:56:54.988399 kubelet[2787]: 
I0113 20:56:54.988277 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=0.988272169 podStartE2EDuration="988.272169ms" podCreationTimestamp="2025-01-13 20:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:56:54.988114614 +0000 UTC m=+1.147452594" watchObservedRunningTime="2025-01-13 20:56:54.988272169 +0000 UTC m=+1.147610148" Jan 13 20:56:57.955534 kubelet[2787]: I0113 20:56:57.955502 2787 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 13 20:56:57.955814 kubelet[2787]: I0113 20:56:57.955778 2787 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 13 20:56:57.955870 containerd[1537]: time="2025-01-13T20:56:57.955685515Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 13 20:56:58.374710 sudo[1849]: pam_unix(sudo:session): session closed for user root Jan 13 20:56:58.375891 sshd[1848]: Connection closed by 147.75.109.163 port 51790 Jan 13 20:56:58.376920 sshd-session[1846]: pam_unix(sshd:session): session closed for user core Jan 13 20:56:58.379078 systemd-logind[1519]: Session 9 logged out. Waiting for processes to exit. Jan 13 20:56:58.380332 systemd[1]: sshd@6-139.178.70.103:22-147.75.109.163:51790.service: Deactivated successfully. Jan 13 20:56:58.381998 systemd[1]: session-9.scope: Deactivated successfully. Jan 13 20:56:58.382167 systemd[1]: session-9.scope: Consumed 2.822s CPU time, 136.3M memory peak, 0B memory swap peak. Jan 13 20:56:58.382896 systemd-logind[1519]: Removed session 9. Jan 13 20:56:58.751081 systemd[1]: Created slice kubepods-besteffort-pod7fd02daf_f577_48ee_86a6_f0b5b6ecf399.slice - libcontainer container kubepods-besteffort-pod7fd02daf_f577_48ee_86a6_f0b5b6ecf399.slice. 
Jan 13 20:56:58.760772 kubelet[2787]: I0113 20:56:58.760754 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7fd02daf-f577-48ee-86a6-f0b5b6ecf399-kube-proxy\") pod \"kube-proxy-5n4sg\" (UID: \"7fd02daf-f577-48ee-86a6-f0b5b6ecf399\") " pod="kube-system/kube-proxy-5n4sg"
Jan 13 20:56:58.760913 kubelet[2787]: I0113 20:56:58.760904 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7fd02daf-f577-48ee-86a6-f0b5b6ecf399-xtables-lock\") pod \"kube-proxy-5n4sg\" (UID: \"7fd02daf-f577-48ee-86a6-f0b5b6ecf399\") " pod="kube-system/kube-proxy-5n4sg"
Jan 13 20:56:58.760961 kubelet[2787]: I0113 20:56:58.760954 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fd02daf-f577-48ee-86a6-f0b5b6ecf399-lib-modules\") pod \"kube-proxy-5n4sg\" (UID: \"7fd02daf-f577-48ee-86a6-f0b5b6ecf399\") " pod="kube-system/kube-proxy-5n4sg"
Jan 13 20:56:58.761011 kubelet[2787]: I0113 20:56:58.761003 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jfvq\" (UniqueName: \"kubernetes.io/projected/7fd02daf-f577-48ee-86a6-f0b5b6ecf399-kube-api-access-2jfvq\") pod \"kube-proxy-5n4sg\" (UID: \"7fd02daf-f577-48ee-86a6-f0b5b6ecf399\") " pod="kube-system/kube-proxy-5n4sg"
Jan 13 20:56:59.043682 systemd[1]: Created slice kubepods-besteffort-pod369934bc_9963_4ae2_8b97_c0fb37c0181d.slice - libcontainer container kubepods-besteffort-pod369934bc_9963_4ae2_8b97_c0fb37c0181d.slice.
Jan 13 20:56:59.059075 containerd[1537]: time="2025-01-13T20:56:59.059037694Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5n4sg,Uid:7fd02daf-f577-48ee-86a6-f0b5b6ecf399,Namespace:kube-system,Attempt:0,}"
Jan 13 20:56:59.065366 kubelet[2787]: I0113 20:56:59.064411 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/369934bc-9963-4ae2-8b97-c0fb37c0181d-var-lib-calico\") pod \"tigera-operator-76c4976dd7-52jrh\" (UID: \"369934bc-9963-4ae2-8b97-c0fb37c0181d\") " pod="tigera-operator/tigera-operator-76c4976dd7-52jrh"
Jan 13 20:56:59.065366 kubelet[2787]: I0113 20:56:59.064459 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7npqz\" (UniqueName: \"kubernetes.io/projected/369934bc-9963-4ae2-8b97-c0fb37c0181d-kube-api-access-7npqz\") pod \"tigera-operator-76c4976dd7-52jrh\" (UID: \"369934bc-9963-4ae2-8b97-c0fb37c0181d\") " pod="tigera-operator/tigera-operator-76c4976dd7-52jrh"
Jan 13 20:56:59.073960 containerd[1537]: time="2025-01-13T20:56:59.073722709Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:56:59.073960 containerd[1537]: time="2025-01-13T20:56:59.073769018Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:56:59.073960 containerd[1537]: time="2025-01-13T20:56:59.073776569Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:56:59.073960 containerd[1537]: time="2025-01-13T20:56:59.073858132Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:56:59.095988 systemd[1]: Started cri-containerd-488ace05b6db477cdcd9505c6e167c3dc8c46bd560fb1d0384d9f3a8ec3dd509.scope - libcontainer container 488ace05b6db477cdcd9505c6e167c3dc8c46bd560fb1d0384d9f3a8ec3dd509.
Jan 13 20:56:59.111399 containerd[1537]: time="2025-01-13T20:56:59.111370899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5n4sg,Uid:7fd02daf-f577-48ee-86a6-f0b5b6ecf399,Namespace:kube-system,Attempt:0,} returns sandbox id \"488ace05b6db477cdcd9505c6e167c3dc8c46bd560fb1d0384d9f3a8ec3dd509\""
Jan 13 20:56:59.113549 containerd[1537]: time="2025-01-13T20:56:59.113498295Z" level=info msg="CreateContainer within sandbox \"488ace05b6db477cdcd9505c6e167c3dc8c46bd560fb1d0384d9f3a8ec3dd509\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jan 13 20:56:59.123569 containerd[1537]: time="2025-01-13T20:56:59.123535823Z" level=info msg="CreateContainer within sandbox \"488ace05b6db477cdcd9505c6e167c3dc8c46bd560fb1d0384d9f3a8ec3dd509\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"6c5726f839afcc81aa9610ec2bb2c29650bf61aee1f33440cc9f8095dacd1212\""
Jan 13 20:56:59.124970 containerd[1537]: time="2025-01-13T20:56:59.124053533Z" level=info msg="StartContainer for \"6c5726f839afcc81aa9610ec2bb2c29650bf61aee1f33440cc9f8095dacd1212\""
Jan 13 20:56:59.144967 systemd[1]: Started cri-containerd-6c5726f839afcc81aa9610ec2bb2c29650bf61aee1f33440cc9f8095dacd1212.scope - libcontainer container 6c5726f839afcc81aa9610ec2bb2c29650bf61aee1f33440cc9f8095dacd1212.
Jan 13 20:56:59.165643 containerd[1537]: time="2025-01-13T20:56:59.165570991Z" level=info msg="StartContainer for \"6c5726f839afcc81aa9610ec2bb2c29650bf61aee1f33440cc9f8095dacd1212\" returns successfully"
Jan 13 20:56:59.346892 containerd[1537]: time="2025-01-13T20:56:59.346437622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4976dd7-52jrh,Uid:369934bc-9963-4ae2-8b97-c0fb37c0181d,Namespace:tigera-operator,Attempt:0,}"
Jan 13 20:56:59.368104 containerd[1537]: time="2025-01-13T20:56:59.367774987Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:56:59.368104 containerd[1537]: time="2025-01-13T20:56:59.367880613Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:56:59.368649 containerd[1537]: time="2025-01-13T20:56:59.367955541Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:56:59.368649 containerd[1537]: time="2025-01-13T20:56:59.368606061Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:56:59.386039 systemd[1]: Started cri-containerd-4568f8f8f24a7865b37b133512136d479fab025cbb0ddc623aa93f4ca724759b.scope - libcontainer container 4568f8f8f24a7865b37b133512136d479fab025cbb0ddc623aa93f4ca724759b.
Jan 13 20:56:59.422396 containerd[1537]: time="2025-01-13T20:56:59.422318413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4976dd7-52jrh,Uid:369934bc-9963-4ae2-8b97-c0fb37c0181d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4568f8f8f24a7865b37b133512136d479fab025cbb0ddc623aa93f4ca724759b\""
Jan 13 20:56:59.428298 containerd[1537]: time="2025-01-13T20:56:59.428243429Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\""
Jan 13 20:56:59.879061 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1980793533.mount: Deactivated successfully.
Jan 13 20:56:59.991481 kubelet[2787]: I0113 20:56:59.991267 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5n4sg" podStartSLOduration=1.99125426 podStartE2EDuration="1.99125426s" podCreationTimestamp="2025-01-13 20:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:56:59.991067621 +0000 UTC m=+6.150405604" watchObservedRunningTime="2025-01-13 20:56:59.99125426 +0000 UTC m=+6.150592249"
Jan 13 20:57:01.700351 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3422242969.mount: Deactivated successfully.
Jan 13 20:57:02.130133 containerd[1537]: time="2025-01-13T20:57:02.130050770Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:57:02.131098 containerd[1537]: time="2025-01-13T20:57:02.130659693Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21764349"
Jan 13 20:57:02.131098 containerd[1537]: time="2025-01-13T20:57:02.131069558Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:57:02.133519 containerd[1537]: time="2025-01-13T20:57:02.133431696Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 2.705155404s"
Jan 13 20:57:02.133519 containerd[1537]: time="2025-01-13T20:57:02.133459271Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\""
Jan 13 20:57:02.133946 containerd[1537]: time="2025-01-13T20:57:02.133929872Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:57:02.139612 containerd[1537]: time="2025-01-13T20:57:02.139486191Z" level=info msg="CreateContainer within sandbox \"4568f8f8f24a7865b37b133512136d479fab025cbb0ddc623aa93f4ca724759b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Jan 13 20:57:02.146550 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount84769826.mount: Deactivated successfully.
Jan 13 20:57:02.152223 containerd[1537]: time="2025-01-13T20:57:02.152188168Z" level=info msg="CreateContainer within sandbox \"4568f8f8f24a7865b37b133512136d479fab025cbb0ddc623aa93f4ca724759b\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"deb83cd160ad4a969e430996d1a885c0d68a1f97cfad6e8c2d83c08fae82a163\""
Jan 13 20:57:02.152981 containerd[1537]: time="2025-01-13T20:57:02.152961697Z" level=info msg="StartContainer for \"deb83cd160ad4a969e430996d1a885c0d68a1f97cfad6e8c2d83c08fae82a163\""
Jan 13 20:57:02.178017 systemd[1]: Started cri-containerd-deb83cd160ad4a969e430996d1a885c0d68a1f97cfad6e8c2d83c08fae82a163.scope - libcontainer container deb83cd160ad4a969e430996d1a885c0d68a1f97cfad6e8c2d83c08fae82a163.
Jan 13 20:57:02.198466 containerd[1537]: time="2025-01-13T20:57:02.198431568Z" level=info msg="StartContainer for \"deb83cd160ad4a969e430996d1a885c0d68a1f97cfad6e8c2d83c08fae82a163\" returns successfully"
Jan 13 20:57:04.122294 kubelet[2787]: I0113 20:57:04.122122 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-76c4976dd7-52jrh" podStartSLOduration=2.411214398 podStartE2EDuration="5.12204866s" podCreationTimestamp="2025-01-13 20:56:59 +0000 UTC" firstStartedPulling="2025-01-13 20:56:59.427660608 +0000 UTC m=+5.586998582" lastFinishedPulling="2025-01-13 20:57:02.13849487 +0000 UTC m=+8.297832844" observedRunningTime="2025-01-13 20:57:02.981302507 +0000 UTC m=+9.140640488" watchObservedRunningTime="2025-01-13 20:57:04.12204866 +0000 UTC m=+10.281386638"
Jan 13 20:57:05.977882 systemd[1]: Created slice kubepods-besteffort-pod0da3dfd6_758e_47e6_8dbb_82aa95aea79b.slice - libcontainer container kubepods-besteffort-pod0da3dfd6_758e_47e6_8dbb_82aa95aea79b.slice.
Jan 13 20:57:06.012704 kubelet[2787]: I0113 20:57:06.012499 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-typha-certs\") pod \"calico-typha-77bc676d94-7j9c5\" (UID: \"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\") " pod="calico-system/calico-typha-77bc676d94-7j9c5"
Jan 13 20:57:06.012704 kubelet[2787]: I0113 20:57:06.012536 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-tigera-ca-bundle\") pod \"calico-typha-77bc676d94-7j9c5\" (UID: \"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\") " pod="calico-system/calico-typha-77bc676d94-7j9c5"
Jan 13 20:57:06.012704 kubelet[2787]: I0113 20:57:06.012559 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm49l\" (UniqueName: \"kubernetes.io/projected/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-kube-api-access-lm49l\") pod \"calico-typha-77bc676d94-7j9c5\" (UID: \"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\") " pod="calico-system/calico-typha-77bc676d94-7j9c5"
Jan 13 20:57:06.099453 systemd[1]: Created slice kubepods-besteffort-podb5b99a2e_4a23_4904_9961_423d7f5593e3.slice - libcontainer container kubepods-besteffort-podb5b99a2e_4a23_4904_9961_423d7f5593e3.slice.
Jan 13 20:57:06.113787 kubelet[2787]: I0113 20:57:06.112761 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-policysync\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.113787 kubelet[2787]: I0113 20:57:06.112793 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b99a2e-4a23-4904-9961-423d7f5593e3-tigera-ca-bundle\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.113787 kubelet[2787]: I0113 20:57:06.112807 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-xtables-lock\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.113787 kubelet[2787]: I0113 20:57:06.112833 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-var-run-calico\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.113787 kubelet[2787]: I0113 20:57:06.112851 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-var-lib-calico\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.114083 kubelet[2787]: I0113 20:57:06.112865 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv84w\" (UniqueName: \"kubernetes.io/projected/b5b99a2e-4a23-4904-9961-423d7f5593e3-kube-api-access-nv84w\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.114083 kubelet[2787]: I0113 20:57:06.112881 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b5b99a2e-4a23-4904-9961-423d7f5593e3-node-certs\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.114083 kubelet[2787]: I0113 20:57:06.112897 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-bin-dir\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.114083 kubelet[2787]: I0113 20:57:06.112911 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-net-dir\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.114083 kubelet[2787]: I0113 20:57:06.112923 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-lib-modules\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.114305 kubelet[2787]: I0113 20:57:06.112934 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-log-dir\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.114305 kubelet[2787]: I0113 20:57:06.112950 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-flexvol-driver-host\") pod \"calico-node-wxzwq\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " pod="calico-system/calico-node-wxzwq"
Jan 13 20:57:06.269195 kubelet[2787]: E0113 20:57:06.269057 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.269195 kubelet[2787]: W0113 20:57:06.269088 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.269195 kubelet[2787]: E0113 20:57:06.269107 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.281843 kubelet[2787]: E0113 20:57:06.280437 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.281843 kubelet[2787]: W0113 20:57:06.280453 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.281843 kubelet[2787]: E0113 20:57:06.280467 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.293371 kubelet[2787]: E0113 20:57:06.293325 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213"
Jan 13 20:57:06.314376 kubelet[2787]: E0113 20:57:06.314345 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.314376 kubelet[2787]: W0113 20:57:06.314370 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.314919 kubelet[2787]: E0113 20:57:06.314395 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.314919 kubelet[2787]: E0113 20:57:06.314885 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.314919 kubelet[2787]: W0113 20:57:06.314896 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.315021 kubelet[2787]: E0113 20:57:06.314909 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.315192 kubelet[2787]: E0113 20:57:06.315178 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.315192 kubelet[2787]: W0113 20:57:06.315187 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.315269 kubelet[2787]: E0113 20:57:06.315195 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.315590 kubelet[2787]: E0113 20:57:06.315574 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.315590 kubelet[2787]: W0113 20:57:06.315585 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.315771 kubelet[2787]: E0113 20:57:06.315597 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.315811 kubelet[2787]: E0113 20:57:06.315774 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.315811 kubelet[2787]: W0113 20:57:06.315782 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.315811 kubelet[2787]: E0113 20:57:06.315790 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.316024 kubelet[2787]: E0113 20:57:06.315942 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.316024 kubelet[2787]: W0113 20:57:06.315950 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.316024 kubelet[2787]: E0113 20:57:06.315958 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.316290 kubelet[2787]: E0113 20:57:06.316216 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.316290 kubelet[2787]: W0113 20:57:06.316224 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.316290 kubelet[2787]: E0113 20:57:06.316234 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.317079 kubelet[2787]: E0113 20:57:06.317029 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.317079 kubelet[2787]: W0113 20:57:06.317046 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.317079 kubelet[2787]: E0113 20:57:06.317063 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.317406 kubelet[2787]: E0113 20:57:06.317391 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.317406 kubelet[2787]: W0113 20:57:06.317401 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.317472 kubelet[2787]: E0113 20:57:06.317412 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.317992 kubelet[2787]: E0113 20:57:06.317968 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.317992 kubelet[2787]: W0113 20:57:06.317978 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.318125 kubelet[2787]: E0113 20:57:06.317998 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.318236 kubelet[2787]: E0113 20:57:06.318217 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.318236 kubelet[2787]: W0113 20:57:06.318224 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.318236 kubelet[2787]: E0113 20:57:06.318233 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.319308 kubelet[2787]: E0113 20:57:06.319280 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.319308 kubelet[2787]: W0113 20:57:06.319302 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.319535 kubelet[2787]: E0113 20:57:06.319326 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.319759 kubelet[2787]: E0113 20:57:06.319728 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.319759 kubelet[2787]: W0113 20:57:06.319753 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.319901 kubelet[2787]: E0113 20:57:06.319765 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.319960 kubelet[2787]: E0113 20:57:06.319949 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.319960 kubelet[2787]: W0113 20:57:06.319956 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.320036 kubelet[2787]: E0113 20:57:06.319963 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.320144 kubelet[2787]: E0113 20:57:06.320129 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.320144 kubelet[2787]: W0113 20:57:06.320142 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.320217 kubelet[2787]: E0113 20:57:06.320150 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.320349 kubelet[2787]: E0113 20:57:06.320333 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.320349 kubelet[2787]: W0113 20:57:06.320341 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.320349 kubelet[2787]: E0113 20:57:06.320347 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.321073 kubelet[2787]: E0113 20:57:06.320988 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.321378 containerd[1537]: time="2025-01-13T20:57:06.321297620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77bc676d94-7j9c5,Uid:0da3dfd6-758e-47e6-8dbb-82aa95aea79b,Namespace:calico-system,Attempt:0,}"
Jan 13 20:57:06.322076 kubelet[2787]: W0113 20:57:06.321416 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.322076 kubelet[2787]: E0113 20:57:06.321440 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.322941 kubelet[2787]: E0113 20:57:06.322415 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.322941 kubelet[2787]: W0113 20:57:06.322432 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.322941 kubelet[2787]: E0113 20:57:06.322452 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.323567 kubelet[2787]: E0113 20:57:06.323145 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.323567 kubelet[2787]: W0113 20:57:06.323158 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.323567 kubelet[2787]: E0113 20:57:06.323177 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.323567 kubelet[2787]: E0113 20:57:06.323527 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.323567 kubelet[2787]: W0113 20:57:06.323536 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.323567 kubelet[2787]: E0113 20:57:06.323548 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.327480 kubelet[2787]: E0113 20:57:06.327213 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.327480 kubelet[2787]: W0113 20:57:06.327239 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.327480 kubelet[2787]: E0113 20:57:06.327261 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:57:06.327480 kubelet[2787]: I0113 20:57:06.327290 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/156fa3f2-d364-43dd-86de-274512f7d213-varrun\") pod \"csi-node-driver-jkzvs\" (UID: \"156fa3f2-d364-43dd-86de-274512f7d213\") " pod="calico-system/csi-node-driver-jkzvs"
Jan 13 20:57:06.335229 kubelet[2787]: E0113 20:57:06.334959 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:57:06.335229 kubelet[2787]: W0113 20:57:06.334984 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:57:06.335229 kubelet[2787]: E0113 20:57:06.335008 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.335229 kubelet[2787]: I0113 20:57:06.335037 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/156fa3f2-d364-43dd-86de-274512f7d213-socket-dir\") pod \"csi-node-driver-jkzvs\" (UID: \"156fa3f2-d364-43dd-86de-274512f7d213\") " pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:06.341998 kubelet[2787]: E0113 20:57:06.337463 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.341998 kubelet[2787]: W0113 20:57:06.337483 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.341998 kubelet[2787]: E0113 20:57:06.337512 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.341998 kubelet[2787]: I0113 20:57:06.337536 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/156fa3f2-d364-43dd-86de-274512f7d213-registration-dir\") pod \"csi-node-driver-jkzvs\" (UID: \"156fa3f2-d364-43dd-86de-274512f7d213\") " pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:06.343357 kubelet[2787]: E0113 20:57:06.343337 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.343592 kubelet[2787]: W0113 20:57:06.343557 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.344227 kubelet[2787]: E0113 20:57:06.344211 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.344355 kubelet[2787]: I0113 20:57:06.344345 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stsnt\" (UniqueName: \"kubernetes.io/projected/156fa3f2-d364-43dd-86de-274512f7d213-kube-api-access-stsnt\") pod \"csi-node-driver-jkzvs\" (UID: \"156fa3f2-d364-43dd-86de-274512f7d213\") " pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:06.348286 kubelet[2787]: E0113 20:57:06.347730 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.348286 kubelet[2787]: W0113 20:57:06.347756 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.348286 kubelet[2787]: E0113 20:57:06.347775 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.348286 kubelet[2787]: I0113 20:57:06.347799 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/156fa3f2-d364-43dd-86de-274512f7d213-kubelet-dir\") pod \"csi-node-driver-jkzvs\" (UID: \"156fa3f2-d364-43dd-86de-274512f7d213\") " pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:06.354004 kubelet[2787]: E0113 20:57:06.349465 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.354004 kubelet[2787]: W0113 20:57:06.349485 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.354004 kubelet[2787]: E0113 20:57:06.350090 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.354004 kubelet[2787]: W0113 20:57:06.350102 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.354004 kubelet[2787]: E0113 20:57:06.350348 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.354004 kubelet[2787]: E0113 20:57:06.350366 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.354923 kubelet[2787]: E0113 20:57:06.354905 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.355119 kubelet[2787]: W0113 20:57:06.355102 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.360875 kubelet[2787]: E0113 20:57:06.358758 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.361075 kubelet[2787]: W0113 20:57:06.361047 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.361610 kubelet[2787]: E0113 20:57:06.361597 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.366027 kubelet[2787]: W0113 20:57:06.365976 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.369798 kubelet[2787]: E0113 20:57:06.369771 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.369964 kubelet[2787]: E0113 20:57:06.369952 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.371488 kubelet[2787]: E0113 20:57:06.370557 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.371600 kubelet[2787]: W0113 20:57:06.371493 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.371600 kubelet[2787]: E0113 20:57:06.371520 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.371863 kubelet[2787]: E0113 20:57:06.371764 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.371863 kubelet[2787]: W0113 20:57:06.371777 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.371863 kubelet[2787]: E0113 20:57:06.371790 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.372081 kubelet[2787]: E0113 20:57:06.372067 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.372081 kubelet[2787]: W0113 20:57:06.372075 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.373201 kubelet[2787]: E0113 20:57:06.372083 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.373201 kubelet[2787]: E0113 20:57:06.372482 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.373201 kubelet[2787]: W0113 20:57:06.372494 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.373201 kubelet[2787]: E0113 20:57:06.372512 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.373201 kubelet[2787]: E0113 20:57:06.370572 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.375749 kubelet[2787]: E0113 20:57:06.373354 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.375749 kubelet[2787]: W0113 20:57:06.373371 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.375749 kubelet[2787]: E0113 20:57:06.373393 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.381952 containerd[1537]: time="2025-01-13T20:57:06.381585925Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:06.381952 containerd[1537]: time="2025-01-13T20:57:06.381666760Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:06.381952 containerd[1537]: time="2025-01-13T20:57:06.381688957Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:06.381952 containerd[1537]: time="2025-01-13T20:57:06.381798415Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:06.405354 containerd[1537]: time="2025-01-13T20:57:06.405323320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wxzwq,Uid:b5b99a2e-4a23-4904-9961-423d7f5593e3,Namespace:calico-system,Attempt:0,}" Jan 13 20:57:06.412793 systemd[1]: Started cri-containerd-fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f.scope - libcontainer container fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f. Jan 13 20:57:06.435955 containerd[1537]: time="2025-01-13T20:57:06.435771160Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:06.436203 containerd[1537]: time="2025-01-13T20:57:06.435993711Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:06.436306 containerd[1537]: time="2025-01-13T20:57:06.436190659Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:06.437069 containerd[1537]: time="2025-01-13T20:57:06.436940627Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:06.452245 kubelet[2787]: E0113 20:57:06.451865 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.452245 kubelet[2787]: W0113 20:57:06.451887 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.452245 kubelet[2787]: E0113 20:57:06.451906 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.453091 kubelet[2787]: E0113 20:57:06.453064 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.453217 kubelet[2787]: W0113 20:57:06.453201 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.453360 kubelet[2787]: E0113 20:57:06.453349 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.454581 kubelet[2787]: E0113 20:57:06.453898 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.454581 kubelet[2787]: W0113 20:57:06.454500 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.454907 kubelet[2787]: E0113 20:57:06.454751 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.455179 kubelet[2787]: E0113 20:57:06.455168 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.455362 kubelet[2787]: W0113 20:57:06.455251 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.455434 kubelet[2787]: E0113 20:57:06.455414 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.455771 kubelet[2787]: E0113 20:57:06.455763 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.455868 kubelet[2787]: W0113 20:57:06.455818 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.456080 kubelet[2787]: E0113 20:57:06.455929 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.456524 kubelet[2787]: E0113 20:57:06.456447 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.456524 kubelet[2787]: W0113 20:57:06.456456 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.456784 kubelet[2787]: E0113 20:57:06.456613 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.458730 kubelet[2787]: E0113 20:57:06.457548 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.458730 kubelet[2787]: W0113 20:57:06.457576 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.458730 kubelet[2787]: E0113 20:57:06.458678 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.460568 kubelet[2787]: E0113 20:57:06.460076 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.460568 kubelet[2787]: W0113 20:57:06.460104 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.461997 kubelet[2787]: E0113 20:57:06.460804 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.464240 kubelet[2787]: E0113 20:57:06.462716 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.464240 kubelet[2787]: W0113 20:57:06.462730 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.464240 kubelet[2787]: E0113 20:57:06.462778 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.466720 kubelet[2787]: E0113 20:57:06.466569 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.466720 kubelet[2787]: W0113 20:57:06.466589 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.467136 kubelet[2787]: E0113 20:57:06.467028 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.467400 kubelet[2787]: E0113 20:57:06.467389 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.467595 kubelet[2787]: W0113 20:57:06.467499 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.467820 kubelet[2787]: E0113 20:57:06.467766 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.467820 kubelet[2787]: W0113 20:57:06.467774 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.468433 kubelet[2787]: E0113 20:57:06.468354 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.468433 kubelet[2787]: W0113 20:57:06.468376 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.468753 kubelet[2787]: E0113 20:57:06.468700 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.468753 kubelet[2787]: W0113 20:57:06.468708 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.469011 kubelet[2787]: E0113 20:57:06.468925 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.469630 kubelet[2787]: E0113 20:57:06.469353 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.469630 kubelet[2787]: E0113 20:57:06.469574 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.469630 kubelet[2787]: E0113 20:57:06.469615 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.469865 kubelet[2787]: E0113 20:57:06.469857 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.470007 kubelet[2787]: W0113 20:57:06.469905 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.470007 kubelet[2787]: E0113 20:57:06.469921 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.470293 kubelet[2787]: E0113 20:57:06.470239 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.470293 kubelet[2787]: W0113 20:57:06.470249 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.470293 kubelet[2787]: E0113 20:57:06.470263 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.470552 kubelet[2787]: E0113 20:57:06.470535 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.470552 kubelet[2787]: W0113 20:57:06.470548 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.470634 kubelet[2787]: E0113 20:57:06.470562 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.472208 kubelet[2787]: E0113 20:57:06.471460 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.472208 kubelet[2787]: W0113 20:57:06.471474 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.472208 kubelet[2787]: E0113 20:57:06.471494 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.471518 systemd[1]: Started cri-containerd-398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df.scope - libcontainer container 398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df. 
Jan 13 20:57:06.473292 kubelet[2787]: E0113 20:57:06.473002 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.473450 kubelet[2787]: W0113 20:57:06.473292 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.473450 kubelet[2787]: E0113 20:57:06.473351 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.474585 kubelet[2787]: E0113 20:57:06.474375 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.474585 kubelet[2787]: W0113 20:57:06.474424 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.475128 kubelet[2787]: E0113 20:57:06.474789 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.475321 kubelet[2787]: E0113 20:57:06.475108 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.475321 kubelet[2787]: W0113 20:57:06.475213 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.475321 kubelet[2787]: E0113 20:57:06.475288 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.475839 kubelet[2787]: E0113 20:57:06.475639 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.475839 kubelet[2787]: W0113 20:57:06.475647 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.475839 kubelet[2787]: E0113 20:57:06.475686 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.476223 kubelet[2787]: E0113 20:57:06.476001 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.476223 kubelet[2787]: W0113 20:57:06.476010 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.476223 kubelet[2787]: E0113 20:57:06.476023 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.476519 kubelet[2787]: E0113 20:57:06.476457 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.476519 kubelet[2787]: W0113 20:57:06.476473 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.476519 kubelet[2787]: E0113 20:57:06.476482 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.483556 kubelet[2787]: E0113 20:57:06.483523 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.483728 kubelet[2787]: W0113 20:57:06.483680 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.483728 kubelet[2787]: E0113 20:57:06.483701 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:57:06.507270 kubelet[2787]: E0113 20:57:06.507176 2787 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:57:06.507270 kubelet[2787]: W0113 20:57:06.507198 2787 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:57:06.507270 kubelet[2787]: E0113 20:57:06.507224 2787 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:57:06.524983 containerd[1537]: time="2025-01-13T20:57:06.524592543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wxzwq,Uid:b5b99a2e-4a23-4904-9961-423d7f5593e3,Namespace:calico-system,Attempt:0,} returns sandbox id \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\"" Jan 13 20:57:06.532470 containerd[1537]: time="2025-01-13T20:57:06.531403349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 13 20:57:06.559059 containerd[1537]: time="2025-01-13T20:57:06.559035354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77bc676d94-7j9c5,Uid:0da3dfd6-758e-47e6-8dbb-82aa95aea79b,Namespace:calico-system,Attempt:0,} returns sandbox id \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\"" Jan 13 20:57:07.937120 kubelet[2787]: E0113 20:57:07.936807 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:08.444692 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount1080815082.mount: Deactivated successfully. Jan 13 20:57:08.616670 containerd[1537]: time="2025-01-13T20:57:08.616160472Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:08.620268 containerd[1537]: time="2025-01-13T20:57:08.620234717Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=6855343" Jan 13 20:57:08.622816 containerd[1537]: time="2025-01-13T20:57:08.622781188Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:08.625113 containerd[1537]: time="2025-01-13T20:57:08.625075607Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:08.625641 containerd[1537]: time="2025-01-13T20:57:08.625622843Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 2.094157421s" Jan 13 20:57:08.625800 containerd[1537]: time="2025-01-13T20:57:08.625717412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Jan 13 20:57:08.627012 containerd[1537]: time="2025-01-13T20:57:08.626904122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Jan 13 20:57:08.627988 
containerd[1537]: time="2025-01-13T20:57:08.627881674Z" level=info msg="CreateContainer within sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 13 20:57:08.703663 containerd[1537]: time="2025-01-13T20:57:08.702853582Z" level=info msg="CreateContainer within sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6\"" Jan 13 20:57:08.703663 containerd[1537]: time="2025-01-13T20:57:08.703406003Z" level=info msg="StartContainer for \"1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6\"" Jan 13 20:57:08.730115 systemd[1]: Started cri-containerd-1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6.scope - libcontainer container 1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6. Jan 13 20:57:08.763463 containerd[1537]: time="2025-01-13T20:57:08.763422151Z" level=info msg="StartContainer for \"1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6\" returns successfully" Jan 13 20:57:08.768921 systemd[1]: cri-containerd-1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6.scope: Deactivated successfully. 
Jan 13 20:57:09.296543 containerd[1537]: time="2025-01-13T20:57:09.296419612Z" level=info msg="shim disconnected" id=1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6 namespace=k8s.io Jan 13 20:57:09.296543 containerd[1537]: time="2025-01-13T20:57:09.296499750Z" level=warning msg="cleaning up after shim disconnected" id=1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6 namespace=k8s.io Jan 13 20:57:09.296543 containerd[1537]: time="2025-01-13T20:57:09.296506548Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:09.420553 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6-rootfs.mount: Deactivated successfully. Jan 13 20:57:09.935754 kubelet[2787]: E0113 20:57:09.935482 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:11.455817 containerd[1537]: time="2025-01-13T20:57:11.455776620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:11.456581 containerd[1537]: time="2025-01-13T20:57:11.456518823Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=29850141" Jan 13 20:57:11.456902 containerd[1537]: time="2025-01-13T20:57:11.456886859Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:11.458872 containerd[1537]: time="2025-01-13T20:57:11.458641735Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:11.459784 containerd[1537]: time="2025-01-13T20:57:11.459360846Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.832272826s" Jan 13 20:57:11.459784 containerd[1537]: time="2025-01-13T20:57:11.459379645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Jan 13 20:57:11.460304 containerd[1537]: time="2025-01-13T20:57:11.460288178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 13 20:57:11.474486 containerd[1537]: time="2025-01-13T20:57:11.474448509Z" level=info msg="CreateContainer within sandbox \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 13 20:57:11.487969 containerd[1537]: time="2025-01-13T20:57:11.487936606Z" level=info msg="CreateContainer within sandbox \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\"" Jan 13 20:57:11.488486 containerd[1537]: time="2025-01-13T20:57:11.488348029Z" level=info msg="StartContainer for \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\"" Jan 13 20:57:11.519002 systemd[1]: Started cri-containerd-5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962.scope - libcontainer container 
5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962. Jan 13 20:57:11.579269 containerd[1537]: time="2025-01-13T20:57:11.579232577Z" level=info msg="StartContainer for \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\" returns successfully" Jan 13 20:57:12.063593 kubelet[2787]: E0113 20:57:12.062815 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:12.251016 kubelet[2787]: I0113 20:57:12.250953 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-77bc676d94-7j9c5" podStartSLOduration=2.350981719 podStartE2EDuration="7.250941558s" podCreationTimestamp="2025-01-13 20:57:05 +0000 UTC" firstStartedPulling="2025-01-13 20:57:06.560228172 +0000 UTC m=+12.719566150" lastFinishedPulling="2025-01-13 20:57:11.460188015 +0000 UTC m=+17.619525989" observedRunningTime="2025-01-13 20:57:12.250748095 +0000 UTC m=+18.410086075" watchObservedRunningTime="2025-01-13 20:57:12.250941558 +0000 UTC m=+18.410279541" Jan 13 20:57:13.162567 kubelet[2787]: I0113 20:57:13.162501 2787 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:57:13.956762 kubelet[2787]: E0113 20:57:13.956690 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:15.128211 kubelet[2787]: I0113 20:57:15.127737 2787 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:57:15.936537 kubelet[2787]: E0113 20:57:15.935786 2787 
pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:17.063756 containerd[1537]: time="2025-01-13T20:57:17.063704146Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:17.064916 containerd[1537]: time="2025-01-13T20:57:17.064432887Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Jan 13 20:57:17.066305 containerd[1537]: time="2025-01-13T20:57:17.065206613Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:17.066584 containerd[1537]: time="2025-01-13T20:57:17.066569161Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:17.067367 containerd[1537]: time="2025-01-13T20:57:17.067346562Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 5.606982649s" Jan 13 20:57:17.067444 containerd[1537]: time="2025-01-13T20:57:17.067432246Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Jan 13 20:57:17.071685 containerd[1537]: 
time="2025-01-13T20:57:17.071656988Z" level=info msg="CreateContainer within sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 13 20:57:17.083634 containerd[1537]: time="2025-01-13T20:57:17.083607129Z" level=info msg="CreateContainer within sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1\"" Jan 13 20:57:17.086482 containerd[1537]: time="2025-01-13T20:57:17.086461884Z" level=info msg="StartContainer for \"4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1\"" Jan 13 20:57:17.198988 systemd[1]: Started cri-containerd-4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1.scope - libcontainer container 4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1. Jan 13 20:57:17.246267 containerd[1537]: time="2025-01-13T20:57:17.246243636Z" level=info msg="StartContainer for \"4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1\" returns successfully" Jan 13 20:57:17.936118 kubelet[2787]: E0113 20:57:17.935507 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:19.582949 systemd[1]: cri-containerd-4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1.scope: Deactivated successfully. Jan 13 20:57:19.629345 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1-rootfs.mount: Deactivated successfully. 
Jan 13 20:57:19.647914 containerd[1537]: time="2025-01-13T20:57:19.647815910Z" level=info msg="shim disconnected" id=4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1 namespace=k8s.io Jan 13 20:57:19.647914 containerd[1537]: time="2025-01-13T20:57:19.647894000Z" level=warning msg="cleaning up after shim disconnected" id=4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1 namespace=k8s.io Jan 13 20:57:19.654283 containerd[1537]: time="2025-01-13T20:57:19.647900117Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:19.719045 kubelet[2787]: I0113 20:57:19.718093 2787 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jan 13 20:57:19.751984 systemd[1]: Created slice kubepods-burstable-pod2e3cb21e_fc92_46c5_8852_4ee6f84c4eb9.slice - libcontainer container kubepods-burstable-pod2e3cb21e_fc92_46c5_8852_4ee6f84c4eb9.slice. Jan 13 20:57:19.760414 systemd[1]: Created slice kubepods-burstable-podfa68722f_5974_4f04_9b6e_46b8f479c300.slice - libcontainer container kubepods-burstable-podfa68722f_5974_4f04_9b6e_46b8f479c300.slice. 
Jan 13 20:57:19.761742 kubelet[2787]: W0113 20:57:19.760500 2787 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object Jan 13 20:57:19.761742 kubelet[2787]: E0113 20:57:19.760546 2787 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Jan 13 20:57:19.761742 kubelet[2787]: W0113 20:57:19.760585 2787 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object Jan 13 20:57:19.761742 kubelet[2787]: E0113 20:57:19.760601 2787 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Jan 13 20:57:19.767753 systemd[1]: Created slice kubepods-besteffort-pod86a1a104_aa85_4352_b08c_36f54c7172c1.slice - libcontainer container kubepods-besteffort-pod86a1a104_aa85_4352_b08c_36f54c7172c1.slice. 
Jan 13 20:57:19.774995 systemd[1]: Created slice kubepods-besteffort-pod6a232370_fd20_47eb_a2e3_9b0d1d786995.slice - libcontainer container kubepods-besteffort-pod6a232370_fd20_47eb_a2e3_9b0d1d786995.slice. Jan 13 20:57:19.780710 systemd[1]: Created slice kubepods-besteffort-podbe5d865c_3359_4beb_8044_736dced88771.slice - libcontainer container kubepods-besteffort-podbe5d865c_3359_4beb_8044_736dced88771.slice. Jan 13 20:57:19.861764 kubelet[2787]: I0113 20:57:19.861616 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgq9w\" (UniqueName: \"kubernetes.io/projected/be5d865c-3359-4beb-8044-736dced88771-kube-api-access-lgq9w\") pod \"calico-kube-controllers-6df8b96c48-4j6ml\" (UID: \"be5d865c-3359-4beb-8044-736dced88771\") " pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:19.861764 kubelet[2787]: I0113 20:57:19.861655 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6a232370-fd20-47eb-a2e3-9b0d1d786995-calico-apiserver-certs\") pod \"calico-apiserver-867f57c995-67ltf\" (UID: \"6a232370-fd20-47eb-a2e3-9b0d1d786995\") " pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:19.861764 kubelet[2787]: I0113 20:57:19.861673 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa68722f-5974-4f04-9b6e-46b8f479c300-config-volume\") pod \"coredns-6f6b679f8f-k7f2m\" (UID: \"fa68722f-5974-4f04-9b6e-46b8f479c300\") " pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:19.861764 kubelet[2787]: I0113 20:57:19.861690 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxdk\" (UniqueName: \"kubernetes.io/projected/86a1a104-aa85-4352-b08c-36f54c7172c1-kube-api-access-kmxdk\") pod 
\"calico-apiserver-867f57c995-7rrvw\" (UID: \"86a1a104-aa85-4352-b08c-36f54c7172c1\") " pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:19.861764 kubelet[2787]: I0113 20:57:19.861708 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/86a1a104-aa85-4352-b08c-36f54c7172c1-calico-apiserver-certs\") pod \"calico-apiserver-867f57c995-7rrvw\" (UID: \"86a1a104-aa85-4352-b08c-36f54c7172c1\") " pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:19.863324 kubelet[2787]: I0113 20:57:19.861737 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2sw5\" (UniqueName: \"kubernetes.io/projected/6a232370-fd20-47eb-a2e3-9b0d1d786995-kube-api-access-q2sw5\") pod \"calico-apiserver-867f57c995-67ltf\" (UID: \"6a232370-fd20-47eb-a2e3-9b0d1d786995\") " pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:19.863324 kubelet[2787]: I0113 20:57:19.861753 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkmw2\" (UniqueName: \"kubernetes.io/projected/fa68722f-5974-4f04-9b6e-46b8f479c300-kube-api-access-lkmw2\") pod \"coredns-6f6b679f8f-k7f2m\" (UID: \"fa68722f-5974-4f04-9b6e-46b8f479c300\") " pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:19.863324 kubelet[2787]: I0113 20:57:19.861770 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9-config-volume\") pod \"coredns-6f6b679f8f-9c7f5\" (UID: \"2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9\") " pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:19.863324 kubelet[2787]: I0113 20:57:19.861785 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fpgff\" (UniqueName: \"kubernetes.io/projected/2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9-kube-api-access-fpgff\") pod \"coredns-6f6b679f8f-9c7f5\" (UID: \"2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9\") " pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:19.863324 kubelet[2787]: I0113 20:57:19.861804 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be5d865c-3359-4beb-8044-736dced88771-tigera-ca-bundle\") pod \"calico-kube-controllers-6df8b96c48-4j6ml\" (UID: \"be5d865c-3359-4beb-8044-736dced88771\") " pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:19.939320 systemd[1]: Created slice kubepods-besteffort-pod156fa3f2_d364_43dd_86de_274512f7d213.slice - libcontainer container kubepods-besteffort-pod156fa3f2_d364_43dd_86de_274512f7d213.slice. Jan 13 20:57:19.978870 containerd[1537]: time="2025-01-13T20:57:19.940579109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:0,}" Jan 13 20:57:20.056210 containerd[1537]: time="2025-01-13T20:57:20.056181392Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:0,}" Jan 13 20:57:20.065302 containerd[1537]: time="2025-01-13T20:57:20.065150812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:0,}" Jan 13 20:57:20.083355 containerd[1537]: time="2025-01-13T20:57:20.083289864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:0,}" Jan 13 20:57:20.180837 containerd[1537]: time="2025-01-13T20:57:20.180738113Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 13 20:57:20.437382 containerd[1537]: time="2025-01-13T20:57:20.436602960Z" level=error msg="Failed to destroy network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.442347 containerd[1537]: time="2025-01-13T20:57:20.442323843Z" level=error msg="encountered an error cleaning up failed sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.445919 containerd[1537]: time="2025-01-13T20:57:20.442526602Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.445919 containerd[1537]: time="2025-01-13T20:57:20.442772696Z" level=error msg="Failed to destroy network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.445919 containerd[1537]: time="2025-01-13T20:57:20.442966818Z" level=error msg="encountered an error cleaning up failed sandbox 
\"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.445919 containerd[1537]: time="2025-01-13T20:57:20.443000652Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.448034 containerd[1537]: time="2025-01-13T20:57:20.447390365Z" level=error msg="Failed to destroy network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.448851 containerd[1537]: time="2025-01-13T20:57:20.447534929Z" level=error msg="encountered an error cleaning up failed sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.448950 containerd[1537]: time="2025-01-13T20:57:20.448864242Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox 
\"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.449560 kubelet[2787]: E0113 20:57:20.449541 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.449787 kubelet[2787]: E0113 20:57:20.449776 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:20.455119 kubelet[2787]: E0113 20:57:20.449928 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:20.455119 kubelet[2787]: E0113 20:57:20.449961 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9c7f5" podUID="2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9" Jan 13 20:57:20.455119 kubelet[2787]: E0113 20:57:20.449690 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.455222 kubelet[2787]: E0113 20:57:20.449996 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:20.455222 kubelet[2787]: E0113 20:57:20.450005 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:20.455222 kubelet[2787]: E0113 20:57:20.450019 2787 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k7f2m" podUID="fa68722f-5974-4f04-9b6e-46b8f479c300" Jan 13 20:57:20.455289 kubelet[2787]: E0113 20:57:20.449703 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.455289 kubelet[2787]: E0113 20:57:20.450037 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:20.455289 kubelet[2787]: E0113 20:57:20.450045 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:20.455341 kubelet[2787]: E0113 20:57:20.450057 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:20.455719 containerd[1537]: time="2025-01-13T20:57:20.455693960Z" level=error msg="Failed to destroy network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.455978 containerd[1537]: time="2025-01-13T20:57:20.455960518Z" level=error msg="encountered an error cleaning up failed sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.456054 containerd[1537]: time="2025-01-13T20:57:20.456039450Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.456290 kubelet[2787]: E0113 20:57:20.456187 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:20.456290 kubelet[2787]: E0113 20:57:20.456221 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:20.456290 kubelet[2787]: E0113 20:57:20.456233 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:20.456380 kubelet[2787]: E0113 20:57:20.456258 2787 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" podUID="be5d865c-3359-4beb-8044-736dced88771" Jan 13 20:57:20.629261 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0-shm.mount: Deactivated successfully. Jan 13 20:57:20.964350 kubelet[2787]: E0113 20:57:20.964306 2787 secret.go:188] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Jan 13 20:57:20.964714 kubelet[2787]: E0113 20:57:20.964401 2787 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a1a104-aa85-4352-b08c-36f54c7172c1-calico-apiserver-certs podName:86a1a104-aa85-4352-b08c-36f54c7172c1 nodeName:}" failed. No retries permitted until 2025-01-13 20:57:21.464372662 +0000 UTC m=+27.623710632 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/86a1a104-aa85-4352-b08c-36f54c7172c1-calico-apiserver-certs") pod "calico-apiserver-867f57c995-7rrvw" (UID: "86a1a104-aa85-4352-b08c-36f54c7172c1") : failed to sync secret cache: timed out waiting for the condition Jan 13 20:57:20.964714 kubelet[2787]: E0113 20:57:20.964306 2787 secret.go:188] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Jan 13 20:57:20.964714 kubelet[2787]: E0113 20:57:20.964602 2787 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a232370-fd20-47eb-a2e3-9b0d1d786995-calico-apiserver-certs podName:6a232370-fd20-47eb-a2e3-9b0d1d786995 nodeName:}" failed. No retries permitted until 2025-01-13 20:57:21.464593409 +0000 UTC m=+27.623931380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/6a232370-fd20-47eb-a2e3-9b0d1d786995-calico-apiserver-certs") pod "calico-apiserver-867f57c995-67ltf" (UID: "6a232370-fd20-47eb-a2e3-9b0d1d786995") : failed to sync secret cache: timed out waiting for the condition Jan 13 20:57:21.179661 kubelet[2787]: I0113 20:57:21.179229 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1" Jan 13 20:57:21.180490 kubelet[2787]: I0113 20:57:21.180451 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0" Jan 13 20:57:21.199722 containerd[1537]: time="2025-01-13T20:57:21.199482011Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:21.200945 containerd[1537]: time="2025-01-13T20:57:21.200182876Z" level=info msg="StopPodSandbox for 
\"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:21.210360 containerd[1537]: time="2025-01-13T20:57:21.208273099Z" level=info msg="Ensure that sandbox a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0 in task-service has been cleanup successfully" Jan 13 20:57:21.210213 systemd[1]: run-netns-cni\x2d2d299e46\x2df621\x2d4f5e\x2d0bcb\x2d830442b4521b.mount: Deactivated successfully. Jan 13 20:57:21.211645 kubelet[2787]: I0113 20:57:21.211633 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46" Jan 13 20:57:21.213039 kubelet[2787]: I0113 20:57:21.213026 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0" Jan 13 20:57:21.213172 containerd[1537]: time="2025-01-13T20:57:21.208509371Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:21.213214 containerd[1537]: time="2025-01-13T20:57:21.213171486Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns successfully" Jan 13 20:57:21.213315 containerd[1537]: time="2025-01-13T20:57:21.211061846Z" level=info msg="Ensure that sandbox 7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1 in task-service has been cleanup successfully" Jan 13 20:57:21.214097 containerd[1537]: time="2025-01-13T20:57:21.213681768Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:21.214097 containerd[1537]: time="2025-01-13T20:57:21.212812679Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:21.214097 containerd[1537]: time="2025-01-13T20:57:21.214024572Z" level=info msg="Ensure that sandbox 
1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46 in task-service has been cleanup successfully" Jan 13 20:57:21.214423 containerd[1537]: time="2025-01-13T20:57:21.214210876Z" level=info msg="Ensure that sandbox e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0 in task-service has been cleanup successfully" Jan 13 20:57:21.214423 containerd[1537]: time="2025-01-13T20:57:21.214342027Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:21.214423 containerd[1537]: time="2025-01-13T20:57:21.214353415Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:21.215544 containerd[1537]: time="2025-01-13T20:57:21.215175922Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:21.215544 containerd[1537]: time="2025-01-13T20:57:21.215198948Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:21.215544 containerd[1537]: time="2025-01-13T20:57:21.215483816Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:21.215544 containerd[1537]: time="2025-01-13T20:57:21.215512352Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:21.218065 containerd[1537]: time="2025-01-13T20:57:21.215726929Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:1,}" Jan 13 20:57:21.218065 containerd[1537]: time="2025-01-13T20:57:21.217586652Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:1,}" Jan 13 20:57:21.217123 systemd[1]: run-netns-cni\x2d0dafb135\x2da780\x2dfb34\x2dec16\x2d00fb434c41dd.mount: Deactivated successfully. Jan 13 20:57:21.219393 containerd[1537]: time="2025-01-13T20:57:21.218508158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:1,}" Jan 13 20:57:21.219393 containerd[1537]: time="2025-01-13T20:57:21.218710960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:1,}" Jan 13 20:57:21.314804 containerd[1537]: time="2025-01-13T20:57:21.314690502Z" level=error msg="Failed to destroy network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.315694 containerd[1537]: time="2025-01-13T20:57:21.315614519Z" level=error msg="encountered an error cleaning up failed sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.315694 containerd[1537]: time="2025-01-13T20:57:21.315654573Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.316678 kubelet[2787]: E0113 20:57:21.315805 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.316678 kubelet[2787]: E0113 20:57:21.315961 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:21.316678 kubelet[2787]: E0113 20:57:21.315979 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:21.316786 kubelet[2787]: E0113 20:57:21.316161 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" podUID="be5d865c-3359-4beb-8044-736dced88771" Jan 13 20:57:21.321070 containerd[1537]: time="2025-01-13T20:57:21.321036444Z" level=error msg="Failed to destroy network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.321302 containerd[1537]: time="2025-01-13T20:57:21.321285446Z" level=error msg="encountered an error cleaning up failed sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.321352 containerd[1537]: time="2025-01-13T20:57:21.321324712Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.321491 kubelet[2787]: E0113 20:57:21.321458 2787 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.321719 kubelet[2787]: E0113 20:57:21.321592 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:21.321719 kubelet[2787]: E0113 20:57:21.321612 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:21.321783 containerd[1537]: time="2025-01-13T20:57:21.321740229Z" level=error msg="Failed to destroy network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.322035 containerd[1537]: time="2025-01-13T20:57:21.321939763Z" level=error msg="encountered an error cleaning up failed sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\", marking sandbox state as SANDBOX_UNKNOWN" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.322035 containerd[1537]: time="2025-01-13T20:57:21.321967251Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.323265 containerd[1537]: time="2025-01-13T20:57:21.322770191Z" level=error msg="Failed to destroy network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.323265 containerd[1537]: time="2025-01-13T20:57:21.323089226Z" level=error msg="encountered an error cleaning up failed sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.323265 containerd[1537]: time="2025-01-13T20:57:21.323141309Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.323463 kubelet[2787]: E0113 20:57:21.323094 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.323463 kubelet[2787]: E0113 20:57:21.323121 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:21.323463 kubelet[2787]: E0113 20:57:21.323132 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:21.323582 kubelet[2787]: E0113 20:57:21.323150 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k7f2m" podUID="fa68722f-5974-4f04-9b6e-46b8f479c300" Jan 13 20:57:21.323582 kubelet[2787]: E0113 20:57:21.321703 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:21.323582 kubelet[2787]: E0113 20:57:21.323310 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.323762 kubelet[2787]: E0113 20:57:21.323324 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:21.323762 kubelet[2787]: E0113 20:57:21.323334 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:21.323762 kubelet[2787]: E0113 20:57:21.323348 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9c7f5" podUID="2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9" Jan 13 20:57:21.573047 containerd[1537]: time="2025-01-13T20:57:21.572878390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:0,}" Jan 13 20:57:21.578396 containerd[1537]: time="2025-01-13T20:57:21.578332552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:0,}" Jan 13 20:57:21.632374 systemd[1]: run-netns-cni\x2dcbdce7ab\x2d7d82\x2d5e74\x2d4954\x2d4abd6ff78c84.mount: Deactivated 
successfully. Jan 13 20:57:21.632814 systemd[1]: run-netns-cni\x2d13a03f66\x2d39f4\x2d548a\x2d2c64\x2d8c1e7016c0e4.mount: Deactivated successfully. Jan 13 20:57:21.726496 containerd[1537]: time="2025-01-13T20:57:21.726295119Z" level=error msg="Failed to destroy network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.726739 containerd[1537]: time="2025-01-13T20:57:21.726695865Z" level=error msg="encountered an error cleaning up failed sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.726832 containerd[1537]: time="2025-01-13T20:57:21.726781870Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.727013 kubelet[2787]: E0113 20:57:21.726991 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 
20:57:21.727207 kubelet[2787]: E0113 20:57:21.727109 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:21.727207 kubelet[2787]: E0113 20:57:21.727135 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:21.727207 kubelet[2787]: E0113 20:57:21.727165 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" podUID="86a1a104-aa85-4352-b08c-36f54c7172c1" Jan 13 20:57:21.728180 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3-shm.mount: Deactivated successfully. Jan 13 20:57:21.750141 containerd[1537]: time="2025-01-13T20:57:21.750064720Z" level=error msg="Failed to destroy network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.751681 containerd[1537]: time="2025-01-13T20:57:21.750264268Z" level=error msg="encountered an error cleaning up failed sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.751681 containerd[1537]: time="2025-01-13T20:57:21.750301188Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:21.751783 kubelet[2787]: E0113 20:57:21.750509 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 
20:57:21.751783 kubelet[2787]: E0113 20:57:21.750550 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:21.751783 kubelet[2787]: E0113 20:57:21.750563 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:21.751907 kubelet[2787]: E0113 20:57:21.750588 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" podUID="6a232370-fd20-47eb-a2e3-9b0d1d786995" Jan 13 20:57:21.751915 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd-shm.mount: Deactivated successfully. Jan 13 20:57:22.217427 kubelet[2787]: I0113 20:57:22.217368 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74" Jan 13 20:57:22.219643 containerd[1537]: time="2025-01-13T20:57:22.219223665Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:22.219643 containerd[1537]: time="2025-01-13T20:57:22.219339059Z" level=info msg="Ensure that sandbox c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74 in task-service has been cleanup successfully" Jan 13 20:57:22.221174 kubelet[2787]: I0113 20:57:22.220644 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b" Jan 13 20:57:22.221340 containerd[1537]: time="2025-01-13T20:57:22.221080990Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:22.221340 containerd[1537]: time="2025-01-13T20:57:22.221092062Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:22.222213 containerd[1537]: time="2025-01-13T20:57:22.221905978Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:22.222213 containerd[1537]: time="2025-01-13T20:57:22.221950154Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:22.222213 containerd[1537]: time="2025-01-13T20:57:22.221957464Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns 
successfully" Jan 13 20:57:22.223140 containerd[1537]: time="2025-01-13T20:57:22.223126726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:2,}" Jan 13 20:57:22.224870 kubelet[2787]: I0113 20:57:22.224799 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd" Jan 13 20:57:22.225431 containerd[1537]: time="2025-01-13T20:57:22.225261587Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:22.225431 containerd[1537]: time="2025-01-13T20:57:22.225365465Z" level=info msg="Ensure that sandbox 579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd in task-service has been cleanup successfully" Jan 13 20:57:22.225520 containerd[1537]: time="2025-01-13T20:57:22.225511172Z" level=info msg="TearDown network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" successfully" Jan 13 20:57:22.225569 containerd[1537]: time="2025-01-13T20:57:22.225561902Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" returns successfully" Jan 13 20:57:22.225876 containerd[1537]: time="2025-01-13T20:57:22.225864931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:1,}" Jan 13 20:57:22.226359 kubelet[2787]: I0113 20:57:22.226345 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f" Jan 13 20:57:22.226668 containerd[1537]: time="2025-01-13T20:57:22.226651109Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:22.226801 containerd[1537]: 
time="2025-01-13T20:57:22.226791077Z" level=info msg="Ensure that sandbox 4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f in task-service has been cleanup successfully" Jan 13 20:57:22.226979 containerd[1537]: time="2025-01-13T20:57:22.226969813Z" level=info msg="TearDown network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" successfully" Jan 13 20:57:22.227099 containerd[1537]: time="2025-01-13T20:57:22.226994949Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" returns successfully" Jan 13 20:57:22.227254 containerd[1537]: time="2025-01-13T20:57:22.227243493Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:22.227408 containerd[1537]: time="2025-01-13T20:57:22.227316853Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:22.227408 containerd[1537]: time="2025-01-13T20:57:22.227325128Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns successfully" Jan 13 20:57:22.228791 kubelet[2787]: I0113 20:57:22.228777 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3" Jan 13 20:57:22.242762 kubelet[2787]: I0113 20:57:22.232185 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.230525642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:2,}" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.230742826Z" level=info msg="StopPodSandbox for 
\"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.230887136Z" level=info msg="Ensure that sandbox 839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3 in task-service has been cleanup successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.231019155Z" level=info msg="TearDown network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.231026946Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" returns successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.231284866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:1,}" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.232430144Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.232543422Z" level=info msg="Ensure that sandbox 23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7 in task-service has been cleanup successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.232645019Z" level=info msg="TearDown network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.232652704Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" returns successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.232771134Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:22.242866 
containerd[1537]: time="2025-01-13T20:57:22.232815027Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.232821076Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.233077670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:2,}" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.233866600Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.233946457Z" level=info msg="Ensure that sandbox 2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b in task-service has been cleanup successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.234057159Z" level=info msg="TearDown network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.234065589Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" returns successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.236370944Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.236413471Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.236420076Z" level=info msg="StopPodSandbox for 
\"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:22.242866 containerd[1537]: time="2025-01-13T20:57:22.237025823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:2,}" Jan 13 20:57:22.401571 containerd[1537]: time="2025-01-13T20:57:22.400945021Z" level=error msg="Failed to destroy network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.401651 containerd[1537]: time="2025-01-13T20:57:22.401606391Z" level=error msg="encountered an error cleaning up failed sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.401651 containerd[1537]: time="2025-01-13T20:57:22.401641441Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.402163 kubelet[2787]: E0113 20:57:22.402137 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.402208 kubelet[2787]: E0113 20:57:22.402177 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:22.402208 kubelet[2787]: E0113 20:57:22.402192 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:22.402275 kubelet[2787]: E0113 20:57:22.402218 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k7f2m" podUID="fa68722f-5974-4f04-9b6e-46b8f479c300" Jan 13 20:57:22.406322 containerd[1537]: 
time="2025-01-13T20:57:22.406295058Z" level=error msg="Failed to destroy network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.406971 containerd[1537]: time="2025-01-13T20:57:22.406956997Z" level=error msg="encountered an error cleaning up failed sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.407070 containerd[1537]: time="2025-01-13T20:57:22.407055350Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.407285 kubelet[2787]: E0113 20:57:22.407261 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.413341 kubelet[2787]: E0113 20:57:22.407296 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:22.413341 kubelet[2787]: E0113 20:57:22.407309 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:22.413341 kubelet[2787]: E0113 20:57:22.407334 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" podUID="6a232370-fd20-47eb-a2e3-9b0d1d786995" Jan 13 20:57:22.433512 containerd[1537]: time="2025-01-13T20:57:22.433480642Z" level=error msg="Failed to destroy network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Jan 13 20:57:22.433753 containerd[1537]: time="2025-01-13T20:57:22.433731422Z" level=error msg="encountered an error cleaning up failed sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.433804 containerd[1537]: time="2025-01-13T20:57:22.433767617Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.434866 kubelet[2787]: E0113 20:57:22.433984 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.434912 kubelet[2787]: E0113 20:57:22.434884 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" 
Jan 13 20:57:22.434912 kubelet[2787]: E0113 20:57:22.434897 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:22.434969 kubelet[2787]: E0113 20:57:22.434950 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" podUID="be5d865c-3359-4beb-8044-736dced88771" Jan 13 20:57:22.438359 containerd[1537]: time="2025-01-13T20:57:22.438318231Z" level=error msg="Failed to destroy network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.443224 containerd[1537]: time="2025-01-13T20:57:22.438517447Z" level=error msg="encountered an error cleaning up failed sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.443224 containerd[1537]: time="2025-01-13T20:57:22.438551262Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.448482 kubelet[2787]: E0113 20:57:22.438658 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.448482 kubelet[2787]: E0113 20:57:22.438687 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:22.448482 kubelet[2787]: E0113 20:57:22.438699 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:22.463496 containerd[1537]: time="2025-01-13T20:57:22.447061048Z" level=error msg="Failed to destroy network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.463496 containerd[1537]: time="2025-01-13T20:57:22.447322431Z" level=error msg="encountered an error cleaning up failed sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.463496 containerd[1537]: time="2025-01-13T20:57:22.447369521Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.463496 containerd[1537]: time="2025-01-13T20:57:22.460843254Z" level=error msg="Failed to destroy network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 
13 20:57:22.463496 containerd[1537]: time="2025-01-13T20:57:22.461093092Z" level=error msg="encountered an error cleaning up failed sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.463496 containerd[1537]: time="2025-01-13T20:57:22.461127978Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.463616 kubelet[2787]: E0113 20:57:22.438721 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" podUID="86a1a104-aa85-4352-b08c-36f54c7172c1" Jan 13 20:57:22.463616 kubelet[2787]: E0113 20:57:22.447708 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.463616 kubelet[2787]: E0113 20:57:22.447736 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:22.463702 kubelet[2787]: E0113 20:57:22.447753 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:22.463702 kubelet[2787]: E0113 20:57:22.447775 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-6f6b679f8f-9c7f5" podUID="2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9" Jan 13 20:57:22.463702 kubelet[2787]: E0113 20:57:22.461247 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:22.463769 kubelet[2787]: E0113 20:57:22.461311 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:22.463769 kubelet[2787]: E0113 20:57:22.461325 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:22.463769 kubelet[2787]: E0113 20:57:22.461355 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:22.631564 systemd[1]: run-netns-cni\x2d3bd4f7f4\x2d8bb7\x2d11b1\x2d61e3\x2d35dc5ebd3653.mount: Deactivated successfully. Jan 13 20:57:22.631821 systemd[1]: run-netns-cni\x2d3654569f\x2dfa22\x2dedb4\x2d688f\x2d412a8386d595.mount: Deactivated successfully. Jan 13 20:57:22.631953 systemd[1]: run-netns-cni\x2d6b67a5fd\x2d9643\x2d5916\x2dd7d2\x2de4d204ac7f54.mount: Deactivated successfully. Jan 13 20:57:22.632055 systemd[1]: run-netns-cni\x2d9c22e8df\x2d27ae\x2d3431\x2d5bac\x2d02609eb3ffa5.mount: Deactivated successfully. Jan 13 20:57:22.632184 systemd[1]: run-netns-cni\x2daccc364c\x2de2d1\x2d9e66\x2d64cd\x2d6bf14c7c2de1.mount: Deactivated successfully. Jan 13 20:57:22.632264 systemd[1]: run-netns-cni\x2d91154578\x2da6df\x2d1b56\x2daa0c\x2dedca5d5a7fa4.mount: Deactivated successfully. 
Jan 13 20:57:23.234235 kubelet[2787]: I0113 20:57:23.234215 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8" Jan 13 20:57:23.235258 containerd[1537]: time="2025-01-13T20:57:23.234649508Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:23.235258 containerd[1537]: time="2025-01-13T20:57:23.234769336Z" level=info msg="Ensure that sandbox 0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8 in task-service has been cleanup successfully" Jan 13 20:57:23.235258 containerd[1537]: time="2025-01-13T20:57:23.234892258Z" level=info msg="TearDown network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" successfully" Jan 13 20:57:23.235258 containerd[1537]: time="2025-01-13T20:57:23.234900434Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" returns successfully" Jan 13 20:57:23.235774 containerd[1537]: time="2025-01-13T20:57:23.235332593Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:23.235774 containerd[1537]: time="2025-01-13T20:57:23.235379451Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:23.235774 containerd[1537]: time="2025-01-13T20:57:23.235392045Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:23.236520 containerd[1537]: time="2025-01-13T20:57:23.236151946Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:23.236520 containerd[1537]: time="2025-01-13T20:57:23.236202368Z" level=info msg="TearDown network for sandbox 
\"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:23.236520 containerd[1537]: time="2025-01-13T20:57:23.236209196Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:23.236985 kubelet[2787]: I0113 20:57:23.236810 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3" Jan 13 20:57:23.237281 containerd[1537]: time="2025-01-13T20:57:23.237138413Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" Jan 13 20:57:23.237346 containerd[1537]: time="2025-01-13T20:57:23.237335945Z" level=info msg="Ensure that sandbox 83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3 in task-service has been cleanup successfully" Jan 13 20:57:23.237549 containerd[1537]: time="2025-01-13T20:57:23.237489300Z" level=info msg="TearDown network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" successfully" Jan 13 20:57:23.237549 containerd[1537]: time="2025-01-13T20:57:23.237498697Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" returns successfully" Jan 13 20:57:23.237622 containerd[1537]: time="2025-01-13T20:57:23.237613158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:3,}" Jan 13 20:57:23.237961 containerd[1537]: time="2025-01-13T20:57:23.237847419Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:23.237961 containerd[1537]: time="2025-01-13T20:57:23.237892579Z" level=info msg="TearDown network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" successfully" Jan 13 20:57:23.237961 containerd[1537]: 
time="2025-01-13T20:57:23.237898993Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" returns successfully" Jan 13 20:57:23.238126 containerd[1537]: time="2025-01-13T20:57:23.238049307Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:23.238126 containerd[1537]: time="2025-01-13T20:57:23.238091758Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:23.238126 containerd[1537]: time="2025-01-13T20:57:23.238097260Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:23.239320 systemd[1]: run-netns-cni\x2d5b6cc808\x2d7f62\x2d5566\x2d7906\x2dba06178f20c9.mount: Deactivated successfully. Jan 13 20:57:23.239373 systemd[1]: run-netns-cni\x2d3de794d5\x2d45f5\x2de96d\x2d18a3\x2d0984808e36de.mount: Deactivated successfully. 
Jan 13 20:57:23.259985 kubelet[2787]: I0113 20:57:23.240347 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d" Jan 13 20:57:23.259985 kubelet[2787]: I0113 20:57:23.243115 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6" Jan 13 20:57:23.259985 kubelet[2787]: I0113 20:57:23.246883 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4" Jan 13 20:57:23.259985 kubelet[2787]: I0113 20:57:23.248121 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.239892937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:3,}" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.241644618Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.241755493Z" level=info msg="Ensure that sandbox 6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d in task-service has been cleanup successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243430995Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243468846Z" level=info msg="TearDown network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243477927Z" level=info 
msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" returns successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243520353Z" level=info msg="Ensure that sandbox 190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6 in task-service has been cleanup successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243603802Z" level=info msg="TearDown network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243615942Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" returns successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243909023Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243944635Z" level=info msg="TearDown network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.243950433Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" returns successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.244183119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.245312478Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.245350452Z" level=info msg="TearDown network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" successfully" Jan 
13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.245357887Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" returns successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.245851063Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.245886082Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.245891712Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.246406287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:3,}" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.247465430Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.247552233Z" level=info msg="Ensure that sandbox 00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4 in task-service has been cleanup successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.247655608Z" level=info msg="TearDown network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.247663551Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" returns successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.247771044Z" level=info msg="StopPodSandbox for 
\"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.247805016Z" level=info msg="TearDown network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.247811057Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" returns successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.248298582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.248419355Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.248496341Z" level=info msg="Ensure that sandbox 298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d in task-service has been cleanup successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.248775502Z" level=info msg="TearDown network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.248784205Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" returns successfully" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.248908335Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:23.260066 containerd[1537]: time="2025-01-13T20:57:23.248940284Z" level=info msg="TearDown network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" successfully" Jan 13 20:57:23.260066 containerd[1537]: 
time="2025-01-13T20:57:23.248945437Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" returns successfully" Jan 13 20:57:23.242843 systemd[1]: run-netns-cni\x2db2f336ce\x2d7dfb\x2d8c4c\x2dd194\x2d2bc9cc20ba83.mount: Deactivated successfully. Jan 13 20:57:23.262733 containerd[1537]: time="2025-01-13T20:57:23.250871697Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:23.262733 containerd[1537]: time="2025-01-13T20:57:23.250916859Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:23.262733 containerd[1537]: time="2025-01-13T20:57:23.250924152Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:23.262733 containerd[1537]: time="2025-01-13T20:57:23.251514144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:3,}" Jan 13 20:57:23.244933 systemd[1]: run-netns-cni\x2d775f8b1d\x2d3101\x2d1688\x2d94d9\x2dbe8f1233e517.mount: Deactivated successfully. Jan 13 20:57:23.251933 systemd[1]: run-netns-cni\x2d97aa711b\x2d94a0\x2dfa42\x2d14f8\x2d017a10670906.mount: Deactivated successfully. Jan 13 20:57:23.251989 systemd[1]: run-netns-cni\x2d3490edaf\x2db729\x2d08bc\x2d1d1f\x2da1183c70b56b.mount: Deactivated successfully. 
Jan 13 20:57:24.139995 containerd[1537]: time="2025-01-13T20:57:24.139885289Z" level=error msg="Failed to destroy network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.141962 containerd[1537]: time="2025-01-13T20:57:24.141407197Z" level=error msg="encountered an error cleaning up failed sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.141962 containerd[1537]: time="2025-01-13T20:57:24.141447453Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.141747 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e-shm.mount: Deactivated successfully. 
Jan 13 20:57:24.142115 kubelet[2787]: E0113 20:57:24.141592 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.142115 kubelet[2787]: E0113 20:57:24.141626 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:24.142115 kubelet[2787]: E0113 20:57:24.141641 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:24.142199 kubelet[2787]: E0113 20:57:24.141666 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k7f2m" podUID="fa68722f-5974-4f04-9b6e-46b8f479c300" Jan 13 20:57:24.251843 kubelet[2787]: I0113 20:57:24.251306 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e" Jan 13 20:57:24.252642 containerd[1537]: time="2025-01-13T20:57:24.252428154Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" Jan 13 20:57:24.252642 containerd[1537]: time="2025-01-13T20:57:24.252562914Z" level=info msg="Ensure that sandbox 547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e in task-service has been cleanup successfully" Jan 13 20:57:24.253003 containerd[1537]: time="2025-01-13T20:57:24.252951640Z" level=info msg="TearDown network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" successfully" Jan 13 20:57:24.253003 containerd[1537]: time="2025-01-13T20:57:24.252961450Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" returns successfully" Jan 13 20:57:24.253343 containerd[1537]: time="2025-01-13T20:57:24.253262392Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:24.253343 containerd[1537]: time="2025-01-13T20:57:24.253303116Z" level=info msg="TearDown network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" successfully" Jan 13 20:57:24.253343 containerd[1537]: time="2025-01-13T20:57:24.253310014Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" returns successfully" Jan 13 20:57:24.253528 containerd[1537]: time="2025-01-13T20:57:24.253518948Z" level=info msg="StopPodSandbox for 
\"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:24.253641 containerd[1537]: time="2025-01-13T20:57:24.253632325Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:24.253704 containerd[1537]: time="2025-01-13T20:57:24.253673761Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:24.253945 containerd[1537]: time="2025-01-13T20:57:24.253851952Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:24.253945 containerd[1537]: time="2025-01-13T20:57:24.253889578Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:24.253945 containerd[1537]: time="2025-01-13T20:57:24.253895193Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:24.255721 containerd[1537]: time="2025-01-13T20:57:24.255511771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:4,}" Jan 13 20:57:24.383252 containerd[1537]: time="2025-01-13T20:57:24.383167788Z" level=error msg="Failed to destroy network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.383496 containerd[1537]: time="2025-01-13T20:57:24.383482454Z" level=error msg="encountered an error cleaning up failed sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.383616 containerd[1537]: time="2025-01-13T20:57:24.383552692Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.383970 kubelet[2787]: E0113 20:57:24.383944 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.399153 kubelet[2787]: E0113 20:57:24.383985 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:24.399153 kubelet[2787]: E0113 20:57:24.384006 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:24.399153 kubelet[2787]: E0113 20:57:24.384032 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" podUID="be5d865c-3359-4beb-8044-736dced88771" Jan 13 20:57:24.436542 containerd[1537]: time="2025-01-13T20:57:24.436509590Z" level=error msg="Failed to destroy network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.446722 containerd[1537]: time="2025-01-13T20:57:24.446687780Z" level=error msg="Failed to destroy network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.450016 containerd[1537]: time="2025-01-13T20:57:24.449940614Z" level=error msg="Failed to destroy network for sandbox 
\"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.456871 containerd[1537]: time="2025-01-13T20:57:24.456810307Z" level=error msg="Failed to destroy network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.462659 containerd[1537]: time="2025-01-13T20:57:24.462565520Z" level=error msg="Failed to destroy network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.463085 containerd[1537]: time="2025-01-13T20:57:24.462945788Z" level=error msg="encountered an error cleaning up failed sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.463085 containerd[1537]: time="2025-01-13T20:57:24.462991414Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Jan 13 20:57:24.463238 kubelet[2787]: E0113 20:57:24.463211 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.463297 kubelet[2787]: E0113 20:57:24.463255 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:24.463297 kubelet[2787]: E0113 20:57:24.463291 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:24.463368 kubelet[2787]: E0113 20:57:24.463322 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k7f2m" podUID="fa68722f-5974-4f04-9b6e-46b8f479c300" Jan 13 20:57:24.528064 containerd[1537]: time="2025-01-13T20:57:24.527908667Z" level=error msg="encountered an error cleaning up failed sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.528064 containerd[1537]: time="2025-01-13T20:57:24.527961344Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.576243 containerd[1537]: time="2025-01-13T20:57:24.528195055Z" level=error msg="encountered an error cleaning up failed sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.576243 containerd[1537]: time="2025-01-13T20:57:24.528309584Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup 
network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.576243 containerd[1537]: time="2025-01-13T20:57:24.528398280Z" level=error msg="encountered an error cleaning up failed sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.576243 containerd[1537]: time="2025-01-13T20:57:24.528420266Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.576243 containerd[1537]: time="2025-01-13T20:57:24.528523523Z" level=error msg="encountered an error cleaning up failed sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.576243 containerd[1537]: time="2025-01-13T20:57:24.528543355Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox 
\"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.583037 kubelet[2787]: E0113 20:57:24.528106 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.583037 kubelet[2787]: E0113 20:57:24.528140 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:24.583037 kubelet[2787]: E0113 20:57:24.528255 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:24.583106 kubelet[2787]: E0113 20:57:24.528282 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" podUID="86a1a104-aa85-4352-b08c-36f54c7172c1" Jan 13 20:57:24.583106 kubelet[2787]: E0113 20:57:24.528677 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.583106 kubelet[2787]: E0113 20:57:24.528694 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:24.583188 kubelet[2787]: E0113 20:57:24.528703 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 
13 20:57:24.583188 kubelet[2787]: E0113 20:57:24.528726 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.583188 kubelet[2787]: E0113 20:57:24.528738 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:24.583188 kubelet[2787]: E0113 20:57:24.528748 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:24.583262 kubelet[2787]: E0113 20:57:24.528762 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9c7f5" podUID="2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9" Jan 13 20:57:24.583262 kubelet[2787]: E0113 20:57:24.528780 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:24.583262 kubelet[2787]: E0113 20:57:24.528791 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:24.583345 kubelet[2787]: E0113 20:57:24.528799 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:24.583345 kubelet[2787]: E0113 20:57:24.528879 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" podUID="6a232370-fd20-47eb-a2e3-9b0d1d786995" Jan 13 20:57:24.583345 kubelet[2787]: E0113 20:57:24.528906 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:24.630138 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92-shm.mount: Deactivated successfully. Jan 13 20:57:24.630214 systemd[1]: run-netns-cni\x2df87fef66\x2db318\x2d8dc0\x2d01f9\x2d02c7f616fa04.mount: Deactivated successfully. 
Jan 13 20:57:25.320760 kubelet[2787]: I0113 20:57:25.320725 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020" Jan 13 20:57:25.323944 containerd[1537]: time="2025-01-13T20:57:25.323921530Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\"" Jan 13 20:57:25.329021 containerd[1537]: time="2025-01-13T20:57:25.328999844Z" level=info msg="Ensure that sandbox c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020 in task-service has been cleanup successfully" Jan 13 20:57:25.329319 containerd[1537]: time="2025-01-13T20:57:25.329148041Z" level=info msg="TearDown network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" successfully" Jan 13 20:57:25.329319 containerd[1537]: time="2025-01-13T20:57:25.329165881Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" returns successfully" Jan 13 20:57:25.329731 containerd[1537]: time="2025-01-13T20:57:25.329647003Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" Jan 13 20:57:25.330675 containerd[1537]: time="2025-01-13T20:57:25.329696361Z" level=info msg="TearDown network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" successfully" Jan 13 20:57:25.330675 containerd[1537]: time="2025-01-13T20:57:25.329803975Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" returns successfully" Jan 13 20:57:25.331595 systemd[1]: run-netns-cni\x2d5ffc7528\x2d9b11\x2d61f9\x2dc12d\x2d3e0467322314.mount: Deactivated successfully. 
Jan 13 20:57:25.335195 containerd[1537]: time="2025-01-13T20:57:25.335167651Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:25.335258 containerd[1537]: time="2025-01-13T20:57:25.335239649Z" level=info msg="TearDown network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" successfully" Jan 13 20:57:25.335258 containerd[1537]: time="2025-01-13T20:57:25.335255766Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" returns successfully" Jan 13 20:57:25.336361 containerd[1537]: time="2025-01-13T20:57:25.336311327Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:25.336402 containerd[1537]: time="2025-01-13T20:57:25.336367561Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:25.336402 containerd[1537]: time="2025-01-13T20:57:25.336383361Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:25.352510 containerd[1537]: time="2025-01-13T20:57:25.352376270Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:4,}" Jan 13 20:57:25.392805 kubelet[2787]: I0113 20:57:25.390295 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc" Jan 13 20:57:25.392717 systemd[1]: run-netns-cni\x2d8809b612\x2d8ab1\x2dc802\x2dad2d\x2dd5e0c338a586.mount: Deactivated successfully. 
Jan 13 20:57:25.392944 containerd[1537]: time="2025-01-13T20:57:25.390801725Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\"" Jan 13 20:57:25.392944 containerd[1537]: time="2025-01-13T20:57:25.390908516Z" level=info msg="Ensure that sandbox f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc in task-service has been cleanup successfully" Jan 13 20:57:25.393960 containerd[1537]: time="2025-01-13T20:57:25.393715936Z" level=info msg="TearDown network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" successfully" Jan 13 20:57:25.393960 containerd[1537]: time="2025-01-13T20:57:25.393729807Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" returns successfully" Jan 13 20:57:25.394334 containerd[1537]: time="2025-01-13T20:57:25.394241522Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" Jan 13 20:57:25.394334 containerd[1537]: time="2025-01-13T20:57:25.394304938Z" level=info msg="TearDown network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" successfully" Jan 13 20:57:25.394334 containerd[1537]: time="2025-01-13T20:57:25.394311844Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" returns successfully" Jan 13 20:57:25.394877 containerd[1537]: time="2025-01-13T20:57:25.394652191Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:25.394877 containerd[1537]: time="2025-01-13T20:57:25.394689852Z" level=info msg="TearDown network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" successfully" Jan 13 20:57:25.394877 containerd[1537]: time="2025-01-13T20:57:25.394696784Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" 
returns successfully" Jan 13 20:57:25.395083 containerd[1537]: time="2025-01-13T20:57:25.395071999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:57:25.396170 kubelet[2787]: I0113 20:57:25.396132 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0" Jan 13 20:57:25.397831 containerd[1537]: time="2025-01-13T20:57:25.397751614Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\"" Jan 13 20:57:25.398095 containerd[1537]: time="2025-01-13T20:57:25.398009867Z" level=info msg="Ensure that sandbox 420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0 in task-service has been cleanup successfully" Jan 13 20:57:25.399098 containerd[1537]: time="2025-01-13T20:57:25.399087693Z" level=info msg="TearDown network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" successfully" Jan 13 20:57:25.399289 containerd[1537]: time="2025-01-13T20:57:25.399151425Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" returns successfully" Jan 13 20:57:25.399568 systemd[1]: run-netns-cni\x2d2f9b16e0\x2dc1d6\x2da93f\x2df4d2\x2d50c12680978c.mount: Deactivated successfully. 
Jan 13 20:57:25.401065 containerd[1537]: time="2025-01-13T20:57:25.401050213Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" Jan 13 20:57:25.401583 containerd[1537]: time="2025-01-13T20:57:25.401095003Z" level=info msg="TearDown network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" successfully" Jan 13 20:57:25.401583 containerd[1537]: time="2025-01-13T20:57:25.401101731Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" returns successfully" Jan 13 20:57:25.401583 containerd[1537]: time="2025-01-13T20:57:25.401298291Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:25.401583 containerd[1537]: time="2025-01-13T20:57:25.401334781Z" level=info msg="TearDown network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" successfully" Jan 13 20:57:25.401583 containerd[1537]: time="2025-01-13T20:57:25.401340768Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" returns successfully" Jan 13 20:57:25.401583 containerd[1537]: time="2025-01-13T20:57:25.401569869Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:25.402504 containerd[1537]: time="2025-01-13T20:57:25.402392367Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:25.402504 containerd[1537]: time="2025-01-13T20:57:25.402402701Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:25.403113 containerd[1537]: time="2025-01-13T20:57:25.403087864Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:4,}" Jan 13 20:57:25.405630 containerd[1537]: time="2025-01-13T20:57:25.405608022Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\"" Jan 13 20:57:25.405692 kubelet[2787]: I0113 20:57:25.405266 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d" Jan 13 20:57:25.406159 containerd[1537]: time="2025-01-13T20:57:25.406100899Z" level=info msg="Ensure that sandbox 9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d in task-service has been cleanup successfully" Jan 13 20:57:25.406574 containerd[1537]: time="2025-01-13T20:57:25.406487397Z" level=info msg="TearDown network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" successfully" Jan 13 20:57:25.406574 containerd[1537]: time="2025-01-13T20:57:25.406506019Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" returns successfully" Jan 13 20:57:25.411979 containerd[1537]: time="2025-01-13T20:57:25.408917504Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" Jan 13 20:57:25.411979 containerd[1537]: time="2025-01-13T20:57:25.408961706Z" level=info msg="TearDown network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" successfully" Jan 13 20:57:25.411979 containerd[1537]: time="2025-01-13T20:57:25.408967782Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" returns successfully" Jan 13 20:57:25.411979 containerd[1537]: time="2025-01-13T20:57:25.409750384Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:25.411979 
containerd[1537]: time="2025-01-13T20:57:25.409896558Z" level=info msg="TearDown network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" successfully" Jan 13 20:57:25.411979 containerd[1537]: time="2025-01-13T20:57:25.409905013Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" returns successfully" Jan 13 20:57:25.411979 containerd[1537]: time="2025-01-13T20:57:25.410701906Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:25.411979 containerd[1537]: time="2025-01-13T20:57:25.410742717Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:25.411979 containerd[1537]: time="2025-01-13T20:57:25.410748980Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:25.408496 systemd[1]: run-netns-cni\x2d86292b6d\x2d8c42\x2d461b\x2d6ed0\x2d6f2e8c230bc5.mount: Deactivated successfully. 
Jan 13 20:57:25.415491 containerd[1537]: time="2025-01-13T20:57:25.415350927Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:25.415491 containerd[1537]: time="2025-01-13T20:57:25.415398235Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:25.415491 containerd[1537]: time="2025-01-13T20:57:25.415404997Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:25.463420 containerd[1537]: time="2025-01-13T20:57:25.463177718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:5,}" Jan 13 20:57:25.465423 kubelet[2787]: I0113 20:57:25.465274 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51" Jan 13 20:57:25.467648 containerd[1537]: time="2025-01-13T20:57:25.467596825Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\"" Jan 13 20:57:25.469333 containerd[1537]: time="2025-01-13T20:57:25.469263826Z" level=info msg="Ensure that sandbox c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51 in task-service has been cleanup successfully" Jan 13 20:57:25.470114 containerd[1537]: time="2025-01-13T20:57:25.470101359Z" level=info msg="TearDown network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" successfully" Jan 13 20:57:25.470550 containerd[1537]: time="2025-01-13T20:57:25.470466236Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" returns successfully" Jan 13 20:57:25.475198 containerd[1537]: time="2025-01-13T20:57:25.475176638Z" level=info msg="StopPodSandbox for 
\"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" Jan 13 20:57:25.475760 containerd[1537]: time="2025-01-13T20:57:25.475353423Z" level=info msg="TearDown network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" successfully" Jan 13 20:57:25.475760 containerd[1537]: time="2025-01-13T20:57:25.475364547Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" returns successfully" Jan 13 20:57:25.480037 containerd[1537]: time="2025-01-13T20:57:25.480016847Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:25.480629 containerd[1537]: time="2025-01-13T20:57:25.480198390Z" level=info msg="TearDown network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" successfully" Jan 13 20:57:25.480963 containerd[1537]: time="2025-01-13T20:57:25.480884730Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" returns successfully" Jan 13 20:57:25.482488 containerd[1537]: time="2025-01-13T20:57:25.482475103Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:25.484635 containerd[1537]: time="2025-01-13T20:57:25.484621419Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:25.484936 containerd[1537]: time="2025-01-13T20:57:25.484717300Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns successfully" Jan 13 20:57:25.487016 containerd[1537]: time="2025-01-13T20:57:25.486972073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:4,}" Jan 13 20:57:25.487415 kubelet[2787]: I0113 20:57:25.487362 2787 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92" Jan 13 20:57:25.489076 containerd[1537]: time="2025-01-13T20:57:25.488974073Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\"" Jan 13 20:57:25.489912 containerd[1537]: time="2025-01-13T20:57:25.489894845Z" level=info msg="Ensure that sandbox d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92 in task-service has been cleanup successfully" Jan 13 20:57:25.491340 containerd[1537]: time="2025-01-13T20:57:25.491321289Z" level=info msg="TearDown network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" successfully" Jan 13 20:57:25.491340 containerd[1537]: time="2025-01-13T20:57:25.491335291Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" returns successfully" Jan 13 20:57:25.495067 containerd[1537]: time="2025-01-13T20:57:25.494946658Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" Jan 13 20:57:25.495067 containerd[1537]: time="2025-01-13T20:57:25.495003923Z" level=info msg="TearDown network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" successfully" Jan 13 20:57:25.495067 containerd[1537]: time="2025-01-13T20:57:25.495010761Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" returns successfully" Jan 13 20:57:25.496109 containerd[1537]: time="2025-01-13T20:57:25.495571257Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:25.496109 containerd[1537]: time="2025-01-13T20:57:25.495622423Z" level=info msg="TearDown network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" successfully" Jan 13 20:57:25.496109 
containerd[1537]: time="2025-01-13T20:57:25.495630124Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" returns successfully" Jan 13 20:57:25.497779 containerd[1537]: time="2025-01-13T20:57:25.497764390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:57:25.604741 containerd[1537]: time="2025-01-13T20:57:25.604659692Z" level=error msg="Failed to destroy network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.608070 containerd[1537]: time="2025-01-13T20:57:25.607682715Z" level=error msg="encountered an error cleaning up failed sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.608070 containerd[1537]: time="2025-01-13T20:57:25.607727544Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.608434 kubelet[2787]: E0113 20:57:25.607888 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.608434 kubelet[2787]: E0113 20:57:25.607932 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:25.608434 kubelet[2787]: E0113 20:57:25.607947 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:25.621050 containerd[1537]: time="2025-01-13T20:57:25.621023357Z" level=error msg="Failed to destroy network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.621244 containerd[1537]: time="2025-01-13T20:57:25.621226595Z" level=error msg="encountered an error cleaning up failed sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.621285 containerd[1537]: time="2025-01-13T20:57:25.621267769Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.621729 kubelet[2787]: E0113 20:57:25.621404 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.621729 kubelet[2787]: E0113 20:57:25.621443 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:25.621729 kubelet[2787]: E0113 20:57:25.621457 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:25.621816 kubelet[2787]: E0113 20:57:25.621484 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9c7f5" podUID="2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9" Jan 13 20:57:25.633533 systemd[1]: run-netns-cni\x2d3729f786\x2d4056\x2dfd95\x2d0c6e\x2d0462926bce83.mount: Deactivated successfully. Jan 13 20:57:25.633590 systemd[1]: run-netns-cni\x2dd73025c9\x2de2bb\x2dd338\x2de6b1\x2d78cfc79fb6b3.mount: Deactivated successfully. 
Jan 13 20:57:25.641512 containerd[1537]: time="2025-01-13T20:57:25.641467079Z" level=error msg="Failed to destroy network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.642219 containerd[1537]: time="2025-01-13T20:57:25.641798412Z" level=error msg="encountered an error cleaning up failed sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.642219 containerd[1537]: time="2025-01-13T20:57:25.641853702Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.642328 kubelet[2787]: E0113 20:57:25.641985 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.642328 kubelet[2787]: E0113 20:57:25.642020 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc 
= failed to setup network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:25.642328 kubelet[2787]: E0113 20:57:25.642035 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:25.642407 kubelet[2787]: E0113 20:57:25.642059 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" podUID="be5d865c-3359-4beb-8044-736dced88771" Jan 13 20:57:25.644526 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75-shm.mount: Deactivated successfully. 
Jan 13 20:57:25.660098 containerd[1537]: time="2025-01-13T20:57:25.658200903Z" level=error msg="Failed to destroy network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.660098 containerd[1537]: time="2025-01-13T20:57:25.660045596Z" level=error msg="encountered an error cleaning up failed sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.660098 containerd[1537]: time="2025-01-13T20:57:25.660085214Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.660134 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce-shm.mount: Deactivated successfully. 
Jan 13 20:57:25.661532 kubelet[2787]: E0113 20:57:25.661508 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.661609 kubelet[2787]: E0113 20:57:25.661598 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:25.661653 kubelet[2787]: E0113 20:57:25.661644 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:25.661714 kubelet[2787]: E0113 20:57:25.661702 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k7f2m" podUID="fa68722f-5974-4f04-9b6e-46b8f479c300" Jan 13 20:57:25.672439 kubelet[2787]: E0113 20:57:25.672411 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" podUID="6a232370-fd20-47eb-a2e3-9b0d1d786995" Jan 13 20:57:25.685312 containerd[1537]: time="2025-01-13T20:57:25.685284253Z" level=error msg="Failed to destroy network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.685424 containerd[1537]: time="2025-01-13T20:57:25.685299055Z" level=error msg="Failed to destroy network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.685600 containerd[1537]: time="2025-01-13T20:57:25.685584166Z" level=error msg="encountered an error cleaning up failed sandbox 
\"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.685636 containerd[1537]: time="2025-01-13T20:57:25.685621323Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.686308 kubelet[2787]: E0113 20:57:25.685744 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.686308 kubelet[2787]: E0113 20:57:25.685778 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:25.686308 kubelet[2787]: E0113 20:57:25.685790 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:25.686383 kubelet[2787]: E0113 20:57:25.685818 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:25.695525 kubelet[2787]: E0113 20:57:25.688204 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.695525 kubelet[2787]: E0113 20:57:25.688237 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:25.695525 kubelet[2787]: E0113 20:57:25.688252 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:25.695610 containerd[1537]: time="2025-01-13T20:57:25.687769381Z" level=error msg="encountered an error cleaning up failed sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.695610 containerd[1537]: time="2025-01-13T20:57:25.687813932Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:25.687538 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777-shm.mount: Deactivated successfully. 
Jan 13 20:57:25.695750 kubelet[2787]: E0113 20:57:25.688274 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" podUID="86a1a104-aa85-4352-b08c-36f54c7172c1" Jan 13 20:57:25.687599 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7-shm.mount: Deactivated successfully. Jan 13 20:57:26.511311 kubelet[2787]: I0113 20:57:26.511289 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75" Jan 13 20:57:26.512874 containerd[1537]: time="2025-01-13T20:57:26.512346173Z" level=info msg="StopPodSandbox for \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\"" Jan 13 20:57:26.531391 containerd[1537]: time="2025-01-13T20:57:26.531172158Z" level=info msg="Ensure that sandbox f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75 in task-service has been cleanup successfully" Jan 13 20:57:26.532768 systemd[1]: run-netns-cni\x2d7dda192e\x2db84e\x2def79\x2d5a97\x2d62c956e449ae.mount: Deactivated successfully. 
Jan 13 20:57:26.534386 kubelet[2787]: I0113 20:57:26.533882 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce" Jan 13 20:57:26.534440 containerd[1537]: time="2025-01-13T20:57:26.534196932Z" level=info msg="StopPodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\"" Jan 13 20:57:26.534636 containerd[1537]: time="2025-01-13T20:57:26.534528053Z" level=info msg="TearDown network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" successfully" Jan 13 20:57:26.534636 containerd[1537]: time="2025-01-13T20:57:26.534539705Z" level=info msg="StopPodSandbox for \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" returns successfully" Jan 13 20:57:26.535775 containerd[1537]: time="2025-01-13T20:57:26.535141299Z" level=info msg="Ensure that sandbox a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce in task-service has been cleanup successfully" Jan 13 20:57:26.535775 containerd[1537]: time="2025-01-13T20:57:26.535295673Z" level=info msg="TearDown network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" successfully" Jan 13 20:57:26.535775 containerd[1537]: time="2025-01-13T20:57:26.535307344Z" level=info msg="StopPodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" returns successfully" Jan 13 20:57:26.536624 containerd[1537]: time="2025-01-13T20:57:26.536041831Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\"" Jan 13 20:57:26.536624 containerd[1537]: time="2025-01-13T20:57:26.536095213Z" level=info msg="TearDown network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" successfully" Jan 13 20:57:26.536624 containerd[1537]: time="2025-01-13T20:57:26.536105212Z" level=info msg="StopPodSandbox for 
\"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" returns successfully" Jan 13 20:57:26.536624 containerd[1537]: time="2025-01-13T20:57:26.536143479Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\"" Jan 13 20:57:26.536624 containerd[1537]: time="2025-01-13T20:57:26.536185146Z" level=info msg="TearDown network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" successfully" Jan 13 20:57:26.536624 containerd[1537]: time="2025-01-13T20:57:26.536192193Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" returns successfully" Jan 13 20:57:26.537101 containerd[1537]: time="2025-01-13T20:57:26.536902564Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" Jan 13 20:57:26.537282 containerd[1537]: time="2025-01-13T20:57:26.537268318Z" level=info msg="TearDown network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" successfully" Jan 13 20:57:26.537309 containerd[1537]: time="2025-01-13T20:57:26.537280712Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" returns successfully" Jan 13 20:57:26.537487 containerd[1537]: time="2025-01-13T20:57:26.537199986Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" Jan 13 20:57:26.537529 containerd[1537]: time="2025-01-13T20:57:26.537516422Z" level=info msg="TearDown network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" successfully" Jan 13 20:57:26.537840 containerd[1537]: time="2025-01-13T20:57:26.537526971Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" returns successfully" Jan 13 20:57:26.538965 kubelet[2787]: I0113 20:57:26.538814 2787 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6" Jan 13 20:57:26.539159 containerd[1537]: time="2025-01-13T20:57:26.538069261Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:26.539258 containerd[1537]: time="2025-01-13T20:57:26.538500085Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:26.539338 containerd[1537]: time="2025-01-13T20:57:26.539329722Z" level=info msg="TearDown network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" successfully" Jan 13 20:57:26.539439 containerd[1537]: time="2025-01-13T20:57:26.539377379Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" returns successfully" Jan 13 20:57:26.539439 containerd[1537]: time="2025-01-13T20:57:26.539417730Z" level=info msg="TearDown network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" successfully" Jan 13 20:57:26.539439 containerd[1537]: time="2025-01-13T20:57:26.539423868Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" returns successfully" Jan 13 20:57:26.539762 containerd[1537]: time="2025-01-13T20:57:26.539743001Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:26.539921 containerd[1537]: time="2025-01-13T20:57:26.539854506Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:26.539921 containerd[1537]: time="2025-01-13T20:57:26.539863177Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:26.539921 containerd[1537]: time="2025-01-13T20:57:26.539886169Z" level=info msg="StopPodSandbox for 
\"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:26.540077 containerd[1537]: time="2025-01-13T20:57:26.540042409Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:26.540077 containerd[1537]: time="2025-01-13T20:57:26.540052240Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:26.549349 containerd[1537]: time="2025-01-13T20:57:26.540510319Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:26.549349 containerd[1537]: time="2025-01-13T20:57:26.540554159Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:26.549349 containerd[1537]: time="2025-01-13T20:57:26.540560330Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:26.549349 containerd[1537]: time="2025-01-13T20:57:26.540595871Z" level=info msg="StopPodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\"" Jan 13 20:57:26.549349 containerd[1537]: time="2025-01-13T20:57:26.543448030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:6,}" Jan 13 20:57:26.559771 containerd[1537]: time="2025-01-13T20:57:26.559675185Z" level=info msg="Ensure that sandbox ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6 in task-service has been cleanup successfully" Jan 13 20:57:26.560671 containerd[1537]: time="2025-01-13T20:57:26.559880466Z" level=info msg="TearDown network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" successfully" Jan 13 20:57:26.560671 containerd[1537]: 
time="2025-01-13T20:57:26.559890478Z" level=info msg="StopPodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" returns successfully" Jan 13 20:57:26.564375 containerd[1537]: time="2025-01-13T20:57:26.564351395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:5,}" Jan 13 20:57:26.579057 containerd[1537]: time="2025-01-13T20:57:26.578912794Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\"" Jan 13 20:57:26.579057 containerd[1537]: time="2025-01-13T20:57:26.578972122Z" level=info msg="TearDown network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" successfully" Jan 13 20:57:26.579057 containerd[1537]: time="2025-01-13T20:57:26.578979355Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" returns successfully" Jan 13 20:57:26.579727 containerd[1537]: time="2025-01-13T20:57:26.579619070Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" Jan 13 20:57:26.579727 containerd[1537]: time="2025-01-13T20:57:26.579661758Z" level=info msg="TearDown network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" successfully" Jan 13 20:57:26.579727 containerd[1537]: time="2025-01-13T20:57:26.579667941Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" returns successfully" Jan 13 20:57:26.580146 containerd[1537]: time="2025-01-13T20:57:26.580072365Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:26.580146 containerd[1537]: time="2025-01-13T20:57:26.580108575Z" level=info msg="TearDown network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" 
successfully" Jan 13 20:57:26.580146 containerd[1537]: time="2025-01-13T20:57:26.580114450Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" returns successfully" Jan 13 20:57:26.581003 containerd[1537]: time="2025-01-13T20:57:26.580842403Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:26.581003 containerd[1537]: time="2025-01-13T20:57:26.580880915Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:26.581003 containerd[1537]: time="2025-01-13T20:57:26.580887019Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:26.581539 containerd[1537]: time="2025-01-13T20:57:26.581378279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:5,}" Jan 13 20:57:26.581880 kubelet[2787]: I0113 20:57:26.581647 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7" Jan 13 20:57:26.583011 containerd[1537]: time="2025-01-13T20:57:26.582907956Z" level=info msg="StopPodSandbox for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\"" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.586352097Z" level=info msg="Ensure that sandbox 7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7 in task-service has been cleanup successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.587358104Z" level=info msg="TearDown network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.587368346Z" level=info msg="StopPodSandbox 
for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" returns successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.587853141Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\"" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.587909735Z" level=info msg="TearDown network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.587917111Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" returns successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.588862211Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.588923367Z" level=info msg="TearDown network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.588931145Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" returns successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.589157539Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.589268613Z" level=info msg="TearDown network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.589277070Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" returns successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.589606677Z" level=info msg="StopPodSandbox for 
\"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.589647679Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:26.589733 containerd[1537]: time="2025-01-13T20:57:26.589653434Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns successfully" Jan 13 20:57:26.596035 kubelet[2787]: I0113 20:57:26.589460 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777" Jan 13 20:57:26.596035 kubelet[2787]: I0113 20:57:26.593319 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747" Jan 13 20:57:26.596089 containerd[1537]: time="2025-01-13T20:57:26.590106167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:5,}" Jan 13 20:57:26.596089 containerd[1537]: time="2025-01-13T20:57:26.590554915Z" level=info msg="StopPodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\"" Jan 13 20:57:26.596729 containerd[1537]: time="2025-01-13T20:57:26.596703747Z" level=info msg="StopPodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\"" Jan 13 20:57:26.597459 containerd[1537]: time="2025-01-13T20:57:26.597437556Z" level=info msg="Ensure that sandbox 0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777 in task-service has been cleanup successfully" Jan 13 20:57:26.597733 containerd[1537]: time="2025-01-13T20:57:26.597670857Z" level=info msg="TearDown network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" successfully" Jan 13 20:57:26.597733 containerd[1537]: 
time="2025-01-13T20:57:26.597681699Z" level=info msg="StopPodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" returns successfully" Jan 13 20:57:26.599068 containerd[1537]: time="2025-01-13T20:57:26.598976255Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\"" Jan 13 20:57:26.599068 containerd[1537]: time="2025-01-13T20:57:26.599030833Z" level=info msg="TearDown network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" successfully" Jan 13 20:57:26.599068 containerd[1537]: time="2025-01-13T20:57:26.599038792Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" returns successfully" Jan 13 20:57:26.599424 containerd[1537]: time="2025-01-13T20:57:26.599339475Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" Jan 13 20:57:26.599424 containerd[1537]: time="2025-01-13T20:57:26.599384692Z" level=info msg="TearDown network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" successfully" Jan 13 20:57:26.599424 containerd[1537]: time="2025-01-13T20:57:26.599390877Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" returns successfully" Jan 13 20:57:26.600335 containerd[1537]: time="2025-01-13T20:57:26.600247122Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:26.600335 containerd[1537]: time="2025-01-13T20:57:26.600300627Z" level=info msg="TearDown network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" successfully" Jan 13 20:57:26.600335 containerd[1537]: time="2025-01-13T20:57:26.600307574Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" returns successfully" Jan 13 20:57:26.602472 containerd[1537]: 
time="2025-01-13T20:57:26.602433545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:57:26.606680 containerd[1537]: time="2025-01-13T20:57:26.606521596Z" level=info msg="Ensure that sandbox 7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747 in task-service has been cleanup successfully" Jan 13 20:57:26.606680 containerd[1537]: time="2025-01-13T20:57:26.606682092Z" level=info msg="TearDown network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" successfully" Jan 13 20:57:26.607030 containerd[1537]: time="2025-01-13T20:57:26.606694674Z" level=info msg="StopPodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" returns successfully" Jan 13 20:57:26.607030 containerd[1537]: time="2025-01-13T20:57:26.606958203Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\"" Jan 13 20:57:26.607030 containerd[1537]: time="2025-01-13T20:57:26.607002608Z" level=info msg="TearDown network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" successfully" Jan 13 20:57:26.607030 containerd[1537]: time="2025-01-13T20:57:26.607008466Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" returns successfully" Jan 13 20:57:26.607437 containerd[1537]: time="2025-01-13T20:57:26.607392141Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" Jan 13 20:57:26.607437 containerd[1537]: time="2025-01-13T20:57:26.607429292Z" level=info msg="TearDown network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" successfully" Jan 13 20:57:26.607437 containerd[1537]: time="2025-01-13T20:57:26.607434784Z" level=info msg="StopPodSandbox for 
\"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" returns successfully" Jan 13 20:57:26.607724 containerd[1537]: time="2025-01-13T20:57:26.607540715Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:26.607724 containerd[1537]: time="2025-01-13T20:57:26.607573746Z" level=info msg="TearDown network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" successfully" Jan 13 20:57:26.607724 containerd[1537]: time="2025-01-13T20:57:26.607578726Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" returns successfully" Jan 13 20:57:26.608220 containerd[1537]: time="2025-01-13T20:57:26.608128763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:57:26.632492 systemd[1]: run-netns-cni\x2da8ea66b9\x2d43ad\x2d945c\x2d5225\x2d7c6da89c5594.mount: Deactivated successfully. Jan 13 20:57:26.633920 systemd[1]: run-netns-cni\x2d581e733c\x2db475\x2d83f4\x2d351b\x2d7bbfb2fd29cf.mount: Deactivated successfully. Jan 13 20:57:26.633977 systemd[1]: run-netns-cni\x2da9b95f74\x2d052b\x2d3445\x2d8161\x2d286fe7ef23cc.mount: Deactivated successfully. Jan 13 20:57:26.634022 systemd[1]: run-netns-cni\x2dd084c626\x2d3479\x2d47ea\x2d0fec\x2d5baefc52a378.mount: Deactivated successfully. Jan 13 20:57:26.634057 systemd[1]: run-netns-cni\x2d82a7dc45\x2d114f\x2d075c\x2db56a\x2d95c0f91658c1.mount: Deactivated successfully. 
Jan 13 20:57:26.794218 containerd[1537]: time="2025-01-13T20:57:26.793221141Z" level=error msg="Failed to destroy network for sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.797351 containerd[1537]: time="2025-01-13T20:57:26.797331336Z" level=error msg="encountered an error cleaning up failed sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.797453 containerd[1537]: time="2025-01-13T20:57:26.797439496Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.797662 kubelet[2787]: E0113 20:57:26.797639 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.797704 kubelet[2787]: E0113 20:57:26.797677 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:26.797704 kubelet[2787]: E0113 20:57:26.797694 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:26.797747 kubelet[2787]: E0113 20:57:26.797721 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k7f2m" podUID="fa68722f-5974-4f04-9b6e-46b8f479c300" Jan 13 20:57:26.832680 containerd[1537]: time="2025-01-13T20:57:26.832646984Z" level=error msg="Failed to destroy network for sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 
20:57:26.832899 containerd[1537]: time="2025-01-13T20:57:26.832872684Z" level=error msg="encountered an error cleaning up failed sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.832942 containerd[1537]: time="2025-01-13T20:57:26.832917088Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.833296 kubelet[2787]: E0113 20:57:26.833047 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.833296 kubelet[2787]: E0113 20:57:26.833096 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:26.833296 kubelet[2787]: E0113 20:57:26.833111 
2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:26.833401 kubelet[2787]: E0113 20:57:26.833138 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" podUID="86a1a104-aa85-4352-b08c-36f54c7172c1" Jan 13 20:57:26.846169 containerd[1537]: time="2025-01-13T20:57:26.846133991Z" level=error msg="Failed to destroy network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.846398 containerd[1537]: time="2025-01-13T20:57:26.846329312Z" level=error msg="encountered an error cleaning up failed sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.846398 containerd[1537]: time="2025-01-13T20:57:26.846367016Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.846715 kubelet[2787]: E0113 20:57:26.846494 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.846715 kubelet[2787]: E0113 20:57:26.846530 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:26.846715 kubelet[2787]: E0113 20:57:26.846543 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:26.846800 kubelet[2787]: E0113 20:57:26.846567 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" podUID="be5d865c-3359-4beb-8044-736dced88771" Jan 13 20:57:26.850495 containerd[1537]: time="2025-01-13T20:57:26.850400436Z" level=error msg="Failed to destroy network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.850819 containerd[1537]: time="2025-01-13T20:57:26.850674140Z" level=error msg="encountered an error cleaning up failed sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.850819 containerd[1537]: time="2025-01-13T20:57:26.850711398Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.850940 kubelet[2787]: E0113 20:57:26.850918 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.850988 kubelet[2787]: E0113 20:57:26.850954 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:26.850988 kubelet[2787]: E0113 20:57:26.850979 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:26.851043 kubelet[2787]: E0113 20:57:26.851008 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9c7f5" podUID="2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9" Jan 13 20:57:26.858672 containerd[1537]: time="2025-01-13T20:57:26.858528794Z" level=error msg="Failed to destroy network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.859012 containerd[1537]: time="2025-01-13T20:57:26.858985627Z" level=error msg="encountered an error cleaning up failed sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.859062 containerd[1537]: time="2025-01-13T20:57:26.859025241Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.859695 kubelet[2787]: E0113 20:57:26.859671 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.860653 kubelet[2787]: E0113 20:57:26.859704 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:26.860653 kubelet[2787]: E0113 20:57:26.859717 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:26.860653 kubelet[2787]: E0113 20:57:26.859741 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" podUID="6a232370-fd20-47eb-a2e3-9b0d1d786995" Jan 13 20:57:26.860816 containerd[1537]: time="2025-01-13T20:57:26.859881818Z" level=error msg="Failed to destroy network for sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.860816 containerd[1537]: time="2025-01-13T20:57:26.860431259Z" level=error msg="encountered an error cleaning up failed sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.860816 containerd[1537]: time="2025-01-13T20:57:26.860462970Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.860894 kubelet[2787]: E0113 20:57:26.860549 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:26.860894 kubelet[2787]: E0113 20:57:26.860572 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:26.860894 kubelet[2787]: E0113 20:57:26.860604 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:26.860978 kubelet[2787]: E0113 20:57:26.860633 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jkzvs" 
podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:27.154601 containerd[1537]: time="2025-01-13T20:57:27.153602293Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:27.188778 containerd[1537]: time="2025-01-13T20:57:27.188737738Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 13 20:57:27.198710 containerd[1537]: time="2025-01-13T20:57:27.198667025Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:27.209384 containerd[1537]: time="2025-01-13T20:57:27.208699923Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:27.221695 containerd[1537]: time="2025-01-13T20:57:27.221660362Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 7.028352415s" Jan 13 20:57:27.221805 containerd[1537]: time="2025-01-13T20:57:27.221795112Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 13 20:57:27.434450 containerd[1537]: time="2025-01-13T20:57:27.434423497Z" level=info msg="CreateContainer within sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 20:57:27.555337 containerd[1537]: 
time="2025-01-13T20:57:27.555304605Z" level=info msg="CreateContainer within sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\"" Jan 13 20:57:27.572333 containerd[1537]: time="2025-01-13T20:57:27.572303029Z" level=info msg="StartContainer for \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\"" Jan 13 20:57:27.606770 kubelet[2787]: I0113 20:57:27.606735 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca" Jan 13 20:57:27.607836 containerd[1537]: time="2025-01-13T20:57:27.607806230Z" level=info msg="StopPodSandbox for \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\"" Jan 13 20:57:27.608105 containerd[1537]: time="2025-01-13T20:57:27.608037820Z" level=info msg="Ensure that sandbox e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca in task-service has been cleanup successfully" Jan 13 20:57:27.608460 containerd[1537]: time="2025-01-13T20:57:27.608416517Z" level=info msg="TearDown network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\" successfully" Jan 13 20:57:27.608460 containerd[1537]: time="2025-01-13T20:57:27.608426108Z" level=info msg="StopPodSandbox for \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\" returns successfully" Jan 13 20:57:27.608739 containerd[1537]: time="2025-01-13T20:57:27.608721761Z" level=info msg="StopPodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\"" Jan 13 20:57:27.608779 containerd[1537]: time="2025-01-13T20:57:27.608771374Z" level=info msg="TearDown network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" successfully" Jan 13 20:57:27.608803 containerd[1537]: time="2025-01-13T20:57:27.608779505Z" level=info 
msg="StopPodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" returns successfully" Jan 13 20:57:27.609321 containerd[1537]: time="2025-01-13T20:57:27.609245358Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\"" Jan 13 20:57:27.609610 containerd[1537]: time="2025-01-13T20:57:27.609553705Z" level=info msg="TearDown network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" successfully" Jan 13 20:57:27.609610 containerd[1537]: time="2025-01-13T20:57:27.609563607Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" returns successfully" Jan 13 20:57:27.610431 containerd[1537]: time="2025-01-13T20:57:27.609971813Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" Jan 13 20:57:27.610431 containerd[1537]: time="2025-01-13T20:57:27.610014269Z" level=info msg="TearDown network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" successfully" Jan 13 20:57:27.610431 containerd[1537]: time="2025-01-13T20:57:27.610020461Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" returns successfully" Jan 13 20:57:27.611375 containerd[1537]: time="2025-01-13T20:57:27.610734947Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:27.611375 containerd[1537]: time="2025-01-13T20:57:27.610923800Z" level=info msg="TearDown network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" successfully" Jan 13 20:57:27.611375 containerd[1537]: time="2025-01-13T20:57:27.610931936Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" returns successfully" Jan 13 20:57:27.611375 containerd[1537]: time="2025-01-13T20:57:27.611154918Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:57:27.612310 kubelet[2787]: I0113 20:57:27.612294 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac" Jan 13 20:57:27.612594 containerd[1537]: time="2025-01-13T20:57:27.612558749Z" level=info msg="StopPodSandbox for \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\"" Jan 13 20:57:27.614018 kubelet[2787]: I0113 20:57:27.613807 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b" Jan 13 20:57:27.614593 containerd[1537]: time="2025-01-13T20:57:27.614366813Z" level=info msg="Ensure that sandbox 2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac in task-service has been cleanup successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.614697441Z" level=info msg="StopPodSandbox for \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.614809494Z" level=info msg="Ensure that sandbox fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b in task-service has been cleanup successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.615547346Z" level=info msg="TearDown network for sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.615559669Z" level=info msg="StopPodSandbox for \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.615938622Z" level=info msg="TearDown network for sandbox 
\"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.615946448Z" level=info msg="StopPodSandbox for \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616000736Z" level=info msg="StopPodSandbox for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616057315Z" level=info msg="StopPodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616064502Z" level=info msg="TearDown network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616072209Z" level=info msg="StopPodSandbox for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616096923Z" level=info msg="TearDown network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616123359Z" level=info msg="StopPodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616325118Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616360703Z" level=info msg="TearDown network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616366831Z" level=info msg="StopPodSandbox for 
\"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616417910Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616448368Z" level=info msg="TearDown network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616453450Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616630221Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616710167Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616770378Z" level=info msg="TearDown network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616778365Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616775022Z" level=info msg="TearDown network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616798147Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.616957089Z" level=info msg="StopPodSandbox for 
\"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617009057Z" level=info msg="TearDown network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617018399Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617045578Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617079573Z" level=info msg="TearDown network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617085139Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617330175Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617430126Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617447662Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns successfully" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.617725228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:6,}" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.618006971Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:57:27.622188 containerd[1537]: time="2025-01-13T20:57:27.618252952Z" level=info msg="StopPodSandbox for \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\"" Jan 13 20:57:27.623410 kubelet[2787]: I0113 20:57:27.617579 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596" Jan 13 20:57:27.623410 kubelet[2787]: I0113 20:57:27.621067 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.618371620Z" level=info msg="Ensure that sandbox baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596 in task-service has been cleanup successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.618496645Z" level=info msg="TearDown network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.618505125Z" level=info msg="StopPodSandbox for \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.618874517Z" level=info msg="StopPodSandbox for \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619004420Z" level=info msg="TearDown network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619012814Z" level=info msg="StopPodSandbox for 
\"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619258962Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619297566Z" level=info msg="TearDown network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619306247Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619434749Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619485700Z" level=info msg="TearDown network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619495816Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.619775390Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.620045343Z" level=info msg="TearDown network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.620055226Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.620213620Z" level=info msg="StopPodSandbox for 
\"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.620429649Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.620437872Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.620637082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:6,}" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.621469477Z" level=info msg="StopPodSandbox for \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.621687270Z" level=info msg="Ensure that sandbox 9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a in task-service has been cleanup successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.621845191Z" level=info msg="TearDown network for sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.621854547Z" level=info msg="StopPodSandbox for \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.622067711Z" level=info msg="StopPodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.622112198Z" level=info msg="TearDown network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" successfully" Jan 13 20:57:27.623600 containerd[1537]: 
time="2025-01-13T20:57:27.622120000Z" level=info msg="StopPodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623076864Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623125662Z" level=info msg="TearDown network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623154502Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623297438Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623333093Z" level=info msg="TearDown network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623338870Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" returns successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623459981Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623511149Z" level=info msg="TearDown network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" successfully" Jan 13 20:57:27.623600 containerd[1537]: time="2025-01-13T20:57:27.623517747Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" returns successfully" Jan 13 20:57:27.627298 kubelet[2787]: I0113 
20:57:27.623487 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.623738184Z" level=info msg="StopPodSandbox for \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\"" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.624639836Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.624698602Z" level=info msg="Ensure that sandbox 2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be in task-service has been cleanup successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.624777610Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.624785278Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625012380Z" level=info msg="TearDown network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\" successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625023731Z" level=info msg="StopPodSandbox for \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\" returns successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625478838Z" level=info msg="StopPodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\"" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625557726Z" level=info msg="TearDown network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" successfully" Jan 13 
20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625602562Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625644936Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625678073Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625662885Z" level=info msg="StopPodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" returns successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.625965493Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\"" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626120782Z" level=info msg="TearDown network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626172717Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" returns successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626255885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:7,}" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626436761Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626486298Z" level=info msg="TearDown network for sandbox 
\"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626497627Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" returns successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626748625Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626799300Z" level=info msg="TearDown network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626806723Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" returns successfully" Jan 13 20:57:27.627355 containerd[1537]: time="2025-01-13T20:57:27.626953239Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:27.630379 containerd[1537]: time="2025-01-13T20:57:27.627428188Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:27.630379 containerd[1537]: time="2025-01-13T20:57:27.627441091Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:27.630379 containerd[1537]: time="2025-01-13T20:57:27.630001626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:6,}" Jan 13 20:57:27.631287 systemd[1]: run-netns-cni\x2d18244882\x2d373b\x2d393c\x2d71be\x2d6a9cd9fecb57.mount: Deactivated successfully. 
Jan 13 20:57:27.631380 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be-shm.mount: Deactivated successfully. Jan 13 20:57:27.631419 systemd[1]: run-netns-cni\x2d98db6b98\x2d8117\x2da2bb\x2d68ad\x2d4165f6be0331.mount: Deactivated successfully. Jan 13 20:57:27.631560 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596-shm.mount: Deactivated successfully. Jan 13 20:57:27.631646 systemd[1]: run-netns-cni\x2dea8d93f5\x2d1041\x2d60bf\x2d647c\x2de617b89ec23d.mount: Deactivated successfully. Jan 13 20:57:27.631685 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a-shm.mount: Deactivated successfully. Jan 13 20:57:27.631746 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2088711286.mount: Deactivated successfully. Jan 13 20:57:27.843525 containerd[1537]: time="2025-01-13T20:57:27.843417727Z" level=error msg="Failed to destroy network for sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.844696 containerd[1537]: time="2025-01-13T20:57:27.844679053Z" level=error msg="encountered an error cleaning up failed sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.844805 containerd[1537]: time="2025-01-13T20:57:27.844787438Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.846120 kubelet[2787]: E0113 20:57:27.845012 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.846120 kubelet[2787]: E0113 20:57:27.845066 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:27.846120 kubelet[2787]: E0113 20:57:27.845081 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9c7f5" Jan 13 20:57:27.846751 kubelet[2787]: E0113 20:57:27.845105 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-9c7f5_kube-system(2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9c7f5" podUID="2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9" Jan 13 20:57:27.856639 containerd[1537]: time="2025-01-13T20:57:27.856550508Z" level=error msg="Failed to destroy network for sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.856995 containerd[1537]: time="2025-01-13T20:57:27.856897388Z" level=error msg="encountered an error cleaning up failed sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.856995 containerd[1537]: time="2025-01-13T20:57:27.856932327Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" Jan 13 20:57:27.857751 kubelet[2787]: E0113 20:57:27.857058 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.857751 kubelet[2787]: E0113 20:57:27.857104 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:27.857751 kubelet[2787]: E0113 20:57:27.857117 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jkzvs" Jan 13 20:57:27.857868 kubelet[2787]: E0113 20:57:27.857144 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jkzvs_calico-system(156fa3f2-d364-43dd-86de-274512f7d213)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jkzvs" podUID="156fa3f2-d364-43dd-86de-274512f7d213" Jan 13 20:57:27.858314 containerd[1537]: time="2025-01-13T20:57:27.858231470Z" level=error msg="Failed to destroy network for sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.860140 containerd[1537]: time="2025-01-13T20:57:27.860025569Z" level=error msg="encountered an error cleaning up failed sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.860140 containerd[1537]: time="2025-01-13T20:57:27.860064654Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.864777 containerd[1537]: time="2025-01-13T20:57:27.863273161Z" level=error msg="Failed to destroy network for sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.864777 containerd[1537]: time="2025-01-13T20:57:27.863589864Z" level=error msg="encountered an error cleaning up failed sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.864777 containerd[1537]: time="2025-01-13T20:57:27.863626462Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.864982 kubelet[2787]: E0113 20:57:27.860182 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.864982 kubelet[2787]: E0113 20:57:27.860222 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:27.864982 kubelet[2787]: E0113 20:57:27.860236 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" Jan 13 20:57:27.865080 kubelet[2787]: E0113 20:57:27.860273 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-67ltf_calico-apiserver(6a232370-fd20-47eb-a2e3-9b0d1d786995)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" podUID="6a232370-fd20-47eb-a2e3-9b0d1d786995" Jan 13 20:57:27.865080 kubelet[2787]: E0113 20:57:27.863746 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.865080 kubelet[2787]: E0113 20:57:27.863779 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:27.865154 kubelet[2787]: E0113 20:57:27.863805 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" Jan 13 20:57:27.865154 kubelet[2787]: E0113 20:57:27.863857 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867f57c995-7rrvw_calico-apiserver(86a1a104-aa85-4352-b08c-36f54c7172c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" podUID="86a1a104-aa85-4352-b08c-36f54c7172c1" Jan 13 20:57:27.865879 containerd[1537]: time="2025-01-13T20:57:27.865787664Z" level=error msg="Failed to destroy network for sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.866116 containerd[1537]: time="2025-01-13T20:57:27.866032087Z" level=error msg="encountered an error cleaning up failed sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.866116 containerd[1537]: time="2025-01-13T20:57:27.866065243Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.866200 kubelet[2787]: E0113 20:57:27.866179 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.866227 kubelet[2787]: E0113 20:57:27.866213 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:27.866267 kubelet[2787]: E0113 20:57:27.866232 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k7f2m" Jan 13 20:57:27.866267 kubelet[2787]: E0113 20:57:27.866257 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k7f2m_kube-system(fa68722f-5974-4f04-9b6e-46b8f479c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k7f2m" podUID="fa68722f-5974-4f04-9b6e-46b8f479c300" Jan 13 20:57:27.867732 containerd[1537]: time="2025-01-13T20:57:27.867282480Z" level=error msg="Failed to destroy network for sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.867938 containerd[1537]: time="2025-01-13T20:57:27.867923996Z" level=error msg="encountered an error cleaning up failed sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.868196 containerd[1537]: time="2025-01-13T20:57:27.868026577Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.868422 kubelet[2787]: E0113 20:57:27.868329 2787 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:57:27.868422 kubelet[2787]: E0113 20:57:27.868359 2787 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:27.868422 kubelet[2787]: E0113 20:57:27.868372 2787 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" Jan 13 20:57:27.868489 kubelet[2787]: E0113 20:57:27.868399 2787 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6df8b96c48-4j6ml_calico-system(be5d865c-3359-4beb-8044-736dced88771)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" podUID="be5d865c-3359-4beb-8044-736dced88771" Jan 13 20:57:27.976971 systemd[1]: Started cri-containerd-746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111.scope - libcontainer container 746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111. Jan 13 20:57:28.050265 containerd[1537]: time="2025-01-13T20:57:28.050194886Z" level=info msg="StartContainer for \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\" returns successfully" Jan 13 20:57:28.377850 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 13 20:57:28.378432 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 13 20:57:28.632463 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a-shm.mount: Deactivated successfully. 
Jan 13 20:57:28.632552 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb-shm.mount: Deactivated successfully. Jan 13 20:57:28.632607 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd-shm.mount: Deactivated successfully. Jan 13 20:57:28.632652 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42-shm.mount: Deactivated successfully. Jan 13 20:57:28.688793 kubelet[2787]: I0113 20:57:28.688262 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd" Jan 13 20:57:28.689215 containerd[1537]: time="2025-01-13T20:57:28.688891036Z" level=info msg="StopPodSandbox for \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\"" Jan 13 20:57:28.689215 containerd[1537]: time="2025-01-13T20:57:28.689046889Z" level=info msg="Ensure that sandbox 27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd in task-service has been cleanup successfully" Jan 13 20:57:28.693425 containerd[1537]: time="2025-01-13T20:57:28.693356441Z" level=info msg="TearDown network for sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\" successfully" Jan 13 20:57:28.693425 containerd[1537]: time="2025-01-13T20:57:28.693390204Z" level=info msg="StopPodSandbox for \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\" returns successfully" Jan 13 20:57:28.693928 containerd[1537]: time="2025-01-13T20:57:28.693620828Z" level=info msg="StopPodSandbox for \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\"" Jan 13 20:57:28.693928 containerd[1537]: time="2025-01-13T20:57:28.693686686Z" level=info msg="TearDown network for sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\" successfully" Jan 13 
20:57:28.693928 containerd[1537]: time="2025-01-13T20:57:28.693697062Z" level=info msg="StopPodSandbox for \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\" returns successfully" Jan 13 20:57:28.693928 containerd[1537]: time="2025-01-13T20:57:28.693869062Z" level=info msg="StopPodSandbox for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\"" Jan 13 20:57:28.693928 containerd[1537]: time="2025-01-13T20:57:28.693911131Z" level=info msg="TearDown network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" successfully" Jan 13 20:57:28.693928 containerd[1537]: time="2025-01-13T20:57:28.693917825Z" level=info msg="StopPodSandbox for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" returns successfully" Jan 13 20:57:28.694816 containerd[1537]: time="2025-01-13T20:57:28.694324470Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\"" Jan 13 20:57:28.694816 containerd[1537]: time="2025-01-13T20:57:28.694367235Z" level=info msg="TearDown network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" successfully" Jan 13 20:57:28.694816 containerd[1537]: time="2025-01-13T20:57:28.694373978Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" returns successfully" Jan 13 20:57:28.694541 systemd[1]: run-netns-cni\x2d9d4225b1\x2d3125\x2d373b\x2d5bec\x2da2ee77aa5ba6.mount: Deactivated successfully. 
Jan 13 20:57:28.695716 containerd[1537]: time="2025-01-13T20:57:28.695695218Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" Jan 13 20:57:28.697159 containerd[1537]: time="2025-01-13T20:57:28.696850769Z" level=info msg="TearDown network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" successfully" Jan 13 20:57:28.697159 containerd[1537]: time="2025-01-13T20:57:28.696864974Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" returns successfully" Jan 13 20:57:28.697811 containerd[1537]: time="2025-01-13T20:57:28.697705701Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:28.698097 kubelet[2787]: I0113 20:57:28.698080 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb" Jan 13 20:57:28.698199 containerd[1537]: time="2025-01-13T20:57:28.698186330Z" level=info msg="TearDown network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" successfully" Jan 13 20:57:28.698544 containerd[1537]: time="2025-01-13T20:57:28.698522957Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" returns successfully" Jan 13 20:57:28.698792 containerd[1537]: time="2025-01-13T20:57:28.698781103Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:28.699244 containerd[1537]: time="2025-01-13T20:57:28.699174052Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:28.699500 containerd[1537]: time="2025-01-13T20:57:28.699487507Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns 
successfully" Jan 13 20:57:28.699573 containerd[1537]: time="2025-01-13T20:57:28.699407443Z" level=info msg="StopPodSandbox for \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\"" Jan 13 20:57:28.699914 containerd[1537]: time="2025-01-13T20:57:28.699901496Z" level=info msg="Ensure that sandbox 568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb in task-service has been cleanup successfully" Jan 13 20:57:28.700572 containerd[1537]: time="2025-01-13T20:57:28.700285707Z" level=info msg="TearDown network for sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\" successfully" Jan 13 20:57:28.700572 containerd[1537]: time="2025-01-13T20:57:28.700296897Z" level=info msg="StopPodSandbox for \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\" returns successfully" Jan 13 20:57:28.700572 containerd[1537]: time="2025-01-13T20:57:28.700024616Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:7,}" Jan 13 20:57:28.704712 systemd[1]: run-netns-cni\x2d4bc0f66d\x2d3369\x2d700d\x2d7b6c\x2d2200d4dbf2dc.mount: Deactivated successfully. 
Jan 13 20:57:28.705903 containerd[1537]: time="2025-01-13T20:57:28.705076761Z" level=info msg="StopPodSandbox for \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\"" Jan 13 20:57:28.708929 kubelet[2787]: I0113 20:57:28.708607 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42" Jan 13 20:57:28.710795 containerd[1537]: time="2025-01-13T20:57:28.709700358Z" level=info msg="TearDown network for sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\" successfully" Jan 13 20:57:28.710985 containerd[1537]: time="2025-01-13T20:57:28.708995382Z" level=info msg="StopPodSandbox for \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\"" Jan 13 20:57:28.713097 containerd[1537]: time="2025-01-13T20:57:28.712999075Z" level=info msg="StopPodSandbox for \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\" returns successfully" Jan 13 20:57:28.713869 containerd[1537]: time="2025-01-13T20:57:28.713727928Z" level=info msg="Ensure that sandbox 0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42 in task-service has been cleanup successfully" Jan 13 20:57:28.715989 containerd[1537]: time="2025-01-13T20:57:28.715872262Z" level=info msg="TearDown network for sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\" successfully" Jan 13 20:57:28.715989 containerd[1537]: time="2025-01-13T20:57:28.715892086Z" level=info msg="StopPodSandbox for \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\" returns successfully" Jan 13 20:57:28.727437 containerd[1537]: time="2025-01-13T20:57:28.725585522Z" level=info msg="StopPodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\"" Jan 13 20:57:28.727437 containerd[1537]: time="2025-01-13T20:57:28.725677373Z" level=info msg="TearDown network for sandbox 
\"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" successfully" Jan 13 20:57:28.727437 containerd[1537]: time="2025-01-13T20:57:28.725686376Z" level=info msg="StopPodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" returns successfully" Jan 13 20:57:28.726188 systemd[1]: run-netns-cni\x2df8980cc6\x2df045\x2d4c38\x2ddd70\x2d11f3452d2223.mount: Deactivated successfully. Jan 13 20:57:28.733639 containerd[1537]: time="2025-01-13T20:57:28.733532705Z" level=info msg="StopPodSandbox for \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\"" Jan 13 20:57:28.733639 containerd[1537]: time="2025-01-13T20:57:28.733597972Z" level=info msg="TearDown network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\" successfully" Jan 13 20:57:28.733639 containerd[1537]: time="2025-01-13T20:57:28.733605180Z" level=info msg="StopPodSandbox for \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\" returns successfully" Jan 13 20:57:28.736008 containerd[1537]: time="2025-01-13T20:57:28.735855532Z" level=info msg="StopPodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\"" Jan 13 20:57:28.736008 containerd[1537]: time="2025-01-13T20:57:28.735915841Z" level=info msg="TearDown network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" successfully" Jan 13 20:57:28.736008 containerd[1537]: time="2025-01-13T20:57:28.735922730Z" level=info msg="StopPodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" returns successfully" Jan 13 20:57:28.736008 containerd[1537]: time="2025-01-13T20:57:28.735946497Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\"" Jan 13 20:57:28.736008 containerd[1537]: time="2025-01-13T20:57:28.735976143Z" level=info msg="TearDown network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" 
successfully" Jan 13 20:57:28.736008 containerd[1537]: time="2025-01-13T20:57:28.735981465Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" returns successfully" Jan 13 20:57:28.737444 containerd[1537]: time="2025-01-13T20:57:28.737427555Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\"" Jan 13 20:57:28.738138 containerd[1537]: time="2025-01-13T20:57:28.738124012Z" level=info msg="TearDown network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" successfully" Jan 13 20:57:28.738408 containerd[1537]: time="2025-01-13T20:57:28.738216389Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" returns successfully" Jan 13 20:57:28.738408 containerd[1537]: time="2025-01-13T20:57:28.738281573Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" Jan 13 20:57:28.738408 containerd[1537]: time="2025-01-13T20:57:28.738330743Z" level=info msg="TearDown network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" successfully" Jan 13 20:57:28.738408 containerd[1537]: time="2025-01-13T20:57:28.738337442Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" returns successfully" Jan 13 20:57:28.739106 containerd[1537]: time="2025-01-13T20:57:28.739093785Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:28.739202 containerd[1537]: time="2025-01-13T20:57:28.739193142Z" level=info msg="TearDown network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" successfully" Jan 13 20:57:28.739240 containerd[1537]: time="2025-01-13T20:57:28.739233195Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" returns 
successfully" Jan 13 20:57:28.739348 containerd[1537]: time="2025-01-13T20:57:28.739338479Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" Jan 13 20:57:28.739426 containerd[1537]: time="2025-01-13T20:57:28.739416529Z" level=info msg="TearDown network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" successfully" Jan 13 20:57:28.739476 containerd[1537]: time="2025-01-13T20:57:28.739466451Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" returns successfully" Jan 13 20:57:28.742070 containerd[1537]: time="2025-01-13T20:57:28.742051255Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:28.742570 containerd[1537]: time="2025-01-13T20:57:28.742187858Z" level=info msg="TearDown network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" successfully" Jan 13 20:57:28.742570 containerd[1537]: time="2025-01-13T20:57:28.742228725Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" returns successfully" Jan 13 20:57:28.742570 containerd[1537]: time="2025-01-13T20:57:28.742285426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:57:28.743322 containerd[1537]: time="2025-01-13T20:57:28.743301830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:57:28.754801 kubelet[2787]: I0113 20:57:28.754198 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a" Jan 13 20:57:28.757726 containerd[1537]: 
time="2025-01-13T20:57:28.757704701Z" level=info msg="StopPodSandbox for \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\"" Jan 13 20:57:28.758889 containerd[1537]: time="2025-01-13T20:57:28.758869831Z" level=info msg="Ensure that sandbox 1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a in task-service has been cleanup successfully" Jan 13 20:57:28.760858 containerd[1537]: time="2025-01-13T20:57:28.760833849Z" level=info msg="TearDown network for sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\" successfully" Jan 13 20:57:28.760963 containerd[1537]: time="2025-01-13T20:57:28.760952978Z" level=info msg="StopPodSandbox for \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\" returns successfully" Jan 13 20:57:28.763169 containerd[1537]: time="2025-01-13T20:57:28.763140738Z" level=info msg="StopPodSandbox for \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\"" Jan 13 20:57:28.763472 containerd[1537]: time="2025-01-13T20:57:28.763338047Z" level=info msg="TearDown network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\" successfully" Jan 13 20:57:28.763472 containerd[1537]: time="2025-01-13T20:57:28.763350125Z" level=info msg="StopPodSandbox for \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\" returns successfully" Jan 13 20:57:28.765092 containerd[1537]: time="2025-01-13T20:57:28.765079526Z" level=info msg="StopPodSandbox for \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\"" Jan 13 20:57:28.765208 containerd[1537]: time="2025-01-13T20:57:28.765198861Z" level=info msg="TearDown network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" successfully" Jan 13 20:57:28.765256 containerd[1537]: time="2025-01-13T20:57:28.765248303Z" level=info msg="StopPodSandbox for \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" returns successfully" Jan 13 20:57:28.767176 
containerd[1537]: time="2025-01-13T20:57:28.767031665Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\"" Jan 13 20:57:28.767176 containerd[1537]: time="2025-01-13T20:57:28.767097219Z" level=info msg="TearDown network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" successfully" Jan 13 20:57:28.767176 containerd[1537]: time="2025-01-13T20:57:28.767104842Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" returns successfully" Jan 13 20:57:28.771540 containerd[1537]: time="2025-01-13T20:57:28.767426201Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" Jan 13 20:57:28.771540 containerd[1537]: time="2025-01-13T20:57:28.768084035Z" level=info msg="TearDown network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" successfully" Jan 13 20:57:28.771540 containerd[1537]: time="2025-01-13T20:57:28.768093909Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" returns successfully" Jan 13 20:57:28.772197 containerd[1537]: time="2025-01-13T20:57:28.771786267Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:28.772197 containerd[1537]: time="2025-01-13T20:57:28.771880440Z" level=info msg="TearDown network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" successfully" Jan 13 20:57:28.772197 containerd[1537]: time="2025-01-13T20:57:28.771888723Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" returns successfully" Jan 13 20:57:28.773257 containerd[1537]: time="2025-01-13T20:57:28.772762455Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:28.773257 containerd[1537]: 
time="2025-01-13T20:57:28.773042453Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:28.773257 containerd[1537]: time="2025-01-13T20:57:28.773054306Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:28.774152 kubelet[2787]: I0113 20:57:28.774044 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43" Jan 13 20:57:28.776248 containerd[1537]: time="2025-01-13T20:57:28.776049218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:7,}" Jan 13 20:57:28.779566 containerd[1537]: time="2025-01-13T20:57:28.776277673Z" level=info msg="StopPodSandbox for \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\"" Jan 13 20:57:28.779566 containerd[1537]: time="2025-01-13T20:57:28.776390402Z" level=info msg="Ensure that sandbox 9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43 in task-service has been cleanup successfully" Jan 13 20:57:28.784278 containerd[1537]: time="2025-01-13T20:57:28.784257594Z" level=info msg="TearDown network for sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\" successfully" Jan 13 20:57:28.784384 containerd[1537]: time="2025-01-13T20:57:28.784375624Z" level=info msg="StopPodSandbox for \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\" returns successfully" Jan 13 20:57:28.785281 containerd[1537]: time="2025-01-13T20:57:28.785268405Z" level=info msg="StopPodSandbox for \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\"" Jan 13 20:57:28.785556 containerd[1537]: time="2025-01-13T20:57:28.785545789Z" level=info msg="TearDown network for sandbox 
\"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\" successfully" Jan 13 20:57:28.785601 containerd[1537]: time="2025-01-13T20:57:28.785593860Z" level=info msg="StopPodSandbox for \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\" returns successfully" Jan 13 20:57:28.786308 containerd[1537]: time="2025-01-13T20:57:28.786275169Z" level=info msg="StopPodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\"" Jan 13 20:57:28.786999 containerd[1537]: time="2025-01-13T20:57:28.786980316Z" level=info msg="TearDown network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" successfully" Jan 13 20:57:28.787065 containerd[1537]: time="2025-01-13T20:57:28.787054744Z" level=info msg="StopPodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" returns successfully" Jan 13 20:57:28.816704 containerd[1537]: time="2025-01-13T20:57:28.816671602Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\"" Jan 13 20:57:28.818358 containerd[1537]: time="2025-01-13T20:57:28.818329254Z" level=info msg="TearDown network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" successfully" Jan 13 20:57:28.819359 containerd[1537]: time="2025-01-13T20:57:28.819324866Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" returns successfully" Jan 13 20:57:28.829790 kubelet[2787]: I0113 20:57:28.820216 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wxzwq" podStartSLOduration=2.054135168 podStartE2EDuration="22.74928225s" podCreationTimestamp="2025-01-13 20:57:06 +0000 UTC" firstStartedPulling="2025-01-13 20:57:06.528847413 +0000 UTC m=+12.688185390" lastFinishedPulling="2025-01-13 20:57:27.2239945 +0000 UTC m=+33.383332472" observedRunningTime="2025-01-13 20:57:28.718365855 +0000 UTC 
m=+34.877703834" watchObservedRunningTime="2025-01-13 20:57:28.74928225 +0000 UTC m=+34.908620229" Jan 13 20:57:28.833040 containerd[1537]: time="2025-01-13T20:57:28.832968597Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" Jan 13 20:57:28.833373 containerd[1537]: time="2025-01-13T20:57:28.833233782Z" level=info msg="TearDown network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" successfully" Jan 13 20:57:28.833373 containerd[1537]: time="2025-01-13T20:57:28.833245805Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" returns successfully" Jan 13 20:57:28.845315 containerd[1537]: time="2025-01-13T20:57:28.845167546Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:28.845315 containerd[1537]: time="2025-01-13T20:57:28.845245495Z" level=info msg="TearDown network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" successfully" Jan 13 20:57:28.845315 containerd[1537]: time="2025-01-13T20:57:28.845253215Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" returns successfully" Jan 13 20:57:28.846547 kubelet[2787]: I0113 20:57:28.846475 2787 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74" Jan 13 20:57:28.854747 containerd[1537]: time="2025-01-13T20:57:28.854487651Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:28.854747 containerd[1537]: time="2025-01-13T20:57:28.854567509Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:28.854747 containerd[1537]: time="2025-01-13T20:57:28.854575183Z" level=info 
msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:28.856900 containerd[1537]: time="2025-01-13T20:57:28.856875212Z" level=info msg="StopPodSandbox for \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\"" Jan 13 20:57:28.857092 containerd[1537]: time="2025-01-13T20:57:28.857036884Z" level=info msg="Ensure that sandbox 7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74 in task-service has been cleanup successfully" Jan 13 20:57:28.857300 containerd[1537]: time="2025-01-13T20:57:28.857287599Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:28.857500 containerd[1537]: time="2025-01-13T20:57:28.857384660Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:28.857500 containerd[1537]: time="2025-01-13T20:57:28.857393895Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:28.861183 containerd[1537]: time="2025-01-13T20:57:28.860885505Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:8,}" Jan 13 20:57:28.863904 containerd[1537]: time="2025-01-13T20:57:28.863817575Z" level=info msg="TearDown network for sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\" successfully" Jan 13 20:57:28.863904 containerd[1537]: time="2025-01-13T20:57:28.863858039Z" level=info msg="StopPodSandbox for \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\" returns successfully" Jan 13 20:57:28.865941 containerd[1537]: time="2025-01-13T20:57:28.864764657Z" level=info msg="StopPodSandbox for \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\"" Jan 13 20:57:28.865941 containerd[1537]: 
time="2025-01-13T20:57:28.864833787Z" level=info msg="TearDown network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\" successfully" Jan 13 20:57:28.865941 containerd[1537]: time="2025-01-13T20:57:28.864841183Z" level=info msg="StopPodSandbox for \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\" returns successfully" Jan 13 20:57:28.865941 containerd[1537]: time="2025-01-13T20:57:28.865587484Z" level=info msg="StopPodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\"" Jan 13 20:57:28.865941 containerd[1537]: time="2025-01-13T20:57:28.865645885Z" level=info msg="TearDown network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" successfully" Jan 13 20:57:28.865941 containerd[1537]: time="2025-01-13T20:57:28.865653107Z" level=info msg="StopPodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" returns successfully" Jan 13 20:57:28.867236 containerd[1537]: time="2025-01-13T20:57:28.867163137Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\"" Jan 13 20:57:28.867364 containerd[1537]: time="2025-01-13T20:57:28.867352947Z" level=info msg="TearDown network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" successfully" Jan 13 20:57:28.867416 containerd[1537]: time="2025-01-13T20:57:28.867399627Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" returns successfully" Jan 13 20:57:28.871423 containerd[1537]: time="2025-01-13T20:57:28.871399687Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" Jan 13 20:57:28.871628 containerd[1537]: time="2025-01-13T20:57:28.871615411Z" level=info msg="TearDown network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" successfully" Jan 13 20:57:28.871680 containerd[1537]: 
time="2025-01-13T20:57:28.871671792Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" returns successfully" Jan 13 20:57:28.872495 containerd[1537]: time="2025-01-13T20:57:28.872103129Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:28.872967 containerd[1537]: time="2025-01-13T20:57:28.872948540Z" level=info msg="TearDown network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" successfully" Jan 13 20:57:28.873152 containerd[1537]: time="2025-01-13T20:57:28.873141925Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" returns successfully" Jan 13 20:57:28.875027 containerd[1537]: time="2025-01-13T20:57:28.874999965Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:28.875221 containerd[1537]: time="2025-01-13T20:57:28.875208727Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:28.875272 containerd[1537]: time="2025-01-13T20:57:28.875261770Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:28.877214 containerd[1537]: time="2025-01-13T20:57:28.877181488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:7,}" Jan 13 20:57:29.415700 systemd-networkd[1464]: cali3e8daddf72e: Link UP Jan 13 20:57:29.415835 systemd-networkd[1464]: cali3e8daddf72e: Gained carrier Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:28.917 [INFO][4862] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:28.990 [INFO][4862] cni-plugin/plugin.go 325: Calico 
CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0 calico-kube-controllers-6df8b96c48- calico-system be5d865c-3359-4beb-8044-736dced88771 743 0 2025-01-13 20:57:06 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6df8b96c48 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6df8b96c48-4j6ml eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3e8daddf72e [] []}} ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Namespace="calico-system" Pod="calico-kube-controllers-6df8b96c48-4j6ml" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:28.990 [INFO][4862] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Namespace="calico-system" Pod="calico-kube-controllers-6df8b96c48-4j6ml" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.343 [INFO][4931] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.366 [INFO][4931] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" 
Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003be960), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6df8b96c48-4j6ml", "timestamp":"2025-01-13 20:57:29.343539866 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.367 [INFO][4931] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.368 [INFO][4931] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.368 [INFO][4931] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.370 [INFO][4931] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.377 [INFO][4931] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.381 [INFO][4931] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.383 [INFO][4931] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.385 [INFO][4931] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.385 [INFO][4931] ipam/ipam.go 1180: Attempting to assign 1 
addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.386 [INFO][4931] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.390 [INFO][4931] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.394 [INFO][4931] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.394 [INFO][4931] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" host="localhost" Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.394 [INFO][4931] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 13 20:57:29.429641 containerd[1537]: 2025-01-13 20:57:29.394 [INFO][4931] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:29.431235 containerd[1537]: 2025-01-13 20:57:29.396 [INFO][4862] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Namespace="calico-system" Pod="calico-kube-controllers-6df8b96c48-4j6ml" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0", GenerateName:"calico-kube-controllers-6df8b96c48-", Namespace:"calico-system", SelfLink:"", UID:"be5d865c-3359-4beb-8044-736dced88771", ResourceVersion:"743", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6df8b96c48", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6df8b96c48-4j6ml", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.129/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3e8daddf72e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.431235 containerd[1537]: 2025-01-13 20:57:29.396 [INFO][4862] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Namespace="calico-system" Pod="calico-kube-controllers-6df8b96c48-4j6ml" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:29.431235 containerd[1537]: 2025-01-13 20:57:29.396 [INFO][4862] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3e8daddf72e ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Namespace="calico-system" Pod="calico-kube-controllers-6df8b96c48-4j6ml" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:29.431235 containerd[1537]: 2025-01-13 20:57:29.409 [INFO][4862] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Namespace="calico-system" Pod="calico-kube-controllers-6df8b96c48-4j6ml" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:29.431235 containerd[1537]: 2025-01-13 20:57:29.410 [INFO][4862] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Namespace="calico-system" Pod="calico-kube-controllers-6df8b96c48-4j6ml" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0", GenerateName:"calico-kube-controllers-6df8b96c48-", Namespace:"calico-system", SelfLink:"", UID:"be5d865c-3359-4beb-8044-736dced88771", ResourceVersion:"743", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6df8b96c48", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba", Pod:"calico-kube-controllers-6df8b96c48-4j6ml", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3e8daddf72e", MAC:"7e:72:40:9b:06:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.431235 containerd[1537]: 2025-01-13 20:57:29.428 [INFO][4862] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Namespace="calico-system" Pod="calico-kube-controllers-6df8b96c48-4j6ml" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:29.475607 containerd[1537]: time="2025-01-13T20:57:29.475053181Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:29.475607 containerd[1537]: time="2025-01-13T20:57:29.475112076Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:29.475607 containerd[1537]: time="2025-01-13T20:57:29.475125251Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.475607 containerd[1537]: time="2025-01-13T20:57:29.475211762Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.504992 systemd[1]: Started cri-containerd-1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba.scope - libcontainer container 1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba. Jan 13 20:57:29.510519 systemd-networkd[1464]: calibb7cbae972d: Link UP Jan 13 20:57:29.510710 systemd-networkd[1464]: calibb7cbae972d: Gained carrier Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:28.977 [INFO][4913] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:28.991 [INFO][4913] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0 coredns-6f6b679f8f- kube-system 2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9 739 0 2025-01-13 20:56:59 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-9c7f5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibb7cbae972d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Namespace="kube-system" 
Pod="coredns-6f6b679f8f-9c7f5" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9c7f5-" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:28.991 [INFO][4913] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Namespace="kube-system" Pod="coredns-6f6b679f8f-9c7f5" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.343 [INFO][4932] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" HandleID="k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Workload="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.367 [INFO][4932] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" HandleID="k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Workload="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031fcc0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-9c7f5", "timestamp":"2025-01-13 20:57:29.343234697 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.368 [INFO][4932] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.394 [INFO][4932] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.394 [INFO][4932] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.471 [INFO][4932] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.479 [INFO][4932] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.487 [INFO][4932] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.488 [INFO][4932] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.489 [INFO][4932] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.489 [INFO][4932] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.491 [INFO][4932] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.494 [INFO][4932] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.499 [INFO][4932] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.499 [INFO][4932] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" host="localhost" Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.500 [INFO][4932] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:57:29.526545 containerd[1537]: 2025-01-13 20:57:29.500 [INFO][4932] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" HandleID="k8s-pod-network.b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Workload="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" Jan 13 20:57:29.528377 containerd[1537]: 2025-01-13 20:57:29.504 [INFO][4913] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Namespace="kube-system" Pod="coredns-6f6b679f8f-9c7f5" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9", ResourceVersion:"739", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 56, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-9c7f5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibb7cbae972d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.528377 containerd[1537]: 2025-01-13 20:57:29.504 [INFO][4913] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Namespace="kube-system" Pod="coredns-6f6b679f8f-9c7f5" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" Jan 13 20:57:29.528377 containerd[1537]: 2025-01-13 20:57:29.504 [INFO][4913] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibb7cbae972d ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Namespace="kube-system" Pod="coredns-6f6b679f8f-9c7f5" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" Jan 13 20:57:29.528377 containerd[1537]: 2025-01-13 20:57:29.510 [INFO][4913] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Namespace="kube-system" Pod="coredns-6f6b679f8f-9c7f5" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" Jan 13 
20:57:29.528377 containerd[1537]: 2025-01-13 20:57:29.511 [INFO][4913] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Namespace="kube-system" Pod="coredns-6f6b679f8f-9c7f5" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9", ResourceVersion:"739", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 56, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c", Pod:"coredns-6f6b679f8f-9c7f5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibb7cbae972d", MAC:"1e:f7:dd:aa:33:27", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.528377 containerd[1537]: 2025-01-13 20:57:29.523 [INFO][4913] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c" Namespace="kube-system" Pod="coredns-6f6b679f8f-9c7f5" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9c7f5-eth0" Jan 13 20:57:29.544605 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:57:29.553267 containerd[1537]: time="2025-01-13T20:57:29.553216917Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:29.553354 containerd[1537]: time="2025-01-13T20:57:29.553260673Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:29.553354 containerd[1537]: time="2025-01-13T20:57:29.553270026Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.553354 containerd[1537]: time="2025-01-13T20:57:29.553335149Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.567917 systemd[1]: Started cri-containerd-b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c.scope - libcontainer container b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c. 
Jan 13 20:57:29.591333 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:57:29.592035 containerd[1537]: time="2025-01-13T20:57:29.592011141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df8b96c48-4j6ml,Uid:be5d865c-3359-4beb-8044-736dced88771,Namespace:calico-system,Attempt:7,} returns sandbox id \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\"" Jan 13 20:57:29.609990 containerd[1537]: time="2025-01-13T20:57:29.609861455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 13 20:57:29.621158 containerd[1537]: time="2025-01-13T20:57:29.621081032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9c7f5,Uid:2e3cb21e-fc92-46c5-8852-4ee6f84c4eb9,Namespace:kube-system,Attempt:7,} returns sandbox id \"b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c\"" Jan 13 20:57:29.626110 containerd[1537]: time="2025-01-13T20:57:29.626093709Z" level=info msg="CreateContainer within sandbox \"b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:57:29.640179 systemd[1]: run-netns-cni\x2dbe73e748\x2d37b5\x2dd780\x2da6e3\x2d1f7e66140f3c.mount: Deactivated successfully. Jan 13 20:57:29.640244 systemd[1]: run-netns-cni\x2d8cda018d\x2d4709\x2d682c\x2d5e42\x2dff6e0c26c896.mount: Deactivated successfully. Jan 13 20:57:29.640285 systemd[1]: run-netns-cni\x2d303b5541\x2d3db4\x2d5446\x2d1973\x2dfc26f62f0c84.mount: Deactivated successfully. Jan 13 20:57:29.672901 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1748988483.mount: Deactivated successfully. 
Jan 13 20:57:29.684400 containerd[1537]: time="2025-01-13T20:57:29.684329137Z" level=info msg="CreateContainer within sandbox \"b779ad0510324ccf67dd3e65145cfeb7198db2f5cc45f7f83d80b78f71e1cb9c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"85b70befb39f91d91865b3c9d907a53b7961716cfcb37ba8c55533584b869edd\"" Jan 13 20:57:29.688226 containerd[1537]: time="2025-01-13T20:57:29.687816535Z" level=info msg="StartContainer for \"85b70befb39f91d91865b3c9d907a53b7961716cfcb37ba8c55533584b869edd\"" Jan 13 20:57:29.695186 systemd-networkd[1464]: cali1ebaeb5c24b: Link UP Jan 13 20:57:29.695973 systemd-networkd[1464]: cali1ebaeb5c24b: Gained carrier Jan 13 20:57:29.737988 systemd[1]: Started cri-containerd-85b70befb39f91d91865b3c9d907a53b7961716cfcb37ba8c55533584b869edd.scope - libcontainer container 85b70befb39f91d91865b3c9d907a53b7961716cfcb37ba8c55533584b869edd. Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:28.964 [INFO][4904] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:28.996 [INFO][4904] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0 coredns-6f6b679f8f- kube-system fa68722f-5974-4f04-9b6e-46b8f479c300 742 0 2025-01-13 20:56:59 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-k7f2m eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1ebaeb5c24b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Namespace="kube-system" Pod="coredns-6f6b679f8f-k7f2m" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k7f2m-" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:28.996 
[INFO][4904] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Namespace="kube-system" Pod="coredns-6f6b679f8f-k7f2m" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.343 [INFO][4929] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" HandleID="k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Workload="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.366 [INFO][4929] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" HandleID="k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Workload="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b09a0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-k7f2m", "timestamp":"2025-01-13 20:57:29.343578165 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.366 [INFO][4929] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.500 [INFO][4929] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.500 [INFO][4929] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.570 [INFO][4929] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.606 [INFO][4929] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.627 [INFO][4929] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.651 [INFO][4929] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.654 [INFO][4929] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.654 [INFO][4929] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.655 [INFO][4929] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8 Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.659 [INFO][4929] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.677 [INFO][4929] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.677 [INFO][4929] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" host="localhost" Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.677 [INFO][4929] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:57:29.753133 containerd[1537]: 2025-01-13 20:57:29.677 [INFO][4929] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" HandleID="k8s-pod-network.73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Workload="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" Jan 13 20:57:29.753764 containerd[1537]: 2025-01-13 20:57:29.683 [INFO][4904] cni-plugin/k8s.go 386: Populated endpoint ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Namespace="kube-system" Pod="coredns-6f6b679f8f-k7f2m" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"fa68722f-5974-4f04-9b6e-46b8f479c300", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 56, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-k7f2m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1ebaeb5c24b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.753764 containerd[1537]: 2025-01-13 20:57:29.683 [INFO][4904] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Namespace="kube-system" Pod="coredns-6f6b679f8f-k7f2m" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" Jan 13 20:57:29.753764 containerd[1537]: 2025-01-13 20:57:29.683 [INFO][4904] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1ebaeb5c24b ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Namespace="kube-system" Pod="coredns-6f6b679f8f-k7f2m" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" Jan 13 20:57:29.753764 containerd[1537]: 2025-01-13 20:57:29.702 [INFO][4904] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Namespace="kube-system" Pod="coredns-6f6b679f8f-k7f2m" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" Jan 13 
20:57:29.753764 containerd[1537]: 2025-01-13 20:57:29.703 [INFO][4904] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Namespace="kube-system" Pod="coredns-6f6b679f8f-k7f2m" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"fa68722f-5974-4f04-9b6e-46b8f479c300", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 56, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8", Pod:"coredns-6f6b679f8f-k7f2m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1ebaeb5c24b", MAC:"f6:2d:95:3f:90:70", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.753764 containerd[1537]: 2025-01-13 20:57:29.749 [INFO][4904] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8" Namespace="kube-system" Pod="coredns-6f6b679f8f-k7f2m" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k7f2m-eth0" Jan 13 20:57:29.781556 containerd[1537]: time="2025-01-13T20:57:29.781430362Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:29.781556 containerd[1537]: time="2025-01-13T20:57:29.781495470Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:29.781556 containerd[1537]: time="2025-01-13T20:57:29.781520546Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.782360 containerd[1537]: time="2025-01-13T20:57:29.782294786Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.789013 systemd-networkd[1464]: cali6af874d9c7b: Link UP Jan 13 20:57:29.790854 systemd-networkd[1464]: cali6af874d9c7b: Gained carrier Jan 13 20:57:29.799529 systemd[1]: Started cri-containerd-73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8.scope - libcontainer container 73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8. 
Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:28.889 [INFO][4870] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:28.990 [INFO][4870] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0 calico-apiserver-867f57c995- calico-apiserver 6a232370-fd20-47eb-a2e3-9b0d1d786995 745 0 2025-01-13 20:57:06 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:867f57c995 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-867f57c995-67ltf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6af874d9c7b [] []}} ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-67ltf" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--67ltf-" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:28.990 [INFO][4870] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-67ltf" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.343 [INFO][4928] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" HandleID="k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Workload="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.365 [INFO][4928] ipam/ipam_plugin.go 265: Auto 
assigning IP ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" HandleID="k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Workload="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-867f57c995-67ltf", "timestamp":"2025-01-13 20:57:29.34393349 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.366 [INFO][4928] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.678 [INFO][4928] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.679 [INFO][4928] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.683 [INFO][4928] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.717 [INFO][4928] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.755 [INFO][4928] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.763 [INFO][4928] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.764 [INFO][4928] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.764 [INFO][4928] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.765 [INFO][4928] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321 Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.771 [INFO][4928] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.776 [INFO][4928] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.776 [INFO][4928] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" host="localhost" Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.776 [INFO][4928] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:57:29.807306 containerd[1537]: 2025-01-13 20:57:29.776 [INFO][4928] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" HandleID="k8s-pod-network.6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Workload="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" Jan 13 20:57:29.813332 containerd[1537]: 2025-01-13 20:57:29.779 [INFO][4870] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-67ltf" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0", GenerateName:"calico-apiserver-867f57c995-", Namespace:"calico-apiserver", SelfLink:"", UID:"6a232370-fd20-47eb-a2e3-9b0d1d786995", ResourceVersion:"745", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"867f57c995", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-867f57c995-67ltf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6af874d9c7b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.813332 containerd[1537]: 2025-01-13 20:57:29.780 [INFO][4870] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-67ltf" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" Jan 13 20:57:29.813332 containerd[1537]: 2025-01-13 20:57:29.780 [INFO][4870] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6af874d9c7b ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-67ltf" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" Jan 13 20:57:29.813332 containerd[1537]: 2025-01-13 20:57:29.794 [INFO][4870] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-67ltf" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" Jan 13 20:57:29.813332 containerd[1537]: 2025-01-13 20:57:29.795 [INFO][4870] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-67ltf" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0", GenerateName:"calico-apiserver-867f57c995-", Namespace:"calico-apiserver", SelfLink:"", UID:"6a232370-fd20-47eb-a2e3-9b0d1d786995", ResourceVersion:"745", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"867f57c995", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321", Pod:"calico-apiserver-867f57c995-67ltf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6af874d9c7b", MAC:"ee:59:39:a6:ed:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.813332 containerd[1537]: 2025-01-13 20:57:29.805 [INFO][4870] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-67ltf" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--67ltf-eth0" Jan 13 20:57:29.815028 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:57:29.839515 containerd[1537]: time="2025-01-13T20:57:29.839252041Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:29.839714 containerd[1537]: time="2025-01-13T20:57:29.839501701Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:29.839714 containerd[1537]: time="2025-01-13T20:57:29.839515290Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.840619 containerd[1537]: time="2025-01-13T20:57:29.840557829Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.850461 containerd[1537]: time="2025-01-13T20:57:29.850271280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k7f2m,Uid:fa68722f-5974-4f04-9b6e-46b8f479c300,Namespace:kube-system,Attempt:8,} returns sandbox id \"73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8\"" Jan 13 20:57:29.850627 containerd[1537]: time="2025-01-13T20:57:29.850600190Z" level=info msg="StartContainer for \"85b70befb39f91d91865b3c9d907a53b7961716cfcb37ba8c55533584b869edd\" returns successfully" Jan 13 20:57:29.854924 containerd[1537]: time="2025-01-13T20:57:29.854758265Z" level=info msg="CreateContainer within sandbox \"73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:57:29.867982 systemd[1]: Started cri-containerd-6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321.scope - libcontainer container 6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321. 
Jan 13 20:57:29.884137 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:57:29.913346 systemd-networkd[1464]: caliefaa12d2c4d: Link UP Jan 13 20:57:29.913803 systemd-networkd[1464]: caliefaa12d2c4d: Gained carrier Jan 13 20:57:29.929627 containerd[1537]: time="2025-01-13T20:57:29.929551965Z" level=info msg="CreateContainer within sandbox \"73bb684db090a634b612988919a017534ddcc934632d1479af8b27308b83eae8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bd9d164df0db7adf0b4fc0316c679c1dda66564f4f380f52602f663cbb91eec8\"" Jan 13 20:57:29.932124 containerd[1537]: time="2025-01-13T20:57:29.931936158Z" level=info msg="StartContainer for \"bd9d164df0db7adf0b4fc0316c679c1dda66564f4f380f52602f663cbb91eec8\"" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:28.790 [INFO][4837] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:28.990 [INFO][4837] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--jkzvs-eth0 csi-node-driver- calico-system 156fa3f2-d364-43dd-86de-274512f7d213 611 0 2025-01-13 20:57:06 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:56747c9949 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-jkzvs eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliefaa12d2c4d [] []}} ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Namespace="calico-system" Pod="csi-node-driver-jkzvs" WorkloadEndpoint="localhost-k8s-csi--node--driver--jkzvs-" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:28.990 [INFO][4837] 
cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Namespace="calico-system" Pod="csi-node-driver-jkzvs" WorkloadEndpoint="localhost-k8s-csi--node--driver--jkzvs-eth0" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.343 [INFO][4930] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" HandleID="k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Workload="localhost-k8s-csi--node--driver--jkzvs-eth0" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.368 [INFO][4930] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" HandleID="k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Workload="localhost-k8s-csi--node--driver--jkzvs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a4aa0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-jkzvs", "timestamp":"2025-01-13 20:57:29.343242423 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.368 [INFO][4930] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.777 [INFO][4930] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.777 [INFO][4930] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.790 [INFO][4930] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.814 [INFO][4930] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.856 [INFO][4930] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.860 [INFO][4930] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.864 [INFO][4930] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.864 [INFO][4930] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.879 [INFO][4930] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6 Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.887 [INFO][4930] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.903 [INFO][4930] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.903 [INFO][4930] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" host="localhost" Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.903 [INFO][4930] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:57:29.942971 containerd[1537]: 2025-01-13 20:57:29.903 [INFO][4930] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" HandleID="k8s-pod-network.2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Workload="localhost-k8s-csi--node--driver--jkzvs-eth0" Jan 13 20:57:29.947944 containerd[1537]: 2025-01-13 20:57:29.909 [INFO][4837] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Namespace="calico-system" Pod="csi-node-driver-jkzvs" WorkloadEndpoint="localhost-k8s-csi--node--driver--jkzvs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--jkzvs-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"156fa3f2-d364-43dd-86de-274512f7d213", ResourceVersion:"611", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-jkzvs", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliefaa12d2c4d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.947944 containerd[1537]: 2025-01-13 20:57:29.910 [INFO][4837] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Namespace="calico-system" Pod="csi-node-driver-jkzvs" WorkloadEndpoint="localhost-k8s-csi--node--driver--jkzvs-eth0" Jan 13 20:57:29.947944 containerd[1537]: 2025-01-13 20:57:29.910 [INFO][4837] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliefaa12d2c4d ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Namespace="calico-system" Pod="csi-node-driver-jkzvs" WorkloadEndpoint="localhost-k8s-csi--node--driver--jkzvs-eth0" Jan 13 20:57:29.947944 containerd[1537]: 2025-01-13 20:57:29.914 [INFO][4837] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Namespace="calico-system" Pod="csi-node-driver-jkzvs" WorkloadEndpoint="localhost-k8s-csi--node--driver--jkzvs-eth0" Jan 13 20:57:29.947944 containerd[1537]: 2025-01-13 20:57:29.914 [INFO][4837] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Namespace="calico-system" 
Pod="csi-node-driver-jkzvs" WorkloadEndpoint="localhost-k8s-csi--node--driver--jkzvs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--jkzvs-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"156fa3f2-d364-43dd-86de-274512f7d213", ResourceVersion:"611", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6", Pod:"csi-node-driver-jkzvs", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliefaa12d2c4d", MAC:"52:55:62:b4:23:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:29.947944 containerd[1537]: 2025-01-13 20:57:29.940 [INFO][4837] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6" Namespace="calico-system" Pod="csi-node-driver-jkzvs" WorkloadEndpoint="localhost-k8s-csi--node--driver--jkzvs-eth0" Jan 13 20:57:29.950597 containerd[1537]: 
time="2025-01-13T20:57:29.950220804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-67ltf,Uid:6a232370-fd20-47eb-a2e3-9b0d1d786995,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321\"" Jan 13 20:57:29.975973 systemd[1]: Started cri-containerd-bd9d164df0db7adf0b4fc0316c679c1dda66564f4f380f52602f663cbb91eec8.scope - libcontainer container bd9d164df0db7adf0b4fc0316c679c1dda66564f4f380f52602f663cbb91eec8. Jan 13 20:57:29.993599 containerd[1537]: time="2025-01-13T20:57:29.993189905Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:29.993599 containerd[1537]: time="2025-01-13T20:57:29.993227409Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:29.993599 containerd[1537]: time="2025-01-13T20:57:29.993235662Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:29.993599 containerd[1537]: time="2025-01-13T20:57:29.993285289Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:30.003531 systemd-networkd[1464]: calib5b17fe13c1: Link UP Jan 13 20:57:30.004643 systemd-networkd[1464]: calib5b17fe13c1: Gained carrier Jan 13 20:57:30.010593 kubelet[2787]: I0113 20:57:30.010561 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-9c7f5" podStartSLOduration=31.010547924 podStartE2EDuration="31.010547924s" podCreationTimestamp="2025-01-13 20:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:57:29.92011408 +0000 UTC m=+36.079452060" watchObservedRunningTime="2025-01-13 20:57:30.010547924 +0000 UTC m=+36.169885899" Jan 13 20:57:30.018683 containerd[1537]: time="2025-01-13T20:57:30.018655445Z" level=info msg="StartContainer for \"bd9d164df0db7adf0b4fc0316c679c1dda66564f4f380f52602f663cbb91eec8\" returns successfully" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:28.893 [INFO][4853] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:28.990 [INFO][4853] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0 calico-apiserver-867f57c995- calico-apiserver 86a1a104-aa85-4352-b08c-36f54c7172c1 744 0 2025-01-13 20:57:06 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:867f57c995 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-867f57c995-7rrvw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib5b17fe13c1 [] []}} ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" 
Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-7rrvw" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--7rrvw-" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:28.990 [INFO][4853] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-7rrvw" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.344 [INFO][4933] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" HandleID="k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Workload="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.369 [INFO][4933] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" HandleID="k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Workload="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002f4850), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-867f57c995-7rrvw", "timestamp":"2025-01-13 20:57:29.344127656 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.369 [INFO][4933] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.904 [INFO][4933] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.905 [INFO][4933] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.908 [INFO][4933] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.939 [INFO][4933] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.956 [INFO][4933] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.959 [INFO][4933] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.961 [INFO][4933] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.962 [INFO][4933] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.964 [INFO][4933] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80 Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.971 [INFO][4933] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.993 [INFO][4933] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.994 [INFO][4933] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" host="localhost" Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.994 [INFO][4933] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:57:30.023999 containerd[1537]: 2025-01-13 20:57:29.994 [INFO][4933] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" HandleID="k8s-pod-network.ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Workload="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" Jan 13 20:57:30.024570 containerd[1537]: 2025-01-13 20:57:29.999 [INFO][4853] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-7rrvw" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0", GenerateName:"calico-apiserver-867f57c995-", Namespace:"calico-apiserver", SelfLink:"", UID:"86a1a104-aa85-4352-b08c-36f54c7172c1", ResourceVersion:"744", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"867f57c995", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-867f57c995-7rrvw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib5b17fe13c1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:30.024570 containerd[1537]: 2025-01-13 20:57:30.000 [INFO][4853] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-7rrvw" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" Jan 13 20:57:30.024570 containerd[1537]: 2025-01-13 20:57:30.000 [INFO][4853] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib5b17fe13c1 ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-7rrvw" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" Jan 13 20:57:30.024570 containerd[1537]: 2025-01-13 20:57:30.005 [INFO][4853] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-7rrvw" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" Jan 13 20:57:30.024570 containerd[1537]: 2025-01-13 20:57:30.005 [INFO][4853] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-7rrvw" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0", GenerateName:"calico-apiserver-867f57c995-", Namespace:"calico-apiserver", SelfLink:"", UID:"86a1a104-aa85-4352-b08c-36f54c7172c1", ResourceVersion:"744", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"867f57c995", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80", Pod:"calico-apiserver-867f57c995-7rrvw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib5b17fe13c1", MAC:"5e:97:6e:16:6e:3d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:57:30.024570 containerd[1537]: 2025-01-13 20:57:30.021 [INFO][4853] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80" Namespace="calico-apiserver" Pod="calico-apiserver-867f57c995-7rrvw" WorkloadEndpoint="localhost-k8s-calico--apiserver--867f57c995--7rrvw-eth0" Jan 13 20:57:30.034839 systemd[1]: Started cri-containerd-2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6.scope - libcontainer container 2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6. Jan 13 20:57:30.044538 containerd[1537]: time="2025-01-13T20:57:30.044359991Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:30.044538 containerd[1537]: time="2025-01-13T20:57:30.044413346Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:30.044538 containerd[1537]: time="2025-01-13T20:57:30.044433644Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:30.044538 containerd[1537]: time="2025-01-13T20:57:30.044482416Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:30.047160 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:57:30.064059 systemd[1]: Started cri-containerd-ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80.scope - libcontainer container ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80. 
Jan 13 20:57:30.067797 containerd[1537]: time="2025-01-13T20:57:30.067657146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jkzvs,Uid:156fa3f2-d364-43dd-86de-274512f7d213,Namespace:calico-system,Attempt:7,} returns sandbox id \"2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6\"" Jan 13 20:57:30.078107 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:57:30.109941 containerd[1537]: time="2025-01-13T20:57:30.109491215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867f57c995-7rrvw,Uid:86a1a104-aa85-4352-b08c-36f54c7172c1,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80\"" Jan 13 20:57:30.417855 kernel: bpftool[5508]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 13 20:57:30.638386 systemd-networkd[1464]: vxlan.calico: Link UP Jan 13 20:57:30.638390 systemd-networkd[1464]: vxlan.calico: Gained carrier Jan 13 20:57:30.810947 systemd-networkd[1464]: cali1ebaeb5c24b: Gained IPv6LL Jan 13 20:57:30.938997 systemd-networkd[1464]: cali6af874d9c7b: Gained IPv6LL Jan 13 20:57:30.951795 kubelet[2787]: I0113 20:57:30.951287 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-k7f2m" podStartSLOduration=31.95127356 podStartE2EDuration="31.95127356s" podCreationTimestamp="2025-01-13 20:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:57:30.914083311 +0000 UTC m=+37.073421282" watchObservedRunningTime="2025-01-13 20:57:30.95127356 +0000 UTC m=+37.110611535" Jan 13 20:57:31.002984 systemd-networkd[1464]: cali3e8daddf72e: Gained IPv6LL Jan 13 20:57:31.194969 systemd-networkd[1464]: calib5b17fe13c1: Gained IPv6LL Jan 13 20:57:31.386924 systemd-networkd[1464]: calibb7cbae972d: 
Gained IPv6LL Jan 13 20:57:31.771268 systemd-networkd[1464]: caliefaa12d2c4d: Gained IPv6LL Jan 13 20:57:32.135261 containerd[1537]: time="2025-01-13T20:57:32.135195154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:32.136060 containerd[1537]: time="2025-01-13T20:57:32.136037663Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 13 20:57:32.136469 containerd[1537]: time="2025-01-13T20:57:32.136454803Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:32.137816 containerd[1537]: time="2025-01-13T20:57:32.137791432Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:32.138300 containerd[1537]: time="2025-01-13T20:57:32.138042692Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.528159472s" Jan 13 20:57:32.138300 containerd[1537]: time="2025-01-13T20:57:32.138060739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 13 20:57:32.139282 containerd[1537]: time="2025-01-13T20:57:32.139107612Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:57:32.147083 containerd[1537]: 
time="2025-01-13T20:57:32.146296845Z" level=info msg="CreateContainer within sandbox \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 13 20:57:32.167452 containerd[1537]: time="2025-01-13T20:57:32.167422498Z" level=info msg="CreateContainer within sandbox \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\"" Jan 13 20:57:32.168093 containerd[1537]: time="2025-01-13T20:57:32.168047087Z" level=info msg="StartContainer for \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\"" Jan 13 20:57:32.194980 systemd[1]: Started cri-containerd-f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a.scope - libcontainer container f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a. Jan 13 20:57:32.224864 containerd[1537]: time="2025-01-13T20:57:32.224480251Z" level=info msg="StartContainer for \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\" returns successfully" Jan 13 20:57:32.282908 systemd-networkd[1464]: vxlan.calico: Gained IPv6LL Jan 13 20:57:32.952742 kubelet[2787]: I0113 20:57:32.952470 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6df8b96c48-4j6ml" podStartSLOduration=24.423252273 podStartE2EDuration="26.952458284s" podCreationTimestamp="2025-01-13 20:57:06 +0000 UTC" firstStartedPulling="2025-01-13 20:57:29.60948848 +0000 UTC m=+35.768826451" lastFinishedPulling="2025-01-13 20:57:32.138694492 +0000 UTC m=+38.298032462" observedRunningTime="2025-01-13 20:57:32.951857468 +0000 UTC m=+39.111195447" watchObservedRunningTime="2025-01-13 20:57:32.952458284 +0000 UTC m=+39.111796258" Jan 13 20:57:34.761723 containerd[1537]: time="2025-01-13T20:57:34.761694948Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:34.762459 containerd[1537]: time="2025-01-13T20:57:34.762436843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 13 20:57:34.763494 containerd[1537]: time="2025-01-13T20:57:34.763478449Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:34.766522 containerd[1537]: time="2025-01-13T20:57:34.764924292Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:34.766522 containerd[1537]: time="2025-01-13T20:57:34.766416408Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.627288448s" Jan 13 20:57:34.766522 containerd[1537]: time="2025-01-13T20:57:34.766433090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 20:57:34.769093 containerd[1537]: time="2025-01-13T20:57:34.769079339Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 13 20:57:34.770363 containerd[1537]: time="2025-01-13T20:57:34.770344334Z" level=info msg="CreateContainer within sandbox \"6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 
20:57:34.785040 containerd[1537]: time="2025-01-13T20:57:34.785010246Z" level=info msg="CreateContainer within sandbox \"6b1a4d5ad5df781f38fcfcdd9b67cc249b4fdaf8b75a62e7498eb71bb5cee321\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"870d70738ab1363d9cc568c40f02b5a67023bdf5bd6e30275059a46c3d27aa1f\"" Jan 13 20:57:34.785397 containerd[1537]: time="2025-01-13T20:57:34.785362718Z" level=info msg="StartContainer for \"870d70738ab1363d9cc568c40f02b5a67023bdf5bd6e30275059a46c3d27aa1f\"" Jan 13 20:57:34.819147 systemd[1]: Started cri-containerd-870d70738ab1363d9cc568c40f02b5a67023bdf5bd6e30275059a46c3d27aa1f.scope - libcontainer container 870d70738ab1363d9cc568c40f02b5a67023bdf5bd6e30275059a46c3d27aa1f. Jan 13 20:57:34.865170 containerd[1537]: time="2025-01-13T20:57:34.865112565Z" level=info msg="StartContainer for \"870d70738ab1363d9cc568c40f02b5a67023bdf5bd6e30275059a46c3d27aa1f\" returns successfully" Jan 13 20:57:35.944237 kubelet[2787]: I0113 20:57:35.944107 2787 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:57:36.738080 containerd[1537]: time="2025-01-13T20:57:36.738002746Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:36.739546 containerd[1537]: time="2025-01-13T20:57:36.738713294Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 13 20:57:36.740347 containerd[1537]: time="2025-01-13T20:57:36.739583876Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:36.741979 containerd[1537]: time="2025-01-13T20:57:36.741845246Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag 
\"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.972664712s" Jan 13 20:57:36.741979 containerd[1537]: time="2025-01-13T20:57:36.741874214Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 13 20:57:36.745737 containerd[1537]: time="2025-01-13T20:57:36.745701560Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:57:36.754768 containerd[1537]: time="2025-01-13T20:57:36.754718471Z" level=info msg="CreateContainer within sandbox \"2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 13 20:57:36.763939 containerd[1537]: time="2025-01-13T20:57:36.763876041Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:36.815445 containerd[1537]: time="2025-01-13T20:57:36.815365594Z" level=info msg="CreateContainer within sandbox \"2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2d809c09f176bf34705cd4445182fa115a7381fb389e8e6932a0daebc114d379\"" Jan 13 20:57:36.815873 containerd[1537]: time="2025-01-13T20:57:36.815786878Z" level=info msg="StartContainer for \"2d809c09f176bf34705cd4445182fa115a7381fb389e8e6932a0daebc114d379\"" Jan 13 20:57:36.850013 systemd[1]: Started cri-containerd-2d809c09f176bf34705cd4445182fa115a7381fb389e8e6932a0daebc114d379.scope - libcontainer container 2d809c09f176bf34705cd4445182fa115a7381fb389e8e6932a0daebc114d379. 
Jan 13 20:57:36.900321 containerd[1537]: time="2025-01-13T20:57:36.900276394Z" level=info msg="StartContainer for \"2d809c09f176bf34705cd4445182fa115a7381fb389e8e6932a0daebc114d379\" returns successfully" Jan 13 20:57:37.155898 containerd[1537]: time="2025-01-13T20:57:37.155383394Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:37.156183 containerd[1537]: time="2025-01-13T20:57:37.156163026Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 13 20:57:37.157933 containerd[1537]: time="2025-01-13T20:57:37.157904828Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 412.168909ms" Jan 13 20:57:37.157999 containerd[1537]: time="2025-01-13T20:57:37.157990165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 20:57:37.158592 containerd[1537]: time="2025-01-13T20:57:37.158580976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 13 20:57:37.160013 containerd[1537]: time="2025-01-13T20:57:37.159990858Z" level=info msg="CreateContainer within sandbox \"ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 20:57:37.181702 containerd[1537]: time="2025-01-13T20:57:37.181660508Z" level=info msg="CreateContainer within sandbox \"ae0258b1ed6d96d4d838026fcc9406080388f6159e702055a54e183070793d80\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"105155284c00d657fb463f94d75ad52c5bbc319d8a8ee880932742aac58f5c7e\"" Jan 13 20:57:37.182787 containerd[1537]: time="2025-01-13T20:57:37.182226884Z" level=info msg="StartContainer for \"105155284c00d657fb463f94d75ad52c5bbc319d8a8ee880932742aac58f5c7e\"" Jan 13 20:57:37.202944 systemd[1]: Started cri-containerd-105155284c00d657fb463f94d75ad52c5bbc319d8a8ee880932742aac58f5c7e.scope - libcontainer container 105155284c00d657fb463f94d75ad52c5bbc319d8a8ee880932742aac58f5c7e. Jan 13 20:57:37.246582 containerd[1537]: time="2025-01-13T20:57:37.246556179Z" level=info msg="StartContainer for \"105155284c00d657fb463f94d75ad52c5bbc319d8a8ee880932742aac58f5c7e\" returns successfully" Jan 13 20:57:37.813869 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount760911406.mount: Deactivated successfully. Jan 13 20:57:37.983475 kubelet[2787]: I0113 20:57:37.983410 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-867f57c995-67ltf" podStartSLOduration=27.169302141 podStartE2EDuration="31.98338542s" podCreationTimestamp="2025-01-13 20:57:06 +0000 UTC" firstStartedPulling="2025-01-13 20:57:29.95466482 +0000 UTC m=+36.114002791" lastFinishedPulling="2025-01-13 20:57:34.7687481 +0000 UTC m=+40.928086070" observedRunningTime="2025-01-13 20:57:34.952930265 +0000 UTC m=+41.112268245" watchObservedRunningTime="2025-01-13 20:57:37.98338542 +0000 UTC m=+44.142723400" Jan 13 20:57:38.036137 kubelet[2787]: I0113 20:57:38.035940 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-867f57c995-7rrvw" podStartSLOduration=24.988378548 podStartE2EDuration="32.035924689s" podCreationTimestamp="2025-01-13 20:57:06 +0000 UTC" firstStartedPulling="2025-01-13 20:57:30.110986324 +0000 UTC m=+36.270324295" lastFinishedPulling="2025-01-13 20:57:37.158532465 +0000 UTC m=+43.317870436" 
observedRunningTime="2025-01-13 20:57:37.986996715 +0000 UTC m=+44.146334696" watchObservedRunningTime="2025-01-13 20:57:38.035924689 +0000 UTC m=+44.195262670" Jan 13 20:57:39.419027 containerd[1537]: time="2025-01-13T20:57:39.418992740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:39.419687 containerd[1537]: time="2025-01-13T20:57:39.419658800Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 13 20:57:39.420459 containerd[1537]: time="2025-01-13T20:57:39.419932672Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:39.422863 containerd[1537]: time="2025-01-13T20:57:39.421817657Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:57:39.425604 containerd[1537]: time="2025-01-13T20:57:39.425265605Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.266389121s" Jan 13 20:57:39.425604 containerd[1537]: time="2025-01-13T20:57:39.425296260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 13 20:57:39.427407 containerd[1537]: time="2025-01-13T20:57:39.427384348Z" 
level=info msg="CreateContainer within sandbox \"2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 13 20:57:39.449598 containerd[1537]: time="2025-01-13T20:57:39.449573367Z" level=info msg="CreateContainer within sandbox \"2c72d37fc71383ae1121209ac7edccb695c59ab0f5ee09addd705a90db5c16b6\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"8f12e7aefe669254e8afb14a4f2204cfd4506aac1f437d5087f36b59f5347038\"" Jan 13 20:57:39.450196 containerd[1537]: time="2025-01-13T20:57:39.450089757Z" level=info msg="StartContainer for \"8f12e7aefe669254e8afb14a4f2204cfd4506aac1f437d5087f36b59f5347038\"" Jan 13 20:57:39.469535 systemd[1]: run-containerd-runc-k8s.io-8f12e7aefe669254e8afb14a4f2204cfd4506aac1f437d5087f36b59f5347038-runc.v67VUy.mount: Deactivated successfully. Jan 13 20:57:39.478990 systemd[1]: Started cri-containerd-8f12e7aefe669254e8afb14a4f2204cfd4506aac1f437d5087f36b59f5347038.scope - libcontainer container 8f12e7aefe669254e8afb14a4f2204cfd4506aac1f437d5087f36b59f5347038. 
Jan 13 20:57:39.507500 containerd[1537]: time="2025-01-13T20:57:39.507478715Z" level=info msg="StartContainer for \"8f12e7aefe669254e8afb14a4f2204cfd4506aac1f437d5087f36b59f5347038\" returns successfully" Jan 13 20:57:40.710884 kubelet[2787]: I0113 20:57:40.710543 2787 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:57:40.731971 kubelet[2787]: I0113 20:57:40.731250 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-jkzvs" podStartSLOduration=25.41524197 podStartE2EDuration="34.731239093s" podCreationTimestamp="2025-01-13 20:57:06 +0000 UTC" firstStartedPulling="2025-01-13 20:57:30.110145262 +0000 UTC m=+36.269483231" lastFinishedPulling="2025-01-13 20:57:39.426142384 +0000 UTC m=+45.585480354" observedRunningTime="2025-01-13 20:57:39.98518194 +0000 UTC m=+46.144519920" watchObservedRunningTime="2025-01-13 20:57:40.731239093 +0000 UTC m=+46.890577066" Jan 13 20:57:40.757070 kubelet[2787]: I0113 20:57:40.757040 2787 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 13 20:57:40.770819 kubelet[2787]: I0113 20:57:40.770788 2787 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 13 20:57:54.022435 containerd[1537]: time="2025-01-13T20:57:54.014452269Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:54.023009 containerd[1537]: time="2025-01-13T20:57:54.022473150Z" level=info msg="TearDown network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" successfully" Jan 13 20:57:54.023009 containerd[1537]: time="2025-01-13T20:57:54.022482666Z" level=info msg="StopPodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" returns successfully" 
Jan 13 20:57:54.053176 containerd[1537]: time="2025-01-13T20:57:54.053016776Z" level=info msg="RemovePodSandbox for \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:54.061522 containerd[1537]: time="2025-01-13T20:57:54.061380762Z" level=info msg="Forcibly stopping sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\"" Jan 13 20:57:54.061522 containerd[1537]: time="2025-01-13T20:57:54.061439728Z" level=info msg="TearDown network for sandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" successfully" Jan 13 20:57:54.066205 containerd[1537]: time="2025-01-13T20:57:54.066185378Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:57:54.070960 containerd[1537]: time="2025-01-13T20:57:54.070940391Z" level=info msg="RemovePodSandbox \"579ca2526769dae6c4056bc33198882e0ec59490bd9a2654a94856ff8700dbcd\" returns successfully" Jan 13 20:57:54.073857 containerd[1537]: time="2025-01-13T20:57:54.073723445Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" Jan 13 20:57:54.073857 containerd[1537]: time="2025-01-13T20:57:54.073772817Z" level=info msg="TearDown network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" successfully" Jan 13 20:57:54.073857 containerd[1537]: time="2025-01-13T20:57:54.073799752Z" level=info msg="StopPodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" returns successfully" Jan 13 20:57:54.074311 containerd[1537]: time="2025-01-13T20:57:54.074068443Z" level=info msg="RemovePodSandbox for \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" Jan 13 20:57:54.074311 containerd[1537]: time="2025-01-13T20:57:54.074131442Z" level=info 
msg="Forcibly stopping sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\"" Jan 13 20:57:54.074311 containerd[1537]: time="2025-01-13T20:57:54.074189230Z" level=info msg="TearDown network for sandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" successfully" Jan 13 20:57:54.075770 containerd[1537]: time="2025-01-13T20:57:54.075703436Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:57:54.075770 containerd[1537]: time="2025-01-13T20:57:54.075738553Z" level=info msg="RemovePodSandbox \"6773ff6ff49e3284ebd1d7b7d45ca71b9dc78424c6299af4ab19acf02d98d59d\" returns successfully" Jan 13 20:57:54.076049 containerd[1537]: time="2025-01-13T20:57:54.075923472Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\"" Jan 13 20:57:54.076049 containerd[1537]: time="2025-01-13T20:57:54.075963701Z" level=info msg="TearDown network for sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" successfully" Jan 13 20:57:54.076049 containerd[1537]: time="2025-01-13T20:57:54.075969772Z" level=info msg="StopPodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" returns successfully" Jan 13 20:57:54.076995 containerd[1537]: time="2025-01-13T20:57:54.076264603Z" level=info msg="RemovePodSandbox for \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\"" Jan 13 20:57:54.076995 containerd[1537]: time="2025-01-13T20:57:54.076274820Z" level=info msg="Forcibly stopping sandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\"" Jan 13 20:57:54.076995 containerd[1537]: time="2025-01-13T20:57:54.076304967Z" level=info msg="TearDown network for sandbox 
\"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" successfully" Jan 13 20:57:54.077905 containerd[1537]: time="2025-01-13T20:57:54.077779077Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:57:54.077905 containerd[1537]: time="2025-01-13T20:57:54.077799461Z" level=info msg="RemovePodSandbox \"f744ae71fd15d30a06aaf4c82844fa464fe0aca78caa372f63a9d5e0c83938cc\" returns successfully" Jan 13 20:57:54.078136 containerd[1537]: time="2025-01-13T20:57:54.078002073Z" level=info msg="StopPodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\"" Jan 13 20:57:54.078136 containerd[1537]: time="2025-01-13T20:57:54.078041511Z" level=info msg="TearDown network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" successfully" Jan 13 20:57:54.078136 containerd[1537]: time="2025-01-13T20:57:54.078047436Z" level=info msg="StopPodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" returns successfully" Jan 13 20:57:54.078225 containerd[1537]: time="2025-01-13T20:57:54.078206272Z" level=info msg="RemovePodSandbox for \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\"" Jan 13 20:57:54.078225 containerd[1537]: time="2025-01-13T20:57:54.078218392Z" level=info msg="Forcibly stopping sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\"" Jan 13 20:57:54.078296 containerd[1537]: time="2025-01-13T20:57:54.078266765Z" level=info msg="TearDown network for sandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" successfully" Jan 13 20:57:54.079428 containerd[1537]: time="2025-01-13T20:57:54.079412331Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID 
\"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:57:54.079533 containerd[1537]: time="2025-01-13T20:57:54.079442873Z" level=info msg="RemovePodSandbox \"7d7392d5914976bb63e689d6a7f984b6e6d33b531c844abbfabe42700e946747\" returns successfully" Jan 13 20:57:54.079697 containerd[1537]: time="2025-01-13T20:57:54.079600125Z" level=info msg="StopPodSandbox for \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\"" Jan 13 20:57:54.079753 containerd[1537]: time="2025-01-13T20:57:54.079744647Z" level=info msg="TearDown network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\" successfully" Jan 13 20:57:54.079787 containerd[1537]: time="2025-01-13T20:57:54.079780611Z" level=info msg="StopPodSandbox for \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\" returns successfully" Jan 13 20:57:54.080882 containerd[1537]: time="2025-01-13T20:57:54.079966672Z" level=info msg="RemovePodSandbox for \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\"" Jan 13 20:57:54.080882 containerd[1537]: time="2025-01-13T20:57:54.079978285Z" level=info msg="Forcibly stopping sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\"" Jan 13 20:57:54.080882 containerd[1537]: time="2025-01-13T20:57:54.080014275Z" level=info msg="TearDown network for sandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\" successfully" Jan 13 20:57:54.093560 containerd[1537]: time="2025-01-13T20:57:54.093520775Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.093679 containerd[1537]: time="2025-01-13T20:57:54.093669283Z" level=info msg="RemovePodSandbox \"e7086542a2e0305c680a7697462e395c048b015ce4a48d060af304f7249dc2ca\" returns successfully" Jan 13 20:57:54.093970 containerd[1537]: time="2025-01-13T20:57:54.093959063Z" level=info msg="StopPodSandbox for \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\"" Jan 13 20:57:54.094090 containerd[1537]: time="2025-01-13T20:57:54.094049086Z" level=info msg="TearDown network for sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\" successfully" Jan 13 20:57:54.094128 containerd[1537]: time="2025-01-13T20:57:54.094121103Z" level=info msg="StopPodSandbox for \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\" returns successfully" Jan 13 20:57:54.094280 containerd[1537]: time="2025-01-13T20:57:54.094270165Z" level=info msg="RemovePodSandbox for \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\"" Jan 13 20:57:54.094327 containerd[1537]: time="2025-01-13T20:57:54.094320459Z" level=info msg="Forcibly stopping sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\"" Jan 13 20:57:54.094403 containerd[1537]: time="2025-01-13T20:57:54.094382931Z" level=info msg="TearDown network for sandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\" successfully" Jan 13 20:57:54.095600 containerd[1537]: time="2025-01-13T20:57:54.095586268Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.095656 containerd[1537]: time="2025-01-13T20:57:54.095647990Z" level=info msg="RemovePodSandbox \"0a97fd433f97cb7c7d0f093d465dad472287d6462a73e50df77ac8c844f9db42\" returns successfully" Jan 13 20:57:54.095950 containerd[1537]: time="2025-01-13T20:57:54.095903335Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:54.096010 containerd[1537]: time="2025-01-13T20:57:54.095985590Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:54.096057 containerd[1537]: time="2025-01-13T20:57:54.096007636Z" level=info msg="StopPodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:54.096952 containerd[1537]: time="2025-01-13T20:57:54.096188411Z" level=info msg="RemovePodSandbox for \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:54.096952 containerd[1537]: time="2025-01-13T20:57:54.096200796Z" level=info msg="Forcibly stopping sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\"" Jan 13 20:57:54.096952 containerd[1537]: time="2025-01-13T20:57:54.096231732Z" level=info msg="TearDown network for sandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" successfully" Jan 13 20:57:54.097741 containerd[1537]: time="2025-01-13T20:57:54.097396100Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.097741 containerd[1537]: time="2025-01-13T20:57:54.097414079Z" level=info msg="RemovePodSandbox \"7a154e38080eb71833d1310050ea7f57c2dfda8f10ffd69c1be9e6c83c2e94e1\" returns successfully" Jan 13 20:57:54.097741 containerd[1537]: time="2025-01-13T20:57:54.097529369Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:54.097741 containerd[1537]: time="2025-01-13T20:57:54.097609706Z" level=info msg="TearDown network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" successfully" Jan 13 20:57:54.097741 containerd[1537]: time="2025-01-13T20:57:54.097616844Z" level=info msg="StopPodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" returns successfully" Jan 13 20:57:54.097862 containerd[1537]: time="2025-01-13T20:57:54.097759311Z" level=info msg="RemovePodSandbox for \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:54.097862 containerd[1537]: time="2025-01-13T20:57:54.097770674Z" level=info msg="Forcibly stopping sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\"" Jan 13 20:57:54.097862 containerd[1537]: time="2025-01-13T20:57:54.097805823Z" level=info msg="TearDown network for sandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" successfully" Jan 13 20:57:54.098923 containerd[1537]: time="2025-01-13T20:57:54.098908398Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.098958 containerd[1537]: time="2025-01-13T20:57:54.098934607Z" level=info msg="RemovePodSandbox \"2a43d0fbec035d4d390a396e9a0b673f5bf0fb5d81e83e7070766c46f20ecc2b\" returns successfully" Jan 13 20:57:54.099168 containerd[1537]: time="2025-01-13T20:57:54.099081994Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" Jan 13 20:57:54.099168 containerd[1537]: time="2025-01-13T20:57:54.099131968Z" level=info msg="TearDown network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" successfully" Jan 13 20:57:54.099168 containerd[1537]: time="2025-01-13T20:57:54.099139472Z" level=info msg="StopPodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" returns successfully" Jan 13 20:57:54.099285 containerd[1537]: time="2025-01-13T20:57:54.099267745Z" level=info msg="RemovePodSandbox for \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" Jan 13 20:57:54.099317 containerd[1537]: time="2025-01-13T20:57:54.099307447Z" level=info msg="Forcibly stopping sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\"" Jan 13 20:57:54.099398 containerd[1537]: time="2025-01-13T20:57:54.099340193Z" level=info msg="TearDown network for sandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" successfully" Jan 13 20:57:54.100498 containerd[1537]: time="2025-01-13T20:57:54.100483756Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.100529 containerd[1537]: time="2025-01-13T20:57:54.100505813Z" level=info msg="RemovePodSandbox \"83e3e6d8d5cb102bc2357633508187e7edcf6d7637eb6110c8990cc48455bdb3\" returns successfully" Jan 13 20:57:54.100638 containerd[1537]: time="2025-01-13T20:57:54.100623450Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\"" Jan 13 20:57:54.100730 containerd[1537]: time="2025-01-13T20:57:54.100715564Z" level=info msg="TearDown network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" successfully" Jan 13 20:57:54.100730 containerd[1537]: time="2025-01-13T20:57:54.100727219Z" level=info msg="StopPodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" returns successfully" Jan 13 20:57:54.100910 containerd[1537]: time="2025-01-13T20:57:54.100895573Z" level=info msg="RemovePodSandbox for \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\"" Jan 13 20:57:54.100910 containerd[1537]: time="2025-01-13T20:57:54.100909960Z" level=info msg="Forcibly stopping sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\"" Jan 13 20:57:54.101021 containerd[1537]: time="2025-01-13T20:57:54.100995356Z" level=info msg="TearDown network for sandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" successfully" Jan 13 20:57:54.102149 containerd[1537]: time="2025-01-13T20:57:54.102131884Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.102183 containerd[1537]: time="2025-01-13T20:57:54.102159998Z" level=info msg="RemovePodSandbox \"c2369c2cc2076b65cd529ff6fc24cbb8345f0d06480ef9c776918e2ba9b7c020\" returns successfully" Jan 13 20:57:54.102305 containerd[1537]: time="2025-01-13T20:57:54.102290642Z" level=info msg="StopPodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\"" Jan 13 20:57:54.102383 containerd[1537]: time="2025-01-13T20:57:54.102370523Z" level=info msg="TearDown network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" successfully" Jan 13 20:57:54.102423 containerd[1537]: time="2025-01-13T20:57:54.102381564Z" level=info msg="StopPodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" returns successfully" Jan 13 20:57:54.102525 containerd[1537]: time="2025-01-13T20:57:54.102511859Z" level=info msg="RemovePodSandbox for \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\"" Jan 13 20:57:54.102556 containerd[1537]: time="2025-01-13T20:57:54.102524945Z" level=info msg="Forcibly stopping sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\"" Jan 13 20:57:54.102576 containerd[1537]: time="2025-01-13T20:57:54.102559144Z" level=info msg="TearDown network for sandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" successfully" Jan 13 20:57:54.103738 containerd[1537]: time="2025-01-13T20:57:54.103722419Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.103770 containerd[1537]: time="2025-01-13T20:57:54.103746711Z" level=info msg="RemovePodSandbox \"ca0258fd416d02ef9bf7525048f4ecba90940956c8edb0032e402acce8199fc6\" returns successfully" Jan 13 20:57:54.104048 containerd[1537]: time="2025-01-13T20:57:54.103960442Z" level=info msg="StopPodSandbox for \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\"" Jan 13 20:57:54.104048 containerd[1537]: time="2025-01-13T20:57:54.104001535Z" level=info msg="TearDown network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\" successfully" Jan 13 20:57:54.104048 containerd[1537]: time="2025-01-13T20:57:54.104007785Z" level=info msg="StopPodSandbox for \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\" returns successfully" Jan 13 20:57:54.104181 containerd[1537]: time="2025-01-13T20:57:54.104164734Z" level=info msg="RemovePodSandbox for \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\"" Jan 13 20:57:54.104207 containerd[1537]: time="2025-01-13T20:57:54.104182099Z" level=info msg="Forcibly stopping sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\"" Jan 13 20:57:54.104303 containerd[1537]: time="2025-01-13T20:57:54.104278763Z" level=info msg="TearDown network for sandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\" successfully" Jan 13 20:57:54.105391 containerd[1537]: time="2025-01-13T20:57:54.105376760Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.105422 containerd[1537]: time="2025-01-13T20:57:54.105400509Z" level=info msg="RemovePodSandbox \"2448c96de60bd9f03478044fe6c3ded2f39f1a51acf3348034d23f30323d70be\" returns successfully" Jan 13 20:57:54.105550 containerd[1537]: time="2025-01-13T20:57:54.105530848Z" level=info msg="StopPodSandbox for \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\"" Jan 13 20:57:54.105590 containerd[1537]: time="2025-01-13T20:57:54.105575934Z" level=info msg="TearDown network for sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\" successfully" Jan 13 20:57:54.105629 containerd[1537]: time="2025-01-13T20:57:54.105615819Z" level=info msg="StopPodSandbox for \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\" returns successfully" Jan 13 20:57:54.105800 containerd[1537]: time="2025-01-13T20:57:54.105787529Z" level=info msg="RemovePodSandbox for \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\"" Jan 13 20:57:54.105832 containerd[1537]: time="2025-01-13T20:57:54.105801242Z" level=info msg="Forcibly stopping sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\"" Jan 13 20:57:54.105865 containerd[1537]: time="2025-01-13T20:57:54.105846151Z" level=info msg="TearDown network for sandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\" successfully" Jan 13 20:57:54.106946 containerd[1537]: time="2025-01-13T20:57:54.106931227Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.107163 containerd[1537]: time="2025-01-13T20:57:54.106951189Z" level=info msg="RemovePodSandbox \"7699a5ef43d8d333ea705f11f9970054e2ebe047fa14178c2d6c1c0747acbb74\" returns successfully" Jan 13 20:57:54.107163 containerd[1537]: time="2025-01-13T20:57:54.107090574Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:54.107163 containerd[1537]: time="2025-01-13T20:57:54.107152793Z" level=info msg="TearDown network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" successfully" Jan 13 20:57:54.107163 containerd[1537]: time="2025-01-13T20:57:54.107160305Z" level=info msg="StopPodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" returns successfully" Jan 13 20:57:54.107475 containerd[1537]: time="2025-01-13T20:57:54.107278852Z" level=info msg="RemovePodSandbox for \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:54.107475 containerd[1537]: time="2025-01-13T20:57:54.107288281Z" level=info msg="Forcibly stopping sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\"" Jan 13 20:57:54.107475 containerd[1537]: time="2025-01-13T20:57:54.107325178Z" level=info msg="TearDown network for sandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" successfully" Jan 13 20:57:54.108555 containerd[1537]: time="2025-01-13T20:57:54.108538895Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.108584 containerd[1537]: time="2025-01-13T20:57:54.108562012Z" level=info msg="RemovePodSandbox \"839304760da24c1441d0112e871319adc91f7c5835494134e18e0a6874da3df3\" returns successfully" Jan 13 20:57:54.108752 containerd[1537]: time="2025-01-13T20:57:54.108738483Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" Jan 13 20:57:54.108793 containerd[1537]: time="2025-01-13T20:57:54.108782233Z" level=info msg="TearDown network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" successfully" Jan 13 20:57:54.108793 containerd[1537]: time="2025-01-13T20:57:54.108791199Z" level=info msg="StopPodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" returns successfully" Jan 13 20:57:54.108961 containerd[1537]: time="2025-01-13T20:57:54.108947364Z" level=info msg="RemovePodSandbox for \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" Jan 13 20:57:54.108987 containerd[1537]: time="2025-01-13T20:57:54.108973479Z" level=info msg="Forcibly stopping sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\"" Jan 13 20:57:54.109053 containerd[1537]: time="2025-01-13T20:57:54.109008726Z" level=info msg="TearDown network for sandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" successfully" Jan 13 20:57:54.110101 containerd[1537]: time="2025-01-13T20:57:54.110085190Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.110318 containerd[1537]: time="2025-01-13T20:57:54.110108582Z" level=info msg="RemovePodSandbox \"00e7310bac04c5c0e1e5cb2b4c508acd9b69c0cd4b4ff4478530f1bce69b95a4\" returns successfully" Jan 13 20:57:54.110318 containerd[1537]: time="2025-01-13T20:57:54.110236716Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\"" Jan 13 20:57:54.110318 containerd[1537]: time="2025-01-13T20:57:54.110279746Z" level=info msg="TearDown network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" successfully" Jan 13 20:57:54.110318 containerd[1537]: time="2025-01-13T20:57:54.110285986Z" level=info msg="StopPodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" returns successfully" Jan 13 20:57:54.110443 containerd[1537]: time="2025-01-13T20:57:54.110406620Z" level=info msg="RemovePodSandbox for \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\"" Jan 13 20:57:54.110469 containerd[1537]: time="2025-01-13T20:57:54.110444037Z" level=info msg="Forcibly stopping sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\"" Jan 13 20:57:54.110533 containerd[1537]: time="2025-01-13T20:57:54.110479028Z" level=info msg="TearDown network for sandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" successfully" Jan 13 20:57:54.116064 containerd[1537]: time="2025-01-13T20:57:54.116048114Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.116180 containerd[1537]: time="2025-01-13T20:57:54.116070608Z" level=info msg="RemovePodSandbox \"d423032bae4ae844f3e41b03e3d518d17f02f70a9a0fdf0130c6e5341f87ae92\" returns successfully" Jan 13 20:57:54.116236 containerd[1537]: time="2025-01-13T20:57:54.116220324Z" level=info msg="StopPodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\"" Jan 13 20:57:54.116279 containerd[1537]: time="2025-01-13T20:57:54.116265397Z" level=info msg="TearDown network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" successfully" Jan 13 20:57:54.116279 containerd[1537]: time="2025-01-13T20:57:54.116276550Z" level=info msg="StopPodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" returns successfully" Jan 13 20:57:54.116409 containerd[1537]: time="2025-01-13T20:57:54.116394183Z" level=info msg="RemovePodSandbox for \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\"" Jan 13 20:57:54.116434 containerd[1537]: time="2025-01-13T20:57:54.116409657Z" level=info msg="Forcibly stopping sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\"" Jan 13 20:57:54.116476 containerd[1537]: time="2025-01-13T20:57:54.116453471Z" level=info msg="TearDown network for sandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" successfully" Jan 13 20:57:54.117623 containerd[1537]: time="2025-01-13T20:57:54.117607129Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.117652 containerd[1537]: time="2025-01-13T20:57:54.117627917Z" level=info msg="RemovePodSandbox \"0b2ab397a4622eff468fae535b2c2696e637a6df048ab4e936ca4989959a4777\" returns successfully" Jan 13 20:57:54.117770 containerd[1537]: time="2025-01-13T20:57:54.117754611Z" level=info msg="StopPodSandbox for \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\"" Jan 13 20:57:54.117811 containerd[1537]: time="2025-01-13T20:57:54.117800160Z" level=info msg="TearDown network for sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\" successfully" Jan 13 20:57:54.117811 containerd[1537]: time="2025-01-13T20:57:54.117806746Z" level=info msg="StopPodSandbox for \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\" returns successfully" Jan 13 20:57:54.117991 containerd[1537]: time="2025-01-13T20:57:54.117978577Z" level=info msg="RemovePodSandbox for \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\"" Jan 13 20:57:54.118012 containerd[1537]: time="2025-01-13T20:57:54.117992702Z" level=info msg="Forcibly stopping sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\"" Jan 13 20:57:54.118082 containerd[1537]: time="2025-01-13T20:57:54.118059576Z" level=info msg="TearDown network for sandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\" successfully" Jan 13 20:57:54.119193 containerd[1537]: time="2025-01-13T20:57:54.119177707Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.119234 containerd[1537]: time="2025-01-13T20:57:54.119198833Z" level=info msg="RemovePodSandbox \"fbb2ae7ced53b36cf295e74da63c4bc3bb5b01eaa87305770e7f52c092467c2b\" returns successfully" Jan 13 20:57:54.119328 containerd[1537]: time="2025-01-13T20:57:54.119313675Z" level=info msg="StopPodSandbox for \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\"" Jan 13 20:57:54.119381 containerd[1537]: time="2025-01-13T20:57:54.119368686Z" level=info msg="TearDown network for sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\" successfully" Jan 13 20:57:54.119404 containerd[1537]: time="2025-01-13T20:57:54.119380026Z" level=info msg="StopPodSandbox for \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\" returns successfully" Jan 13 20:57:54.119546 containerd[1537]: time="2025-01-13T20:57:54.119532775Z" level=info msg="RemovePodSandbox for \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\"" Jan 13 20:57:54.119570 containerd[1537]: time="2025-01-13T20:57:54.119547332Z" level=info msg="Forcibly stopping sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\"" Jan 13 20:57:54.119685 containerd[1537]: time="2025-01-13T20:57:54.119657230Z" level=info msg="TearDown network for sandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\" successfully" Jan 13 20:57:54.120797 containerd[1537]: time="2025-01-13T20:57:54.120781660Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.120848 containerd[1537]: time="2025-01-13T20:57:54.120807022Z" level=info msg="RemovePodSandbox \"568cc7f336833494a2465f9307a7b5a0965dc9be43fed13883783bc3aeb63aeb\" returns successfully" Jan 13 20:57:54.120954 containerd[1537]: time="2025-01-13T20:57:54.120939372Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:54.120993 containerd[1537]: time="2025-01-13T20:57:54.120982227Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:54.121014 containerd[1537]: time="2025-01-13T20:57:54.120991850Z" level=info msg="StopPodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns successfully" Jan 13 20:57:54.121155 containerd[1537]: time="2025-01-13T20:57:54.121141645Z" level=info msg="RemovePodSandbox for \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:54.121178 containerd[1537]: time="2025-01-13T20:57:54.121157063Z" level=info msg="Forcibly stopping sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\"" Jan 13 20:57:54.121272 containerd[1537]: time="2025-01-13T20:57:54.121236184Z" level=info msg="TearDown network for sandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" successfully" Jan 13 20:57:54.122339 containerd[1537]: time="2025-01-13T20:57:54.122323787Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.122368 containerd[1537]: time="2025-01-13T20:57:54.122344701Z" level=info msg="RemovePodSandbox \"a1b3f6c4906c5373d2ef8287bcc66f55a5a60c1c25b4ef0a9e87cbce13bb75e0\" returns successfully" Jan 13 20:57:54.122488 containerd[1537]: time="2025-01-13T20:57:54.122474280Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:54.122579 containerd[1537]: time="2025-01-13T20:57:54.122565530Z" level=info msg="TearDown network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" successfully" Jan 13 20:57:54.122608 containerd[1537]: time="2025-01-13T20:57:54.122589131Z" level=info msg="StopPodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" returns successfully" Jan 13 20:57:54.122725 containerd[1537]: time="2025-01-13T20:57:54.122711898Z" level=info msg="RemovePodSandbox for \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:54.122746 containerd[1537]: time="2025-01-13T20:57:54.122725752Z" level=info msg="Forcibly stopping sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\"" Jan 13 20:57:54.122777 containerd[1537]: time="2025-01-13T20:57:54.122756868Z" level=info msg="TearDown network for sandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" successfully" Jan 13 20:57:54.123962 containerd[1537]: time="2025-01-13T20:57:54.123942769Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.123986 containerd[1537]: time="2025-01-13T20:57:54.123968081Z" level=info msg="RemovePodSandbox \"4c377cd7fb128014144acebbd16d88c24ae636429705895d9aa736fd0181a87f\" returns successfully" Jan 13 20:57:54.124107 containerd[1537]: time="2025-01-13T20:57:54.124091317Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" Jan 13 20:57:54.124152 containerd[1537]: time="2025-01-13T20:57:54.124138325Z" level=info msg="TearDown network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" successfully" Jan 13 20:57:54.124152 containerd[1537]: time="2025-01-13T20:57:54.124149224Z" level=info msg="StopPodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" returns successfully" Jan 13 20:57:54.124310 containerd[1537]: time="2025-01-13T20:57:54.124296098Z" level=info msg="RemovePodSandbox for \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" Jan 13 20:57:54.124332 containerd[1537]: time="2025-01-13T20:57:54.124310233Z" level=info msg="Forcibly stopping sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\"" Jan 13 20:57:54.124498 containerd[1537]: time="2025-01-13T20:57:54.124341412Z" level=info msg="TearDown network for sandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" successfully" Jan 13 20:57:54.125501 containerd[1537]: time="2025-01-13T20:57:54.125486375Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.125531 containerd[1537]: time="2025-01-13T20:57:54.125508123Z" level=info msg="RemovePodSandbox \"190027a912e9efb35dff79d082628806028161f65d4e981515671a94d9bc7ef6\" returns successfully" Jan 13 20:57:54.125717 containerd[1537]: time="2025-01-13T20:57:54.125632542Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\"" Jan 13 20:57:54.125717 containerd[1537]: time="2025-01-13T20:57:54.125676115Z" level=info msg="TearDown network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" successfully" Jan 13 20:57:54.125717 containerd[1537]: time="2025-01-13T20:57:54.125682472Z" level=info msg="StopPodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" returns successfully" Jan 13 20:57:54.125862 containerd[1537]: time="2025-01-13T20:57:54.125846724Z" level=info msg="RemovePodSandbox for \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\"" Jan 13 20:57:54.126560 containerd[1537]: time="2025-01-13T20:57:54.125904990Z" level=info msg="Forcibly stopping sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\"" Jan 13 20:57:54.126560 containerd[1537]: time="2025-01-13T20:57:54.125938974Z" level=info msg="TearDown network for sandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" successfully" Jan 13 20:57:54.126940 containerd[1537]: time="2025-01-13T20:57:54.126927739Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.126994 containerd[1537]: time="2025-01-13T20:57:54.126985970Z" level=info msg="RemovePodSandbox \"c4bf229b6a5ec21abed8cc47d3c13478ec6d49afe4b68a0f4385e55ad0e14a51\" returns successfully" Jan 13 20:57:54.127198 containerd[1537]: time="2025-01-13T20:57:54.127189154Z" level=info msg="StopPodSandbox for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\"" Jan 13 20:57:54.127268 containerd[1537]: time="2025-01-13T20:57:54.127260311Z" level=info msg="TearDown network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" successfully" Jan 13 20:57:54.127308 containerd[1537]: time="2025-01-13T20:57:54.127301460Z" level=info msg="StopPodSandbox for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" returns successfully" Jan 13 20:57:54.127687 containerd[1537]: time="2025-01-13T20:57:54.127567673Z" level=info msg="RemovePodSandbox for \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\"" Jan 13 20:57:54.127687 containerd[1537]: time="2025-01-13T20:57:54.127583228Z" level=info msg="Forcibly stopping sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\"" Jan 13 20:57:54.127725 containerd[1537]: time="2025-01-13T20:57:54.127704704Z" level=info msg="TearDown network for sandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" successfully" Jan 13 20:57:54.128850 containerd[1537]: time="2025-01-13T20:57:54.128820381Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.128901 containerd[1537]: time="2025-01-13T20:57:54.128858142Z" level=info msg="RemovePodSandbox \"7e7c2a6fdcb0bcf511ff933c9577b8adba3d058b9f7f6117a7ed04e8e23afcc7\" returns successfully" Jan 13 20:57:54.129126 containerd[1537]: time="2025-01-13T20:57:54.129007495Z" level=info msg="StopPodSandbox for \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\"" Jan 13 20:57:54.129126 containerd[1537]: time="2025-01-13T20:57:54.129047287Z" level=info msg="TearDown network for sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\" successfully" Jan 13 20:57:54.129126 containerd[1537]: time="2025-01-13T20:57:54.129053225Z" level=info msg="StopPodSandbox for \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\" returns successfully" Jan 13 20:57:54.129844 containerd[1537]: time="2025-01-13T20:57:54.129216418Z" level=info msg="RemovePodSandbox for \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\"" Jan 13 20:57:54.129844 containerd[1537]: time="2025-01-13T20:57:54.129228314Z" level=info msg="Forcibly stopping sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\"" Jan 13 20:57:54.129844 containerd[1537]: time="2025-01-13T20:57:54.129255673Z" level=info msg="TearDown network for sandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\" successfully" Jan 13 20:57:54.130300 containerd[1537]: time="2025-01-13T20:57:54.130283280Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.130336 containerd[1537]: time="2025-01-13T20:57:54.130305400Z" level=info msg="RemovePodSandbox \"2870fa0280fa781fbc6d9f505f315012db9656e327d1c235f16b013bcd2685ac\" returns successfully" Jan 13 20:57:54.130522 containerd[1537]: time="2025-01-13T20:57:54.130471716Z" level=info msg="StopPodSandbox for \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\"" Jan 13 20:57:54.130693 containerd[1537]: time="2025-01-13T20:57:54.130587750Z" level=info msg="TearDown network for sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\" successfully" Jan 13 20:57:54.130693 containerd[1537]: time="2025-01-13T20:57:54.130606116Z" level=info msg="StopPodSandbox for \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\" returns successfully" Jan 13 20:57:54.130776 containerd[1537]: time="2025-01-13T20:57:54.130724378Z" level=info msg="RemovePodSandbox for \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\"" Jan 13 20:57:54.130776 containerd[1537]: time="2025-01-13T20:57:54.130734763Z" level=info msg="Forcibly stopping sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\"" Jan 13 20:57:54.130776 containerd[1537]: time="2025-01-13T20:57:54.130765182Z" level=info msg="TearDown network for sandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\" successfully" Jan 13 20:57:54.131789 containerd[1537]: time="2025-01-13T20:57:54.131772622Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.131834 containerd[1537]: time="2025-01-13T20:57:54.131795582Z" level=info msg="RemovePodSandbox \"27dc24fbd2094ecd0bd57a59a33da29655263c612e4c3ae63c51fe0ff1f338dd\" returns successfully" Jan 13 20:57:54.132066 containerd[1537]: time="2025-01-13T20:57:54.131951549Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:54.132066 containerd[1537]: time="2025-01-13T20:57:54.131994078Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:54.132066 containerd[1537]: time="2025-01-13T20:57:54.132015819Z" level=info msg="StopPodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:54.132251 containerd[1537]: time="2025-01-13T20:57:54.132239118Z" level=info msg="RemovePodSandbox for \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:54.132279 containerd[1537]: time="2025-01-13T20:57:54.132250712Z" level=info msg="Forcibly stopping sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\"" Jan 13 20:57:54.132296 containerd[1537]: time="2025-01-13T20:57:54.132282232Z" level=info msg="TearDown network for sandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" successfully" Jan 13 20:57:54.133381 containerd[1537]: time="2025-01-13T20:57:54.133364659Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.133416 containerd[1537]: time="2025-01-13T20:57:54.133385057Z" level=info msg="RemovePodSandbox \"e9dc9c06bc0fd5d4321bf95e5126d04b62bf69c1aa45b80fe59461b7e3589dd0\" returns successfully" Jan 13 20:57:54.133669 containerd[1537]: time="2025-01-13T20:57:54.133534054Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:54.133669 containerd[1537]: time="2025-01-13T20:57:54.133574595Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:54.133669 containerd[1537]: time="2025-01-13T20:57:54.133581044Z" level=info msg="StopPodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:54.133746 containerd[1537]: time="2025-01-13T20:57:54.133703154Z" level=info msg="RemovePodSandbox for \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:54.133746 containerd[1537]: time="2025-01-13T20:57:54.133714116Z" level=info msg="Forcibly stopping sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\"" Jan 13 20:57:54.133781 containerd[1537]: time="2025-01-13T20:57:54.133744554Z" level=info msg="TearDown network for sandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" successfully" Jan 13 20:57:54.134804 containerd[1537]: time="2025-01-13T20:57:54.134789123Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.134851 containerd[1537]: time="2025-01-13T20:57:54.134810570Z" level=info msg="RemovePodSandbox \"c24fbfe4ea9a2e5c9bbeadf2a28f481205668c3eaea9a51be25d6231a85afa74\" returns successfully" Jan 13 20:57:54.134976 containerd[1537]: time="2025-01-13T20:57:54.134958991Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:54.135003 containerd[1537]: time="2025-01-13T20:57:54.134996719Z" level=info msg="TearDown network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" successfully" Jan 13 20:57:54.135020 containerd[1537]: time="2025-01-13T20:57:54.135002646Z" level=info msg="StopPodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" returns successfully" Jan 13 20:57:54.135873 containerd[1537]: time="2025-01-13T20:57:54.135128924Z" level=info msg="RemovePodSandbox for \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:54.135873 containerd[1537]: time="2025-01-13T20:57:54.135140147Z" level=info msg="Forcibly stopping sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\"" Jan 13 20:57:54.135873 containerd[1537]: time="2025-01-13T20:57:54.135170725Z" level=info msg="TearDown network for sandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" successfully" Jan 13 20:57:54.136222 containerd[1537]: time="2025-01-13T20:57:54.136205672Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.136250 containerd[1537]: time="2025-01-13T20:57:54.136227462Z" level=info msg="RemovePodSandbox \"0a531d9bfb18145726a03cb37250b2870739cff8641d422f97bea76c65d667b8\" returns successfully" Jan 13 20:57:54.136358 containerd[1537]: time="2025-01-13T20:57:54.136347981Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" Jan 13 20:57:54.136459 containerd[1537]: time="2025-01-13T20:57:54.136427373Z" level=info msg="TearDown network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" successfully" Jan 13 20:57:54.136459 containerd[1537]: time="2025-01-13T20:57:54.136435267Z" level=info msg="StopPodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" returns successfully" Jan 13 20:57:54.137275 containerd[1537]: time="2025-01-13T20:57:54.136582326Z" level=info msg="RemovePodSandbox for \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" Jan 13 20:57:54.137275 containerd[1537]: time="2025-01-13T20:57:54.136593163Z" level=info msg="Forcibly stopping sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\"" Jan 13 20:57:54.137275 containerd[1537]: time="2025-01-13T20:57:54.136621746Z" level=info msg="TearDown network for sandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" successfully" Jan 13 20:57:54.137683 containerd[1537]: time="2025-01-13T20:57:54.137671522Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.137753 containerd[1537]: time="2025-01-13T20:57:54.137743543Z" level=info msg="RemovePodSandbox \"547f1197d881175b62bd47cd89a3a48b24819878e975ae9b9db7107a6e2bfc6e\" returns successfully" Jan 13 20:57:54.137909 containerd[1537]: time="2025-01-13T20:57:54.137895741Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\"" Jan 13 20:57:54.137951 containerd[1537]: time="2025-01-13T20:57:54.137940048Z" level=info msg="TearDown network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" successfully" Jan 13 20:57:54.137951 containerd[1537]: time="2025-01-13T20:57:54.137948388Z" level=info msg="StopPodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" returns successfully" Jan 13 20:57:54.138088 containerd[1537]: time="2025-01-13T20:57:54.138078087Z" level=info msg="RemovePodSandbox for \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\"" Jan 13 20:57:54.138170 containerd[1537]: time="2025-01-13T20:57:54.138136044Z" level=info msg="Forcibly stopping sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\"" Jan 13 20:57:54.138254 containerd[1537]: time="2025-01-13T20:57:54.138220913Z" level=info msg="TearDown network for sandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" successfully" Jan 13 20:57:54.139324 containerd[1537]: time="2025-01-13T20:57:54.139272484Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.139324 containerd[1537]: time="2025-01-13T20:57:54.139292373Z" level=info msg="RemovePodSandbox \"9283f993d5172de77f27192c5e4d3351fad03e1ce2fd6a417ebd34c4cba5071d\" returns successfully" Jan 13 20:57:54.140454 containerd[1537]: time="2025-01-13T20:57:54.139426028Z" level=info msg="StopPodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\"" Jan 13 20:57:54.140454 containerd[1537]: time="2025-01-13T20:57:54.139518529Z" level=info msg="TearDown network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" successfully" Jan 13 20:57:54.140454 containerd[1537]: time="2025-01-13T20:57:54.139525300Z" level=info msg="StopPodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" returns successfully" Jan 13 20:57:54.140454 containerd[1537]: time="2025-01-13T20:57:54.139657674Z" level=info msg="RemovePodSandbox for \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\"" Jan 13 20:57:54.140454 containerd[1537]: time="2025-01-13T20:57:54.139668377Z" level=info msg="Forcibly stopping sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\"" Jan 13 20:57:54.140454 containerd[1537]: time="2025-01-13T20:57:54.139700094Z" level=info msg="TearDown network for sandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" successfully" Jan 13 20:57:54.140873 containerd[1537]: time="2025-01-13T20:57:54.140861913Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.140922 containerd[1537]: time="2025-01-13T20:57:54.140913775Z" level=info msg="RemovePodSandbox \"a0f1ba0145a51721bd3e83054dd40d4ab938cad51fad15dd9a1f883b78aa70ce\" returns successfully" Jan 13 20:57:54.141125 containerd[1537]: time="2025-01-13T20:57:54.141109665Z" level=info msg="StopPodSandbox for \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\"" Jan 13 20:57:54.141198 containerd[1537]: time="2025-01-13T20:57:54.141185267Z" level=info msg="TearDown network for sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\" successfully" Jan 13 20:57:54.141198 containerd[1537]: time="2025-01-13T20:57:54.141196036Z" level=info msg="StopPodSandbox for \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\" returns successfully" Jan 13 20:57:54.141479 containerd[1537]: time="2025-01-13T20:57:54.141337050Z" level=info msg="RemovePodSandbox for \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\"" Jan 13 20:57:54.141479 containerd[1537]: time="2025-01-13T20:57:54.141348292Z" level=info msg="Forcibly stopping sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\"" Jan 13 20:57:54.141479 containerd[1537]: time="2025-01-13T20:57:54.141429958Z" level=info msg="TearDown network for sandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\" successfully" Jan 13 20:57:54.142516 containerd[1537]: time="2025-01-13T20:57:54.142499946Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.142544 containerd[1537]: time="2025-01-13T20:57:54.142521974Z" level=info msg="RemovePodSandbox \"9651cfd1f3f5182a4fa49aa6cd7cfe6198330be54c67cd9c8fefa531d252b91a\" returns successfully" Jan 13 20:57:54.142697 containerd[1537]: time="2025-01-13T20:57:54.142681741Z" level=info msg="StopPodSandbox for \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\"" Jan 13 20:57:54.142735 containerd[1537]: time="2025-01-13T20:57:54.142723026Z" level=info msg="TearDown network for sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\" successfully" Jan 13 20:57:54.142735 containerd[1537]: time="2025-01-13T20:57:54.142732715Z" level=info msg="StopPodSandbox for \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\" returns successfully" Jan 13 20:57:54.142866 containerd[1537]: time="2025-01-13T20:57:54.142851930Z" level=info msg="RemovePodSandbox for \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\"" Jan 13 20:57:54.142892 containerd[1537]: time="2025-01-13T20:57:54.142866936Z" level=info msg="Forcibly stopping sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\"" Jan 13 20:57:54.142911 containerd[1537]: time="2025-01-13T20:57:54.142897767Z" level=info msg="TearDown network for sandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\" successfully" Jan 13 20:57:54.143985 containerd[1537]: time="2025-01-13T20:57:54.143970056Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.144015 containerd[1537]: time="2025-01-13T20:57:54.143993158Z" level=info msg="RemovePodSandbox \"9450fb597d4393f3012448045ec5c86e67242c67d55528a0d99ca17339b1fe43\" returns successfully" Jan 13 20:57:54.144280 containerd[1537]: time="2025-01-13T20:57:54.144208907Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:54.144280 containerd[1537]: time="2025-01-13T20:57:54.144247635Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:54.144280 containerd[1537]: time="2025-01-13T20:57:54.144253788Z" level=info msg="StopPodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:54.144382 containerd[1537]: time="2025-01-13T20:57:54.144366918Z" level=info msg="RemovePodSandbox for \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:54.144402 containerd[1537]: time="2025-01-13T20:57:54.144382097Z" level=info msg="Forcibly stopping sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\"" Jan 13 20:57:54.144435 containerd[1537]: time="2025-01-13T20:57:54.144414643Z" level=info msg="TearDown network for sandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" successfully" Jan 13 20:57:54.145582 containerd[1537]: time="2025-01-13T20:57:54.145567869Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.145622 containerd[1537]: time="2025-01-13T20:57:54.145588728Z" level=info msg="RemovePodSandbox \"1cc9e035ba424df699bb0ee8e3fe998336e1d875e6cb6a09e97bfda9bf041d46\" returns successfully" Jan 13 20:57:54.145785 containerd[1537]: time="2025-01-13T20:57:54.145698497Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:54.145785 containerd[1537]: time="2025-01-13T20:57:54.145753172Z" level=info msg="TearDown network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" successfully" Jan 13 20:57:54.145785 containerd[1537]: time="2025-01-13T20:57:54.145760064Z" level=info msg="StopPodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" returns successfully" Jan 13 20:57:54.145896 containerd[1537]: time="2025-01-13T20:57:54.145881265Z" level=info msg="RemovePodSandbox for \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:54.145918 containerd[1537]: time="2025-01-13T20:57:54.145895565Z" level=info msg="Forcibly stopping sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\"" Jan 13 20:57:54.145953 containerd[1537]: time="2025-01-13T20:57:54.145929494Z" level=info msg="TearDown network for sandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" successfully" Jan 13 20:57:54.147061 containerd[1537]: time="2025-01-13T20:57:54.147045767Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.147104 containerd[1537]: time="2025-01-13T20:57:54.147068481Z" level=info msg="RemovePodSandbox \"23a86b163a91238f48edeb45bf488e99f35e1c664665927516952ff7959f68c7\" returns successfully" Jan 13 20:57:54.147202 containerd[1537]: time="2025-01-13T20:57:54.147179880Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" Jan 13 20:57:54.147275 containerd[1537]: time="2025-01-13T20:57:54.147259181Z" level=info msg="TearDown network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" successfully" Jan 13 20:57:54.147334 containerd[1537]: time="2025-01-13T20:57:54.147303198Z" level=info msg="StopPodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" returns successfully" Jan 13 20:57:54.147426 containerd[1537]: time="2025-01-13T20:57:54.147414642Z" level=info msg="RemovePodSandbox for \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" Jan 13 20:57:54.147453 containerd[1537]: time="2025-01-13T20:57:54.147426888Z" level=info msg="Forcibly stopping sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\"" Jan 13 20:57:54.147472 containerd[1537]: time="2025-01-13T20:57:54.147458592Z" level=info msg="TearDown network for sandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" successfully" Jan 13 20:57:54.148485 containerd[1537]: time="2025-01-13T20:57:54.148470153Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.148510 containerd[1537]: time="2025-01-13T20:57:54.148495235Z" level=info msg="RemovePodSandbox \"298762680c51c3d83cce39b3d673f86e02d79f5511c6d6c0babd76b042dadb5d\" returns successfully" Jan 13 20:57:54.148681 containerd[1537]: time="2025-01-13T20:57:54.148616798Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\"" Jan 13 20:57:54.148681 containerd[1537]: time="2025-01-13T20:57:54.148653298Z" level=info msg="TearDown network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" successfully" Jan 13 20:57:54.148681 containerd[1537]: time="2025-01-13T20:57:54.148659041Z" level=info msg="StopPodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" returns successfully" Jan 13 20:57:54.149266 containerd[1537]: time="2025-01-13T20:57:54.149145740Z" level=info msg="RemovePodSandbox for \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\"" Jan 13 20:57:54.149266 containerd[1537]: time="2025-01-13T20:57:54.149157927Z" level=info msg="Forcibly stopping sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\"" Jan 13 20:57:54.150312 containerd[1537]: time="2025-01-13T20:57:54.149321513Z" level=info msg="TearDown network for sandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" successfully" Jan 13 20:57:54.151337 containerd[1537]: time="2025-01-13T20:57:54.151325021Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.151410 containerd[1537]: time="2025-01-13T20:57:54.151401203Z" level=info msg="RemovePodSandbox \"420aeb1f0db27650b9d149bfef7f2dececa7e3ed0dd764126e712f3f264189f0\" returns successfully" Jan 13 20:57:54.152632 containerd[1537]: time="2025-01-13T20:57:54.152620774Z" level=info msg="StopPodSandbox for \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\"" Jan 13 20:57:54.152804 containerd[1537]: time="2025-01-13T20:57:54.152795548Z" level=info msg="TearDown network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" successfully" Jan 13 20:57:54.152943 containerd[1537]: time="2025-01-13T20:57:54.152934840Z" level=info msg="StopPodSandbox for \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" returns successfully" Jan 13 20:57:54.153262 containerd[1537]: time="2025-01-13T20:57:54.153252749Z" level=info msg="RemovePodSandbox for \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\"" Jan 13 20:57:54.153318 containerd[1537]: time="2025-01-13T20:57:54.153311174Z" level=info msg="Forcibly stopping sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\"" Jan 13 20:57:54.153404 containerd[1537]: time="2025-01-13T20:57:54.153383408Z" level=info msg="TearDown network for sandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" successfully" Jan 13 20:57:54.154612 containerd[1537]: time="2025-01-13T20:57:54.154600920Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.154681 containerd[1537]: time="2025-01-13T20:57:54.154672302Z" level=info msg="RemovePodSandbox \"f114e1d00640362f483a3d1fbdb0f6e50e5c4b1c175821549d60664b318b4f75\" returns successfully" Jan 13 20:57:54.154892 containerd[1537]: time="2025-01-13T20:57:54.154871162Z" level=info msg="StopPodSandbox for \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\"" Jan 13 20:57:54.154926 containerd[1537]: time="2025-01-13T20:57:54.154914808Z" level=info msg="TearDown network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\" successfully" Jan 13 20:57:54.154926 containerd[1537]: time="2025-01-13T20:57:54.154923223Z" level=info msg="StopPodSandbox for \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\" returns successfully" Jan 13 20:57:54.155749 containerd[1537]: time="2025-01-13T20:57:54.155086886Z" level=info msg="RemovePodSandbox for \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\"" Jan 13 20:57:54.155749 containerd[1537]: time="2025-01-13T20:57:54.155098844Z" level=info msg="Forcibly stopping sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\"" Jan 13 20:57:54.155749 containerd[1537]: time="2025-01-13T20:57:54.155129989Z" level=info msg="TearDown network for sandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\" successfully" Jan 13 20:57:54.156201 containerd[1537]: time="2025-01-13T20:57:54.156190288Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.156258 containerd[1537]: time="2025-01-13T20:57:54.156249829Z" level=info msg="RemovePodSandbox \"baf6feaa43460977f95f16b673c52b83ffcadfc8eafe2155f16cb66ff6066596\" returns successfully" Jan 13 20:57:54.156434 containerd[1537]: time="2025-01-13T20:57:54.156420397Z" level=info msg="StopPodSandbox for \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\"" Jan 13 20:57:54.156520 containerd[1537]: time="2025-01-13T20:57:54.156508602Z" level=info msg="TearDown network for sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\" successfully" Jan 13 20:57:54.156559 containerd[1537]: time="2025-01-13T20:57:54.156519319Z" level=info msg="StopPodSandbox for \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\" returns successfully" Jan 13 20:57:54.156697 containerd[1537]: time="2025-01-13T20:57:54.156682447Z" level=info msg="RemovePodSandbox for \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\"" Jan 13 20:57:54.156733 containerd[1537]: time="2025-01-13T20:57:54.156697693Z" level=info msg="Forcibly stopping sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\"" Jan 13 20:57:54.156750 containerd[1537]: time="2025-01-13T20:57:54.156735113Z" level=info msg="TearDown network for sandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\" successfully" Jan 13 20:57:54.158122 containerd[1537]: time="2025-01-13T20:57:54.158105908Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:57:54.158153 containerd[1537]: time="2025-01-13T20:57:54.158131701Z" level=info msg="RemovePodSandbox \"1da0f009d38f2e76e7aaf9c993da2c900518004b9b61d8cfc7519b39b824478a\" returns successfully" Jan 13 20:57:56.667789 containerd[1537]: time="2025-01-13T20:57:56.667765782Z" level=info msg="StopContainer for \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\" with timeout 300 (s)" Jan 13 20:57:56.668809 containerd[1537]: time="2025-01-13T20:57:56.668299041Z" level=info msg="Stop container \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\" with signal terminated" Jan 13 20:57:56.795737 containerd[1537]: time="2025-01-13T20:57:56.795572872Z" level=info msg="StopContainer for \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\" with timeout 30 (s)" Jan 13 20:57:56.796091 containerd[1537]: time="2025-01-13T20:57:56.796067610Z" level=info msg="Stop container \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\" with signal terminated" Jan 13 20:57:56.831596 systemd[1]: cri-containerd-f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a.scope: Deactivated successfully. Jan 13 20:57:56.852212 containerd[1537]: time="2025-01-13T20:57:56.852174643Z" level=info msg="StopContainer for \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\" with timeout 5 (s)" Jan 13 20:57:56.852528 containerd[1537]: time="2025-01-13T20:57:56.852404902Z" level=info msg="Stop container \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\" with signal terminated" Jan 13 20:57:56.858008 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a-rootfs.mount: Deactivated successfully. 
Jan 13 20:57:56.871579 containerd[1537]: time="2025-01-13T20:57:56.859687297Z" level=info msg="shim disconnected" id=f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a namespace=k8s.io Jan 13 20:57:56.891003 systemd[1]: cri-containerd-746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111.scope: Deactivated successfully. Jan 13 20:57:56.891173 systemd[1]: cri-containerd-746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111.scope: Consumed 1.123s CPU time. Jan 13 20:57:56.904433 containerd[1537]: time="2025-01-13T20:57:56.897935070Z" level=warning msg="cleaning up after shim disconnected" id=f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a namespace=k8s.io Jan 13 20:57:56.904433 containerd[1537]: time="2025-01-13T20:57:56.897954623Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:56.913457 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111-rootfs.mount: Deactivated successfully. 
Jan 13 20:57:56.915402 containerd[1537]: time="2025-01-13T20:57:56.915344544Z" level=info msg="shim disconnected" id=746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111 namespace=k8s.io Jan 13 20:57:56.915402 containerd[1537]: time="2025-01-13T20:57:56.915375767Z" level=warning msg="cleaning up after shim disconnected" id=746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111 namespace=k8s.io Jan 13 20:57:56.915402 containerd[1537]: time="2025-01-13T20:57:56.915381286Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:56.958926 containerd[1537]: time="2025-01-13T20:57:56.958869436Z" level=info msg="StopContainer for \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\" returns successfully" Jan 13 20:57:56.960048 containerd[1537]: time="2025-01-13T20:57:56.959815852Z" level=info msg="StopContainer for \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\" returns successfully" Jan 13 20:57:56.961805 containerd[1537]: time="2025-01-13T20:57:56.961731814Z" level=info msg="StopPodSandbox for \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\"" Jan 13 20:57:56.962017 containerd[1537]: time="2025-01-13T20:57:56.961990792Z" level=info msg="StopPodSandbox for \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\"" Jan 13 20:57:56.963075 containerd[1537]: time="2025-01-13T20:57:56.962008658Z" level=info msg="Container to stop \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:57:56.965139 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba-shm.mount: Deactivated successfully. 
Jan 13 20:57:56.965432 containerd[1537]: time="2025-01-13T20:57:56.961756540Z" level=info msg="Container to stop \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:57:56.965495 containerd[1537]: time="2025-01-13T20:57:56.965486153Z" level=info msg="Container to stop \"4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:57:56.965554 containerd[1537]: time="2025-01-13T20:57:56.965526256Z" level=info msg="Container to stop \"1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:57:56.970495 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df-shm.mount: Deactivated successfully. Jan 13 20:57:56.975494 systemd[1]: cri-containerd-1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba.scope: Deactivated successfully. Jan 13 20:57:56.977892 systemd[1]: cri-containerd-398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df.scope: Deactivated successfully. 
Jan 13 20:57:57.002951 containerd[1537]: time="2025-01-13T20:57:57.002899220Z" level=info msg="shim disconnected" id=1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba namespace=k8s.io Jan 13 20:57:57.003070 containerd[1537]: time="2025-01-13T20:57:57.003058850Z" level=warning msg="cleaning up after shim disconnected" id=1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba namespace=k8s.io Jan 13 20:57:57.003124 containerd[1537]: time="2025-01-13T20:57:57.003109422Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:57.011513 containerd[1537]: time="2025-01-13T20:57:57.011436002Z" level=info msg="shim disconnected" id=398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df namespace=k8s.io Jan 13 20:57:57.011621 containerd[1537]: time="2025-01-13T20:57:57.011467910Z" level=warning msg="cleaning up after shim disconnected" id=398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df namespace=k8s.io Jan 13 20:57:57.011621 containerd[1537]: time="2025-01-13T20:57:57.011618402Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:57.030945 containerd[1537]: time="2025-01-13T20:57:57.030918764Z" level=warning msg="cleanup warnings time=\"2025-01-13T20:57:57Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 13 20:57:57.037366 containerd[1537]: time="2025-01-13T20:57:57.037242694Z" level=info msg="TearDown network for sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" successfully" Jan 13 20:57:57.037366 containerd[1537]: time="2025-01-13T20:57:57.037263344Z" level=info msg="StopPodSandbox for \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" returns successfully" Jan 13 20:57:57.118876 kubelet[2787]: E0113 20:57:57.118807 2787 cpu_manager.go:395] "RemoveStaleState: removing container" 
podUID="b5b99a2e-4a23-4904-9961-423d7f5593e3" containerName="flexvol-driver" Jan 13 20:57:57.121065 kubelet[2787]: E0113 20:57:57.120805 2787 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="b5b99a2e-4a23-4904-9961-423d7f5593e3" containerName="calico-node" Jan 13 20:57:57.121981 kubelet[2787]: E0113 20:57:57.121860 2787 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="b5b99a2e-4a23-4904-9961-423d7f5593e3" containerName="install-cni" Jan 13 20:57:57.121981 kubelet[2787]: I0113 20:57:57.121903 2787 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b99a2e-4a23-4904-9961-423d7f5593e3" containerName="calico-node" Jan 13 20:57:57.124631 kubelet[2787]: I0113 20:57:57.124425 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-bin-dir\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124631 kubelet[2787]: I0113 20:57:57.124459 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-var-run-calico\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124631 kubelet[2787]: I0113 20:57:57.124471 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-xtables-lock\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124631 kubelet[2787]: I0113 20:57:57.124480 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-policysync\") pod 
\"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124631 kubelet[2787]: I0113 20:57:57.124495 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv84w\" (UniqueName: \"kubernetes.io/projected/b5b99a2e-4a23-4904-9961-423d7f5593e3-kube-api-access-nv84w\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124631 kubelet[2787]: I0113 20:57:57.124504 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-flexvol-driver-host\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124764 kubelet[2787]: I0113 20:57:57.124514 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b5b99a2e-4a23-4904-9961-423d7f5593e3-node-certs\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124764 kubelet[2787]: I0113 20:57:57.124524 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b99a2e-4a23-4904-9961-423d7f5593e3-tigera-ca-bundle\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124764 kubelet[2787]: I0113 20:57:57.124533 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-log-dir\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124764 kubelet[2787]: I0113 20:57:57.124543 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-var-lib-calico\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124764 kubelet[2787]: I0113 20:57:57.124554 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-lib-modules\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.124764 kubelet[2787]: I0113 20:57:57.124561 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-net-dir\") pod \"b5b99a2e-4a23-4904-9961-423d7f5593e3\" (UID: \"b5b99a2e-4a23-4904-9961-423d7f5593e3\") " Jan 13 20:57:57.134940 kubelet[2787]: I0113 20:57:57.133483 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.134940 kubelet[2787]: I0113 20:57:57.134845 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "var-run-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.134940 kubelet[2787]: I0113 20:57:57.134860 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.134940 kubelet[2787]: I0113 20:57:57.134871 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-policysync" (OuterVolumeSpecName: "policysync") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.149490 kubelet[2787]: I0113 20:57:57.149455 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.150485 kubelet[2787]: I0113 20:57:57.150304 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "cni-net-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.178689 kubelet[2787]: I0113 20:57:57.178489 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.178689 kubelet[2787]: I0113 20:57:57.178571 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.178689 kubelet[2787]: I0113 20:57:57.178585 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 13 20:57:57.191417 kubelet[2787]: I0113 20:57:57.191390 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b99a2e-4a23-4904-9961-423d7f5593e3-node-certs" (OuterVolumeSpecName: "node-certs") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "node-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 13 20:57:57.191650 kubelet[2787]: I0113 20:57:57.191515 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b99a2e-4a23-4904-9961-423d7f5593e3-kube-api-access-nv84w" (OuterVolumeSpecName: "kube-api-access-nv84w") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "kube-api-access-nv84w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 13 20:57:57.206579 systemd[1]: Created slice kubepods-besteffort-podd51a7cb7_d07d_430b_bbd6_1c0af95001bf.slice - libcontainer container kubepods-besteffort-podd51a7cb7_d07d_430b_bbd6_1c0af95001bf.slice. Jan 13 20:57:57.210065 kubelet[2787]: I0113 20:57:57.210015 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b99a2e-4a23-4904-9961-423d7f5593e3-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "b5b99a2e-4a23-4904-9961-423d7f5593e3" (UID: "b5b99a2e-4a23-4904-9961-423d7f5593e3"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 13 20:57:57.227110 kubelet[2787]: I0113 20:57:57.227023 2787 reconciler_common.go:288] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-bin-dir\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227110 kubelet[2787]: I0113 20:57:57.227045 2787 reconciler_common.go:288] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-xtables-lock\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227110 kubelet[2787]: I0113 20:57:57.227051 2787 reconciler_common.go:288] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-var-run-calico\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227110 kubelet[2787]: I0113 20:57:57.227057 2787 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-nv84w\" (UniqueName: \"kubernetes.io/projected/b5b99a2e-4a23-4904-9961-423d7f5593e3-kube-api-access-nv84w\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227110 kubelet[2787]: I0113 20:57:57.227062 2787 reconciler_common.go:288] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-flexvol-driver-host\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227110 kubelet[2787]: I0113 20:57:57.227067 2787 reconciler_common.go:288] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-policysync\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227110 kubelet[2787]: I0113 20:57:57.227072 2787 reconciler_common.go:288] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b5b99a2e-4a23-4904-9961-423d7f5593e3-node-certs\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227110 kubelet[2787]: I0113 
20:57:57.227077 2787 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b99a2e-4a23-4904-9961-423d7f5593e3-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227314 kubelet[2787]: I0113 20:57:57.227081 2787 reconciler_common.go:288] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-log-dir\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227314 kubelet[2787]: I0113 20:57:57.227085 2787 reconciler_common.go:288] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-var-lib-calico\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227314 kubelet[2787]: I0113 20:57:57.227091 2787 reconciler_common.go:288] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-cni-net-dir\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.227314 kubelet[2787]: I0113 20:57:57.227096 2787 reconciler_common.go:288] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5b99a2e-4a23-4904-9961-423d7f5593e3-lib-modules\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.328401 kubelet[2787]: I0113 20:57:57.328184 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-var-lib-calico\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328401 kubelet[2787]: I0113 20:57:57.328212 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-policysync\") pod \"calico-node-m5mwg\" (UID: 
\"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328401 kubelet[2787]: I0113 20:57:57.328225 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-cni-log-dir\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328401 kubelet[2787]: I0113 20:57:57.328235 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-flexvol-driver-host\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328401 kubelet[2787]: I0113 20:57:57.328246 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-cni-net-dir\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328595 kubelet[2787]: I0113 20:57:57.328258 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-lib-modules\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328595 kubelet[2787]: I0113 20:57:57.328270 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-xtables-lock\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " 
pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328595 kubelet[2787]: I0113 20:57:57.328279 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-tigera-ca-bundle\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328595 kubelet[2787]: I0113 20:57:57.328289 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-var-run-calico\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328595 kubelet[2787]: I0113 20:57:57.328303 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-node-certs\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328682 kubelet[2787]: I0113 20:57:57.328312 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-cni-bin-dir\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.328682 kubelet[2787]: I0113 20:57:57.328322 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sblrz\" (UniqueName: \"kubernetes.io/projected/d51a7cb7-d07d-430b-bbd6-1c0af95001bf-kube-api-access-sblrz\") pod \"calico-node-m5mwg\" (UID: \"d51a7cb7-d07d-430b-bbd6-1c0af95001bf\") " pod="calico-system/calico-node-m5mwg" Jan 13 20:57:57.381981 
systemd-networkd[1464]: cali3e8daddf72e: Link DOWN Jan 13 20:57:57.381985 systemd-networkd[1464]: cali3e8daddf72e: Lost carrier Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.377 [INFO][6103] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.380 [INFO][6103] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" iface="eth0" netns="/var/run/netns/cni-7ccab9b1-053e-1e08-913d-b7a3cebc248d" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.381 [INFO][6103] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" iface="eth0" netns="/var/run/netns/cni-7ccab9b1-053e-1e08-913d-b7a3cebc248d" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.390 [INFO][6103] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" after=9.787807ms iface="eth0" netns="/var/run/netns/cni-7ccab9b1-053e-1e08-913d-b7a3cebc248d" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.393 [INFO][6103] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.393 [INFO][6103] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.427 [INFO][6115] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.432 [INFO][6115] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.432 [INFO][6115] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.459 [INFO][6115] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.459 [INFO][6115] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0" Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.460 [INFO][6115] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:57:57.463298 containerd[1537]: 2025-01-13 20:57:57.462 [INFO][6103] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Jan 13 20:57:57.464254 containerd[1537]: time="2025-01-13T20:57:57.463350588Z" level=info msg="TearDown network for sandbox \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\" successfully" Jan 13 20:57:57.464254 containerd[1537]: time="2025-01-13T20:57:57.463367175Z" level=info msg="StopPodSandbox for \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\" returns successfully" Jan 13 20:57:57.517202 containerd[1537]: time="2025-01-13T20:57:57.517168946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m5mwg,Uid:d51a7cb7-d07d-430b-bbd6-1c0af95001bf,Namespace:calico-system,Attempt:0,}" Jan 13 20:57:57.529961 kubelet[2787]: I0113 20:57:57.529798 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be5d865c-3359-4beb-8044-736dced88771-tigera-ca-bundle\") pod \"be5d865c-3359-4beb-8044-736dced88771\" (UID: \"be5d865c-3359-4beb-8044-736dced88771\") " Jan 13 20:57:57.529961 kubelet[2787]: I0113 20:57:57.529846 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgq9w\" (UniqueName: \"kubernetes.io/projected/be5d865c-3359-4beb-8044-736dced88771-kube-api-access-lgq9w\") pod \"be5d865c-3359-4beb-8044-736dced88771\" (UID: \"be5d865c-3359-4beb-8044-736dced88771\") " Jan 13 20:57:57.536094 kubelet[2787]: I0113 20:57:57.536039 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5d865c-3359-4beb-8044-736dced88771-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "be5d865c-3359-4beb-8044-736dced88771" (UID: "be5d865c-3359-4beb-8044-736dced88771"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 13 20:57:57.536094 kubelet[2787]: I0113 20:57:57.536078 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5d865c-3359-4beb-8044-736dced88771-kube-api-access-lgq9w" (OuterVolumeSpecName: "kube-api-access-lgq9w") pod "be5d865c-3359-4beb-8044-736dced88771" (UID: "be5d865c-3359-4beb-8044-736dced88771"). InnerVolumeSpecName "kube-api-access-lgq9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 13 20:57:57.557157 containerd[1537]: time="2025-01-13T20:57:57.557051172Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:57.557157 containerd[1537]: time="2025-01-13T20:57:57.557081403Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:57.557157 containerd[1537]: time="2025-01-13T20:57:57.557090963Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:57.557289 containerd[1537]: time="2025-01-13T20:57:57.557154285Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:57.570921 systemd[1]: Started cri-containerd-59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a.scope - libcontainer container 59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a. 
Jan 13 20:57:57.585484 containerd[1537]: time="2025-01-13T20:57:57.585458167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m5mwg,Uid:d51a7cb7-d07d-430b-bbd6-1c0af95001bf,Namespace:calico-system,Attempt:0,} returns sandbox id \"59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a\"" Jan 13 20:57:57.587456 containerd[1537]: time="2025-01-13T20:57:57.587392470Z" level=info msg="CreateContainer within sandbox \"59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 13 20:57:57.594242 containerd[1537]: time="2025-01-13T20:57:57.594216463Z" level=info msg="CreateContainer within sandbox \"59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f\"" Jan 13 20:57:57.595876 containerd[1537]: time="2025-01-13T20:57:57.595260094Z" level=info msg="StartContainer for \"3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f\"" Jan 13 20:57:57.616033 systemd[1]: Started cri-containerd-3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f.scope - libcontainer container 3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f. 
Jan 13 20:57:57.630097 kubelet[2787]: I0113 20:57:57.630082 2787 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-lgq9w\" (UniqueName: \"kubernetes.io/projected/be5d865c-3359-4beb-8044-736dced88771-kube-api-access-lgq9w\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.630197 kubelet[2787]: I0113 20:57:57.630190 2787 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be5d865c-3359-4beb-8044-736dced88771-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:57.634407 containerd[1537]: time="2025-01-13T20:57:57.634381644Z" level=info msg="StartContainer for \"3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f\" returns successfully" Jan 13 20:57:57.707175 systemd[1]: var-lib-kubelet-pods-be5d865c\x2d3359\x2d4beb\x2d8044\x2d736dced88771-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. Jan 13 20:57:57.707237 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba-rootfs.mount: Deactivated successfully. Jan 13 20:57:57.707290 systemd[1]: run-netns-cni\x2d7ccab9b1\x2d053e\x2d1e08\x2d913d\x2db7a3cebc248d.mount: Deactivated successfully. Jan 13 20:57:57.707325 systemd[1]: var-lib-kubelet-pods-b5b99a2e\x2d4a23\x2d4904\x2d9961\x2d423d7f5593e3-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. Jan 13 20:57:57.707487 systemd[1]: var-lib-kubelet-pods-be5d865c\x2d3359\x2d4beb\x2d8044\x2d736dced88771-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dlgq9w.mount: Deactivated successfully. Jan 13 20:57:57.707532 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df-rootfs.mount: Deactivated successfully. 
Jan 13 20:57:57.707581 systemd[1]: var-lib-kubelet-pods-b5b99a2e\x2d4a23\x2d4904\x2d9961\x2d423d7f5593e3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dnv84w.mount: Deactivated successfully. Jan 13 20:57:57.707619 systemd[1]: var-lib-kubelet-pods-b5b99a2e\x2d4a23\x2d4904\x2d9961\x2d423d7f5593e3-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Jan 13 20:57:57.711024 systemd[1]: cri-containerd-3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f.scope: Deactivated successfully. Jan 13 20:57:57.725730 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f-rootfs.mount: Deactivated successfully. Jan 13 20:57:57.794873 systemd[1]: cri-containerd-5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962.scope: Deactivated successfully. Jan 13 20:57:57.806989 containerd[1537]: time="2025-01-13T20:57:57.806675726Z" level=info msg="shim disconnected" id=3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f namespace=k8s.io Jan 13 20:57:57.806989 containerd[1537]: time="2025-01-13T20:57:57.806716837Z" level=warning msg="cleaning up after shim disconnected" id=3148c16e60dbcf7809b201db644558e055b5b57b81c2ddfaca614f9a87f5ad7f namespace=k8s.io Jan 13 20:57:57.806989 containerd[1537]: time="2025-01-13T20:57:57.806725244Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:57.826353 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962-rootfs.mount: Deactivated successfully. 
Jan 13 20:57:57.829198 containerd[1537]: time="2025-01-13T20:57:57.827605139Z" level=info msg="shim disconnected" id=5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962 namespace=k8s.io Jan 13 20:57:57.829198 containerd[1537]: time="2025-01-13T20:57:57.827635660Z" level=warning msg="cleaning up after shim disconnected" id=5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962 namespace=k8s.io Jan 13 20:57:57.829198 containerd[1537]: time="2025-01-13T20:57:57.827641127Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:57.838582 containerd[1537]: time="2025-01-13T20:57:57.838560683Z" level=info msg="StopContainer for \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\" returns successfully" Jan 13 20:57:57.839570 containerd[1537]: time="2025-01-13T20:57:57.839552828Z" level=info msg="StopPodSandbox for \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\"" Jan 13 20:57:57.839619 containerd[1537]: time="2025-01-13T20:57:57.839592012Z" level=info msg="Container to stop \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:57:57.841944 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f-shm.mount: Deactivated successfully. Jan 13 20:57:57.847595 systemd[1]: cri-containerd-fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f.scope: Deactivated successfully. Jan 13 20:57:57.862040 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f-rootfs.mount: Deactivated successfully. 
Jan 13 20:57:57.862417 containerd[1537]: time="2025-01-13T20:57:57.862362352Z" level=info msg="shim disconnected" id=fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f namespace=k8s.io Jan 13 20:57:57.862502 containerd[1537]: time="2025-01-13T20:57:57.862420054Z" level=warning msg="cleaning up after shim disconnected" id=fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f namespace=k8s.io Jan 13 20:57:57.862502 containerd[1537]: time="2025-01-13T20:57:57.862429577Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:57:57.872836 containerd[1537]: time="2025-01-13T20:57:57.872809031Z" level=info msg="TearDown network for sandbox \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\" successfully" Jan 13 20:57:57.872836 containerd[1537]: time="2025-01-13T20:57:57.872835293Z" level=info msg="StopPodSandbox for \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\" returns successfully" Jan 13 20:57:57.934293 kubelet[2787]: I0113 20:57:57.934176 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm49l\" (UniqueName: \"kubernetes.io/projected/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-kube-api-access-lm49l\") pod \"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\" (UID: \"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\") " Jan 13 20:57:57.934293 kubelet[2787]: I0113 20:57:57.934197 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-tigera-ca-bundle\") pod \"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\" (UID: \"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\") " Jan 13 20:57:57.934293 kubelet[2787]: I0113 20:57:57.934216 2787 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-typha-certs\") pod \"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\" (UID: 
\"0da3dfd6-758e-47e6-8dbb-82aa95aea79b\") " Jan 13 20:57:57.939031 kubelet[2787]: I0113 20:57:57.938764 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-kube-api-access-lm49l" (OuterVolumeSpecName: "kube-api-access-lm49l") pod "0da3dfd6-758e-47e6-8dbb-82aa95aea79b" (UID: "0da3dfd6-758e-47e6-8dbb-82aa95aea79b"). InnerVolumeSpecName "kube-api-access-lm49l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 13 20:57:57.939536 kubelet[2787]: I0113 20:57:57.939524 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "0da3dfd6-758e-47e6-8dbb-82aa95aea79b" (UID: "0da3dfd6-758e-47e6-8dbb-82aa95aea79b"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 13 20:57:57.941195 kubelet[2787]: I0113 20:57:57.941172 2787 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "0da3dfd6-758e-47e6-8dbb-82aa95aea79b" (UID: "0da3dfd6-758e-47e6-8dbb-82aa95aea79b"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 13 20:57:57.945978 systemd[1]: Removed slice kubepods-besteffort-podb5b99a2e_4a23_4904_9961_423d7f5593e3.slice - libcontainer container kubepods-besteffort-podb5b99a2e_4a23_4904_9961_423d7f5593e3.slice. Jan 13 20:57:57.946037 systemd[1]: kubepods-besteffort-podb5b99a2e_4a23_4904_9961_423d7f5593e3.slice: Consumed 1.416s CPU time. Jan 13 20:57:57.947285 systemd[1]: Removed slice kubepods-besteffort-podbe5d865c_3359_4beb_8044_736dced88771.slice - libcontainer container kubepods-besteffort-podbe5d865c_3359_4beb_8044_736dced88771.slice. 
Jan 13 20:57:58.027711 kubelet[2787]: E0113 20:57:58.027596 2787 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="0da3dfd6-758e-47e6-8dbb-82aa95aea79b" containerName="calico-typha" Jan 13 20:57:58.027711 kubelet[2787]: E0113 20:57:58.027619 2787 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="be5d865c-3359-4beb-8044-736dced88771" containerName="calico-kube-controllers" Jan 13 20:57:58.029112 kubelet[2787]: I0113 20:57:58.028615 2787 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5d865c-3359-4beb-8044-736dced88771" containerName="calico-kube-controllers" Jan 13 20:57:58.029112 kubelet[2787]: I0113 20:57:58.028633 2787 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da3dfd6-758e-47e6-8dbb-82aa95aea79b" containerName="calico-typha" Jan 13 20:57:58.035016 kubelet[2787]: I0113 20:57:58.034813 2787 reconciler_common.go:288] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-typha-certs\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:58.035016 kubelet[2787]: I0113 20:57:58.034946 2787 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-lm49l\" (UniqueName: \"kubernetes.io/projected/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-kube-api-access-lm49l\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:58.035016 kubelet[2787]: I0113 20:57:58.034956 2787 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0da3dfd6-758e-47e6-8dbb-82aa95aea79b-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jan 13 20:57:58.060481 systemd[1]: Created slice kubepods-besteffort-podaa43432b_53f9_4a20_9d7a_71a00e6072ef.slice - libcontainer container kubepods-besteffort-podaa43432b_53f9_4a20_9d7a_71a00e6072ef.slice. 
Jan 13 20:57:58.062779 kubelet[2787]: I0113 20:57:58.062468 2787 scope.go:117] "RemoveContainer" containerID="f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a" Jan 13 20:57:58.085497 containerd[1537]: time="2025-01-13T20:57:58.085295117Z" level=info msg="RemoveContainer for \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\"" Jan 13 20:57:58.093003 containerd[1537]: time="2025-01-13T20:57:58.092982501Z" level=info msg="RemoveContainer for \"f9047486ebbaaf21bcacbf49b08ad3093e59154179271df7e93376079015ab1a\" returns successfully" Jan 13 20:57:58.094719 kubelet[2787]: I0113 20:57:58.094706 2787 scope.go:117] "RemoveContainer" containerID="746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111" Jan 13 20:57:58.097323 containerd[1537]: time="2025-01-13T20:57:58.097112501Z" level=info msg="RemoveContainer for \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\"" Jan 13 20:57:58.101867 containerd[1537]: time="2025-01-13T20:57:58.101733162Z" level=info msg="RemoveContainer for \"746900931a458bdb8667e8cb7d11f278f26129a7a60538aba873d0b13fc42111\" returns successfully" Jan 13 20:57:58.102953 kubelet[2787]: I0113 20:57:58.102940 2787 scope.go:117] "RemoveContainer" containerID="4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1" Jan 13 20:57:58.103469 containerd[1537]: time="2025-01-13T20:57:58.103395435Z" level=info msg="CreateContainer within sandbox \"59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 13 20:57:58.104173 containerd[1537]: time="2025-01-13T20:57:58.104159984Z" level=info msg="RemoveContainer for \"4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1\"" Jan 13 20:57:58.107849 systemd[1]: Removed slice kubepods-besteffort-pod0da3dfd6_758e_47e6_8dbb_82aa95aea79b.slice - libcontainer container kubepods-besteffort-pod0da3dfd6_758e_47e6_8dbb_82aa95aea79b.slice. 
Jan 13 20:57:58.112686 containerd[1537]: time="2025-01-13T20:57:58.112251319Z" level=info msg="RemoveContainer for \"4c5cf52fd4d67dfef056d20edc195a2a25ed17cd5d64960811f721e6833a70d1\" returns successfully" Jan 13 20:57:58.117862 kubelet[2787]: I0113 20:57:58.117847 2787 scope.go:117] "RemoveContainer" containerID="1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6" Jan 13 20:57:58.123493 containerd[1537]: time="2025-01-13T20:57:58.118751437Z" level=info msg="RemoveContainer for \"1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6\"" Jan 13 20:57:58.129164 containerd[1537]: time="2025-01-13T20:57:58.129093558Z" level=info msg="RemoveContainer for \"1d4bf2708587ca90f3dabf85c0cabedb3a0aefec59ff83b769b2cc69bb9d23e6\" returns successfully" Jan 13 20:57:58.129399 kubelet[2787]: I0113 20:57:58.129326 2787 scope.go:117] "RemoveContainer" containerID="5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962" Jan 13 20:57:58.130140 containerd[1537]: time="2025-01-13T20:57:58.130109204Z" level=info msg="RemoveContainer for \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\"" Jan 13 20:57:58.134185 containerd[1537]: time="2025-01-13T20:57:58.134116051Z" level=info msg="CreateContainer within sandbox \"59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75\"" Jan 13 20:57:58.134718 containerd[1537]: time="2025-01-13T20:57:58.134666359Z" level=info msg="StartContainer for \"e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75\"" Jan 13 20:57:58.137677 kubelet[2787]: I0113 20:57:58.135863 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa43432b-53f9-4a20-9d7a-71a00e6072ef-tigera-ca-bundle\") pod \"calico-typha-8556c88b4d-b9mqx\" (UID: 
\"aa43432b-53f9-4a20-9d7a-71a00e6072ef\") " pod="calico-system/calico-typha-8556c88b4d-b9mqx" Jan 13 20:57:58.137677 kubelet[2787]: I0113 20:57:58.135905 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/aa43432b-53f9-4a20-9d7a-71a00e6072ef-typha-certs\") pod \"calico-typha-8556c88b4d-b9mqx\" (UID: \"aa43432b-53f9-4a20-9d7a-71a00e6072ef\") " pod="calico-system/calico-typha-8556c88b4d-b9mqx" Jan 13 20:57:58.137677 kubelet[2787]: I0113 20:57:58.135933 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5z5l\" (UniqueName: \"kubernetes.io/projected/aa43432b-53f9-4a20-9d7a-71a00e6072ef-kube-api-access-x5z5l\") pod \"calico-typha-8556c88b4d-b9mqx\" (UID: \"aa43432b-53f9-4a20-9d7a-71a00e6072ef\") " pod="calico-system/calico-typha-8556c88b4d-b9mqx" Jan 13 20:57:58.137677 kubelet[2787]: I0113 20:57:58.136472 2787 scope.go:117] "RemoveContainer" containerID="5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962" Jan 13 20:57:58.139393 containerd[1537]: time="2025-01-13T20:57:58.135890731Z" level=info msg="RemoveContainer for \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\" returns successfully" Jan 13 20:57:58.139393 containerd[1537]: time="2025-01-13T20:57:58.138046450Z" level=error msg="ContainerStatus for \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\": not found" Jan 13 20:57:58.147860 kubelet[2787]: E0113 20:57:58.147820 2787 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\": not found" 
containerID="5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962" Jan 13 20:57:58.160128 kubelet[2787]: I0113 20:57:58.149379 2787 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962"} err="failed to get container status \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\": rpc error: code = NotFound desc = an error occurred when try to find container \"5efcc10efc314a099387f50383d602cdd80151cbf45f18614d8809ac551a6962\": not found" Jan 13 20:57:58.171962 systemd[1]: Started cri-containerd-e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75.scope - libcontainer container e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75. Jan 13 20:57:58.202960 containerd[1537]: time="2025-01-13T20:57:58.202936239Z" level=info msg="StartContainer for \"e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75\" returns successfully" Jan 13 20:57:58.373772 containerd[1537]: time="2025-01-13T20:57:58.373707666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8556c88b4d-b9mqx,Uid:aa43432b-53f9-4a20-9d7a-71a00e6072ef,Namespace:calico-system,Attempt:0,}" Jan 13 20:57:58.388511 containerd[1537]: time="2025-01-13T20:57:58.388212498Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:57:58.388511 containerd[1537]: time="2025-01-13T20:57:58.388259424Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:57:58.388511 containerd[1537]: time="2025-01-13T20:57:58.388269115Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:58.388511 containerd[1537]: time="2025-01-13T20:57:58.388330099Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:57:58.405926 systemd[1]: Started cri-containerd-6df031f16460c54bb4bbe456620e6954f3e53fbab0172718e08b70c4149c49c3.scope - libcontainer container 6df031f16460c54bb4bbe456620e6954f3e53fbab0172718e08b70c4149c49c3. Jan 13 20:57:58.443387 containerd[1537]: time="2025-01-13T20:57:58.443361677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8556c88b4d-b9mqx,Uid:aa43432b-53f9-4a20-9d7a-71a00e6072ef,Namespace:calico-system,Attempt:0,} returns sandbox id \"6df031f16460c54bb4bbe456620e6954f3e53fbab0172718e08b70c4149c49c3\"" Jan 13 20:57:58.450617 containerd[1537]: time="2025-01-13T20:57:58.450597783Z" level=info msg="CreateContainer within sandbox \"6df031f16460c54bb4bbe456620e6954f3e53fbab0172718e08b70c4149c49c3\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 13 20:57:58.461372 containerd[1537]: time="2025-01-13T20:57:58.461312771Z" level=info msg="CreateContainer within sandbox \"6df031f16460c54bb4bbe456620e6954f3e53fbab0172718e08b70c4149c49c3\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"fb999acca57ac11717577aa9beebc762d4ef855842119e5c13183b72df845622\"" Jan 13 20:57:58.461983 containerd[1537]: time="2025-01-13T20:57:58.461806776Z" level=info msg="StartContainer for \"fb999acca57ac11717577aa9beebc762d4ef855842119e5c13183b72df845622\"" Jan 13 20:57:58.483929 systemd[1]: Started cri-containerd-fb999acca57ac11717577aa9beebc762d4ef855842119e5c13183b72df845622.scope - libcontainer container fb999acca57ac11717577aa9beebc762d4ef855842119e5c13183b72df845622. 
Jan 13 20:57:58.529171 containerd[1537]: time="2025-01-13T20:57:58.529100562Z" level=info msg="StartContainer for \"fb999acca57ac11717577aa9beebc762d4ef855842119e5c13183b72df845622\" returns successfully" Jan 13 20:57:58.710609 systemd[1]: var-lib-kubelet-pods-0da3dfd6\x2d758e\x2d47e6\x2d8dbb\x2d82aa95aea79b-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Jan 13 20:57:58.710677 systemd[1]: var-lib-kubelet-pods-0da3dfd6\x2d758e\x2d47e6\x2d8dbb\x2d82aa95aea79b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dlm49l.mount: Deactivated successfully. Jan 13 20:57:58.710717 systemd[1]: var-lib-kubelet-pods-0da3dfd6\x2d758e\x2d47e6\x2d8dbb\x2d82aa95aea79b-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. Jan 13 20:57:59.141802 kubelet[2787]: I0113 20:57:59.141620 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8556c88b4d-b9mqx" podStartSLOduration=3.1221484 podStartE2EDuration="3.1221484s" podCreationTimestamp="2025-01-13 20:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:57:59.112690108 +0000 UTC m=+65.272028079" watchObservedRunningTime="2025-01-13 20:57:59.1221484 +0000 UTC m=+65.281486378" Jan 13 20:57:59.864503 systemd[1]: cri-containerd-e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75.scope: Deactivated successfully. Jan 13 20:58:00.092632 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75-rootfs.mount: Deactivated successfully. 
Jan 13 20:58:00.111550 containerd[1537]: time="2025-01-13T20:58:00.103610052Z" level=info msg="shim disconnected" id=e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75 namespace=k8s.io Jan 13 20:58:00.111550 containerd[1537]: time="2025-01-13T20:58:00.111328411Z" level=warning msg="cleaning up after shim disconnected" id=e4af1ed822bb567b32f0eeeba8fad61b20fc6068346f35ccc5b87d2535f20b75 namespace=k8s.io Jan 13 20:58:00.111550 containerd[1537]: time="2025-01-13T20:58:00.111337745Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:58:00.200917 kubelet[2787]: I0113 20:58:00.180947 2787 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da3dfd6-758e-47e6-8dbb-82aa95aea79b" path="/var/lib/kubelet/pods/0da3dfd6-758e-47e6-8dbb-82aa95aea79b/volumes" Jan 13 20:58:00.202095 kubelet[2787]: I0113 20:58:00.202061 2787 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b99a2e-4a23-4904-9961-423d7f5593e3" path="/var/lib/kubelet/pods/b5b99a2e-4a23-4904-9961-423d7f5593e3/volumes" Jan 13 20:58:00.203359 kubelet[2787]: I0113 20:58:00.203328 2787 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5d865c-3359-4beb-8044-736dced88771" path="/var/lib/kubelet/pods/be5d865c-3359-4beb-8044-736dced88771/volumes" Jan 13 20:58:00.349103 containerd[1537]: time="2025-01-13T20:58:00.349078206Z" level=info msg="CreateContainer within sandbox \"59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 20:58:00.506807 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1153794823.mount: Deactivated successfully. 
Jan 13 20:58:00.512713 containerd[1537]: time="2025-01-13T20:58:00.512687589Z" level=info msg="CreateContainer within sandbox \"59a0aad7c0244e410d919a642b6ea5bc596c24e04ff719b21c7800be3ead7a5a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"23ac3bd8a2d5aad23d212fdcf012718fd38f84f56e37b9e88834ba541599ea36\"" Jan 13 20:58:00.517334 containerd[1537]: time="2025-01-13T20:58:00.516548161Z" level=info msg="StartContainer for \"23ac3bd8a2d5aad23d212fdcf012718fd38f84f56e37b9e88834ba541599ea36\"" Jan 13 20:58:00.534926 systemd[1]: Started cri-containerd-23ac3bd8a2d5aad23d212fdcf012718fd38f84f56e37b9e88834ba541599ea36.scope - libcontainer container 23ac3bd8a2d5aad23d212fdcf012718fd38f84f56e37b9e88834ba541599ea36. Jan 13 20:58:00.575581 containerd[1537]: time="2025-01-13T20:58:00.575555237Z" level=info msg="StartContainer for \"23ac3bd8a2d5aad23d212fdcf012718fd38f84f56e37b9e88834ba541599ea36\" returns successfully" Jan 13 20:58:00.890761 systemd[1]: Created slice kubepods-besteffort-pod30d1ec4c_702f_4000_a441_848accd4d973.slice - libcontainer container kubepods-besteffort-pod30d1ec4c_702f_4000_a441_848accd4d973.slice. 
Jan 13 20:58:00.977749 kubelet[2787]: I0113 20:58:00.977622 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g2mw\" (UniqueName: \"kubernetes.io/projected/30d1ec4c-702f-4000-a441-848accd4d973-kube-api-access-5g2mw\") pod \"calico-kube-controllers-7f5df76585-9xglt\" (UID: \"30d1ec4c-702f-4000-a441-848accd4d973\") " pod="calico-system/calico-kube-controllers-7f5df76585-9xglt" Jan 13 20:58:00.977749 kubelet[2787]: I0113 20:58:00.977679 2787 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d1ec4c-702f-4000-a441-848accd4d973-tigera-ca-bundle\") pod \"calico-kube-controllers-7f5df76585-9xglt\" (UID: \"30d1ec4c-702f-4000-a441-848accd4d973\") " pod="calico-system/calico-kube-controllers-7f5df76585-9xglt" Jan 13 20:58:01.188164 kubelet[2787]: I0113 20:58:01.188130 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-m5mwg" podStartSLOduration=4.188118039 podStartE2EDuration="4.188118039s" podCreationTimestamp="2025-01-13 20:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:58:01.187332743 +0000 UTC m=+67.346670724" watchObservedRunningTime="2025-01-13 20:58:01.188118039 +0000 UTC m=+67.347456015" Jan 13 20:58:01.203043 containerd[1537]: time="2025-01-13T20:58:01.203014178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f5df76585-9xglt,Uid:30d1ec4c-702f-4000-a441-848accd4d973,Namespace:calico-system,Attempt:0,}" Jan 13 20:58:01.341117 systemd-networkd[1464]: cali9e43c4889e6: Link UP Jan 13 20:58:01.342213 systemd-networkd[1464]: cali9e43c4889e6: Gained carrier Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.289 [INFO][6522] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0 calico-kube-controllers-7f5df76585- calico-system 30d1ec4c-702f-4000-a441-848accd4d973 1142 0 2025-01-13 20:57:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7f5df76585 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7f5df76585-9xglt eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9e43c4889e6 [] []}} ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Namespace="calico-system" Pod="calico-kube-controllers-7f5df76585-9xglt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.289 [INFO][6522] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Namespace="calico-system" Pod="calico-kube-controllers-7f5df76585-9xglt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.312 [INFO][6535] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" HandleID="k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Workload="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.317 [INFO][6535] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" HandleID="k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" 
Workload="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000292b70), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7f5df76585-9xglt", "timestamp":"2025-01-13 20:58:01.312274028 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.317 [INFO][6535] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.317 [INFO][6535] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.317 [INFO][6535] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.318 [INFO][6535] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.321 [INFO][6535] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.323 [INFO][6535] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.324 [INFO][6535] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.325 [INFO][6535] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.325 [INFO][6535] ipam/ipam.go 1180: Attempting to assign 1 
addresses from block block=192.168.88.128/26 handle="k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.326 [INFO][6535] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.328 [INFO][6535] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.332 [INFO][6535] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.332 [INFO][6535] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" host="localhost" Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.332 [INFO][6535] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 13 20:58:01.357591 containerd[1537]: 2025-01-13 20:58:01.332 [INFO][6535] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" HandleID="k8s-pod-network.457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Workload="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" Jan 13 20:58:01.360908 containerd[1537]: 2025-01-13 20:58:01.335 [INFO][6522] cni-plugin/k8s.go 386: Populated endpoint ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Namespace="calico-system" Pod="calico-kube-controllers-7f5df76585-9xglt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0", GenerateName:"calico-kube-controllers-7f5df76585-", Namespace:"calico-system", SelfLink:"", UID:"30d1ec4c-702f-4000-a441-848accd4d973", ResourceVersion:"1142", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f5df76585", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7f5df76585-9xglt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9e43c4889e6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:58:01.360908 containerd[1537]: 2025-01-13 20:58:01.335 [INFO][6522] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.135/32] ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Namespace="calico-system" Pod="calico-kube-controllers-7f5df76585-9xglt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" Jan 13 20:58:01.360908 containerd[1537]: 2025-01-13 20:58:01.335 [INFO][6522] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9e43c4889e6 ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Namespace="calico-system" Pod="calico-kube-controllers-7f5df76585-9xglt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" Jan 13 20:58:01.360908 containerd[1537]: 2025-01-13 20:58:01.342 [INFO][6522] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Namespace="calico-system" Pod="calico-kube-controllers-7f5df76585-9xglt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" Jan 13 20:58:01.360908 containerd[1537]: 2025-01-13 20:58:01.343 [INFO][6522] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Namespace="calico-system" Pod="calico-kube-controllers-7f5df76585-9xglt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0", GenerateName:"calico-kube-controllers-7f5df76585-", Namespace:"calico-system", SelfLink:"", UID:"30d1ec4c-702f-4000-a441-848accd4d973", ResourceVersion:"1142", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 57, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f5df76585", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc", Pod:"calico-kube-controllers-7f5df76585-9xglt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9e43c4889e6", MAC:"be:1d:0a:6e:7c:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:58:01.360908 containerd[1537]: 2025-01-13 20:58:01.349 [INFO][6522] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc" Namespace="calico-system" Pod="calico-kube-controllers-7f5df76585-9xglt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5df76585--9xglt-eth0" Jan 13 20:58:01.385604 containerd[1537]: time="2025-01-13T20:58:01.385428155Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:58:01.385604 containerd[1537]: time="2025-01-13T20:58:01.385467222Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:58:01.385604 containerd[1537]: time="2025-01-13T20:58:01.385474936Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:58:01.385604 containerd[1537]: time="2025-01-13T20:58:01.385529274Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:58:01.402930 systemd[1]: Started cri-containerd-457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc.scope - libcontainer container 457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc. Jan 13 20:58:01.411683 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:58:01.435545 containerd[1537]: time="2025-01-13T20:58:01.435510948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f5df76585-9xglt,Uid:30d1ec4c-702f-4000-a441-848accd4d973,Namespace:calico-system,Attempt:0,} returns sandbox id \"457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc\"" Jan 13 20:58:01.470867 containerd[1537]: time="2025-01-13T20:58:01.470121537Z" level=info msg="CreateContainer within sandbox \"457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 13 20:58:01.479713 containerd[1537]: time="2025-01-13T20:58:01.479690172Z" level=info msg="CreateContainer within sandbox \"457f28645a320c24e120eb08ecfffab795df3c3a357f6a6cf00c283bac0301dc\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id 
\"ea8cee3a4bfa9f05aaef5dd4645705a2ca545b91ba6aa0c6e02ecc49b2296b92\"" Jan 13 20:58:01.480863 containerd[1537]: time="2025-01-13T20:58:01.480666885Z" level=info msg="StartContainer for \"ea8cee3a4bfa9f05aaef5dd4645705a2ca545b91ba6aa0c6e02ecc49b2296b92\"" Jan 13 20:58:01.500967 systemd[1]: Started cri-containerd-ea8cee3a4bfa9f05aaef5dd4645705a2ca545b91ba6aa0c6e02ecc49b2296b92.scope - libcontainer container ea8cee3a4bfa9f05aaef5dd4645705a2ca545b91ba6aa0c6e02ecc49b2296b92. Jan 13 20:58:01.557204 containerd[1537]: time="2025-01-13T20:58:01.557179636Z" level=info msg="StartContainer for \"ea8cee3a4bfa9f05aaef5dd4645705a2ca545b91ba6aa0c6e02ecc49b2296b92\" returns successfully" Jan 13 20:58:02.490948 systemd-networkd[1464]: cali9e43c4889e6: Gained IPv6LL Jan 13 20:58:03.172938 kubelet[2787]: I0113 20:58:03.129258 2787 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7f5df76585-9xglt" podStartSLOduration=5.103625997 podStartE2EDuration="5.103625997s" podCreationTimestamp="2025-01-13 20:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:58:03.091769057 +0000 UTC m=+69.251107027" watchObservedRunningTime="2025-01-13 20:58:03.103625997 +0000 UTC m=+69.262963975" Jan 13 20:58:25.823979 systemd[1]: Started sshd@7-139.178.70.103:22-147.75.109.163:44038.service - OpenSSH per-connection server daemon (147.75.109.163:44038). Jan 13 20:58:25.962532 sshd[6954]: Accepted publickey for core from 147.75.109.163 port 44038 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:58:25.971736 sshd-session[6954]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:58:25.975153 systemd-logind[1519]: New session 10 of user core. Jan 13 20:58:25.978944 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 13 20:58:26.707256 sshd[6956]: Connection closed by 147.75.109.163 port 44038
Jan 13 20:58:26.712068 systemd-logind[1519]: Session 10 logged out. Waiting for processes to exit.
Jan 13 20:58:26.707807 sshd-session[6954]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:26.712200 systemd[1]: sshd@7-139.178.70.103:22-147.75.109.163:44038.service: Deactivated successfully.
Jan 13 20:58:26.714013 systemd[1]: session-10.scope: Deactivated successfully.
Jan 13 20:58:26.715276 systemd-logind[1519]: Removed session 10.
Jan 13 20:58:31.718485 systemd[1]: Started sshd@8-139.178.70.103:22-147.75.109.163:34442.service - OpenSSH per-connection server daemon (147.75.109.163:34442).
Jan 13 20:58:31.821787 sshd[7016]: Accepted publickey for core from 147.75.109.163 port 34442 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:31.823642 sshd-session[7016]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:31.827287 systemd-logind[1519]: New session 11 of user core.
Jan 13 20:58:31.833088 systemd[1]: Started session-11.scope - Session 11 of User core.
Jan 13 20:58:32.419280 sshd[7018]: Connection closed by 147.75.109.163 port 34442
Jan 13 20:58:32.418884 sshd-session[7016]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:32.421184 systemd[1]: sshd@8-139.178.70.103:22-147.75.109.163:34442.service: Deactivated successfully.
Jan 13 20:58:32.422382 systemd[1]: session-11.scope: Deactivated successfully.
Jan 13 20:58:32.423008 systemd-logind[1519]: Session 11 logged out. Waiting for processes to exit.
Jan 13 20:58:32.423583 systemd-logind[1519]: Removed session 11.
Jan 13 20:58:37.433984 systemd[1]: Started sshd@9-139.178.70.103:22-147.75.109.163:38816.service - OpenSSH per-connection server daemon (147.75.109.163:38816).
Jan 13 20:58:37.488693 sshd[7030]: Accepted publickey for core from 147.75.109.163 port 38816 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:37.489584 sshd-session[7030]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:37.492962 systemd-logind[1519]: New session 12 of user core.
Jan 13 20:58:37.498955 systemd[1]: Started session-12.scope - Session 12 of User core.
Jan 13 20:58:37.647845 sshd[7032]: Connection closed by 147.75.109.163 port 38816
Jan 13 20:58:37.648737 sshd-session[7030]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:37.654736 systemd[1]: sshd@9-139.178.70.103:22-147.75.109.163:38816.service: Deactivated successfully.
Jan 13 20:58:37.655711 systemd[1]: session-12.scope: Deactivated successfully.
Jan 13 20:58:37.656570 systemd-logind[1519]: Session 12 logged out. Waiting for processes to exit.
Jan 13 20:58:37.661034 systemd[1]: Started sshd@10-139.178.70.103:22-147.75.109.163:38820.service - OpenSSH per-connection server daemon (147.75.109.163:38820).
Jan 13 20:58:37.663128 systemd-logind[1519]: Removed session 12.
Jan 13 20:58:37.697668 sshd[7043]: Accepted publickey for core from 147.75.109.163 port 38820 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:37.698794 sshd-session[7043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:37.702308 systemd-logind[1519]: New session 13 of user core.
Jan 13 20:58:37.711931 systemd[1]: Started session-13.scope - Session 13 of User core.
Jan 13 20:58:37.906004 sshd[7045]: Connection closed by 147.75.109.163 port 38820
Jan 13 20:58:37.907801 sshd-session[7043]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:37.912801 systemd[1]: sshd@10-139.178.70.103:22-147.75.109.163:38820.service: Deactivated successfully.
Jan 13 20:58:37.915020 systemd[1]: session-13.scope: Deactivated successfully.
Jan 13 20:58:37.916951 systemd-logind[1519]: Session 13 logged out. Waiting for processes to exit.
Jan 13 20:58:37.920600 systemd[1]: Started sshd@11-139.178.70.103:22-147.75.109.163:38834.service - OpenSSH per-connection server daemon (147.75.109.163:38834).
Jan 13 20:58:37.924459 systemd-logind[1519]: Removed session 13.
Jan 13 20:58:37.969950 sshd[7053]: Accepted publickey for core from 147.75.109.163 port 38834 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:37.971019 sshd-session[7053]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:37.974249 systemd-logind[1519]: New session 14 of user core.
Jan 13 20:58:37.978960 systemd[1]: Started session-14.scope - Session 14 of User core.
Jan 13 20:58:38.077212 sshd[7055]: Connection closed by 147.75.109.163 port 38834
Jan 13 20:58:38.077619 sshd-session[7053]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:38.079682 systemd-logind[1519]: Session 14 logged out. Waiting for processes to exit.
Jan 13 20:58:38.079955 systemd[1]: sshd@11-139.178.70.103:22-147.75.109.163:38834.service: Deactivated successfully.
Jan 13 20:58:38.081014 systemd[1]: session-14.scope: Deactivated successfully.
Jan 13 20:58:38.081708 systemd-logind[1519]: Removed session 14.
Jan 13 20:58:43.087445 systemd[1]: Started sshd@12-139.178.70.103:22-147.75.109.163:38846.service - OpenSSH per-connection server daemon (147.75.109.163:38846).
Jan 13 20:58:43.126426 sshd[7065]: Accepted publickey for core from 147.75.109.163 port 38846 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:43.127683 sshd-session[7065]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:43.132902 systemd-logind[1519]: New session 15 of user core.
Jan 13 20:58:43.138191 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 13 20:58:43.272878 sshd[7068]: Connection closed by 147.75.109.163 port 38846
Jan 13 20:58:43.273444 sshd-session[7065]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:43.276109 systemd[1]: sshd@12-139.178.70.103:22-147.75.109.163:38846.service: Deactivated successfully.
Jan 13 20:58:43.277377 systemd[1]: session-15.scope: Deactivated successfully.
Jan 13 20:58:43.277979 systemd-logind[1519]: Session 15 logged out. Waiting for processes to exit.
Jan 13 20:58:43.278617 systemd-logind[1519]: Removed session 15.
Jan 13 20:58:48.280905 systemd[1]: Started sshd@13-139.178.70.103:22-147.75.109.163:58730.service - OpenSSH per-connection server daemon (147.75.109.163:58730).
Jan 13 20:58:48.318544 sshd[7092]: Accepted publickey for core from 147.75.109.163 port 58730 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:48.319555 sshd-session[7092]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:48.322954 systemd-logind[1519]: New session 16 of user core.
Jan 13 20:58:48.332929 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 13 20:58:48.446749 sshd[7094]: Connection closed by 147.75.109.163 port 58730
Jan 13 20:58:48.448920 sshd-session[7092]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:48.460214 systemd[1]: sshd@13-139.178.70.103:22-147.75.109.163:58730.service: Deactivated successfully.
Jan 13 20:58:48.461565 systemd[1]: session-16.scope: Deactivated successfully.
Jan 13 20:58:48.462486 systemd-logind[1519]: Session 16 logged out. Waiting for processes to exit.
Jan 13 20:58:48.466069 systemd[1]: Started sshd@14-139.178.70.103:22-147.75.109.163:58736.service - OpenSSH per-connection server daemon (147.75.109.163:58736).
Jan 13 20:58:48.467284 systemd-logind[1519]: Removed session 16.
Jan 13 20:58:48.514855 sshd[7104]: Accepted publickey for core from 147.75.109.163 port 58736 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:48.522141 sshd-session[7104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:48.524845 systemd-logind[1519]: New session 17 of user core.
Jan 13 20:58:48.536968 systemd[1]: Started session-17.scope - Session 17 of User core.
Jan 13 20:58:49.098097 sshd[7106]: Connection closed by 147.75.109.163 port 58736
Jan 13 20:58:49.100637 sshd-session[7104]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:49.104299 systemd[1]: sshd@14-139.178.70.103:22-147.75.109.163:58736.service: Deactivated successfully.
Jan 13 20:58:49.105380 systemd[1]: session-17.scope: Deactivated successfully.
Jan 13 20:58:49.106185 systemd-logind[1519]: Session 17 logged out. Waiting for processes to exit.
Jan 13 20:58:49.111025 systemd[1]: Started sshd@15-139.178.70.103:22-147.75.109.163:58742.service - OpenSSH per-connection server daemon (147.75.109.163:58742).
Jan 13 20:58:49.112099 systemd-logind[1519]: Removed session 17.
Jan 13 20:58:49.158299 sshd[7115]: Accepted publickey for core from 147.75.109.163 port 58742 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:49.159225 sshd-session[7115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:49.162749 systemd-logind[1519]: New session 18 of user core.
Jan 13 20:58:49.167916 systemd[1]: Started session-18.scope - Session 18 of User core.
Jan 13 20:58:50.800931 sshd[7117]: Connection closed by 147.75.109.163 port 58742
Jan 13 20:58:50.809433 sshd-session[7115]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:50.813135 systemd[1]: Started sshd@16-139.178.70.103:22-147.75.109.163:58744.service - OpenSSH per-connection server daemon (147.75.109.163:58744).
Jan 13 20:58:50.829037 systemd[1]: sshd@15-139.178.70.103:22-147.75.109.163:58742.service: Deactivated successfully.
Jan 13 20:58:50.830151 systemd[1]: session-18.scope: Deactivated successfully.
Jan 13 20:58:50.831002 systemd-logind[1519]: Session 18 logged out. Waiting for processes to exit.
Jan 13 20:58:50.832172 systemd-logind[1519]: Removed session 18.
Jan 13 20:58:50.914751 sshd[7127]: Accepted publickey for core from 147.75.109.163 port 58744 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:50.916142 sshd-session[7127]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:50.919614 systemd-logind[1519]: New session 19 of user core.
Jan 13 20:58:50.927993 systemd[1]: Started session-19.scope - Session 19 of User core.
Jan 13 20:58:51.563074 sshd[7135]: Connection closed by 147.75.109.163 port 58744
Jan 13 20:58:51.565302 sshd-session[7127]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:51.573846 systemd[1]: sshd@16-139.178.70.103:22-147.75.109.163:58744.service: Deactivated successfully.
Jan 13 20:58:51.575868 systemd[1]: session-19.scope: Deactivated successfully.
Jan 13 20:58:51.578654 systemd-logind[1519]: Session 19 logged out. Waiting for processes to exit.
Jan 13 20:58:51.586074 systemd[1]: Started sshd@17-139.178.70.103:22-147.75.109.163:58752.service - OpenSSH per-connection server daemon (147.75.109.163:58752).
Jan 13 20:58:51.588170 systemd-logind[1519]: Removed session 19.
Jan 13 20:58:51.638884 sshd[7144]: Accepted publickey for core from 147.75.109.163 port 58752 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:51.639216 sshd-session[7144]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:51.642877 systemd-logind[1519]: New session 20 of user core.
Jan 13 20:58:51.647925 systemd[1]: Started session-20.scope - Session 20 of User core.
Jan 13 20:58:51.787430 sshd[7146]: Connection closed by 147.75.109.163 port 58752
Jan 13 20:58:51.789639 systemd[1]: sshd@17-139.178.70.103:22-147.75.109.163:58752.service: Deactivated successfully.
Jan 13 20:58:51.787993 sshd-session[7144]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:51.791031 systemd[1]: session-20.scope: Deactivated successfully.
Jan 13 20:58:51.792183 systemd-logind[1519]: Session 20 logged out. Waiting for processes to exit.
Jan 13 20:58:51.792714 systemd-logind[1519]: Removed session 20.
Jan 13 20:58:54.329418 containerd[1537]: time="2025-01-13T20:58:54.311196558Z" level=info msg="StopPodSandbox for \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\""
Jan 13 20:58:54.381930 containerd[1537]: time="2025-01-13T20:58:54.363390171Z" level=info msg="TearDown network for sandbox \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\" successfully"
Jan 13 20:58:54.381930 containerd[1537]: time="2025-01-13T20:58:54.381927937Z" level=info msg="StopPodSandbox for \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\" returns successfully"
Jan 13 20:58:54.387578 containerd[1537]: time="2025-01-13T20:58:54.387554926Z" level=info msg="RemovePodSandbox for \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\""
Jan 13 20:58:54.387632 containerd[1537]: time="2025-01-13T20:58:54.387588388Z" level=info msg="Forcibly stopping sandbox \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\""
Jan 13 20:58:54.387719 containerd[1537]: time="2025-01-13T20:58:54.387631537Z" level=info msg="TearDown network for sandbox \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\" successfully"
Jan 13 20:58:54.434793 containerd[1537]: time="2025-01-13T20:58:54.434751455Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:58:54.434964 containerd[1537]: time="2025-01-13T20:58:54.434814695Z" level=info msg="RemovePodSandbox \"fc81c1524318296f26aa30eb0a134194e470ece1d93c3d458620bc89289b4c9f\" returns successfully"
Jan 13 20:58:54.435875 containerd[1537]: time="2025-01-13T20:58:54.435301124Z" level=info msg="StopPodSandbox for \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\""
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:54.878 [WARNING][7169] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0"
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:54.883 [INFO][7169] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba"
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:54.883 [INFO][7169] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" iface="eth0" netns=""
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:54.883 [INFO][7169] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba"
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:54.883 [INFO][7169] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba"
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:55.076 [INFO][7175] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0"
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:55.080 [INFO][7175] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:55.080 [INFO][7175] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:55.088 [WARNING][7175] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0"
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:55.088 [INFO][7175] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0"
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:55.089 [INFO][7175] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 13 20:58:55.092428 containerd[1537]: 2025-01-13 20:58:55.090 [INFO][7169] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba"
Jan 13 20:58:55.093517 containerd[1537]: time="2025-01-13T20:58:55.092438932Z" level=info msg="TearDown network for sandbox \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\" successfully"
Jan 13 20:58:55.093517 containerd[1537]: time="2025-01-13T20:58:55.092454689Z" level=info msg="StopPodSandbox for \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\" returns successfully"
Jan 13 20:58:55.093517 containerd[1537]: time="2025-01-13T20:58:55.092850504Z" level=info msg="RemovePodSandbox for \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\""
Jan 13 20:58:55.093517 containerd[1537]: time="2025-01-13T20:58:55.092867081Z" level=info msg="Forcibly stopping sandbox \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\""
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.117 [WARNING][7193] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0"
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.117 [INFO][7193] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba"
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.117 [INFO][7193] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" iface="eth0" netns=""
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.117 [INFO][7193] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba"
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.117 [INFO][7193] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba"
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.132 [INFO][7199] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0"
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.132 [INFO][7199] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.132 [INFO][7199] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.136 [WARNING][7199] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0"
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.136 [INFO][7199] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" HandleID="k8s-pod-network.1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba" Workload="localhost-k8s-calico--kube--controllers--6df8b96c48--4j6ml-eth0"
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.136 [INFO][7199] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 13 20:58:55.140375 containerd[1537]: 2025-01-13 20:58:55.138 [INFO][7193] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba"
Jan 13 20:58:55.143879 containerd[1537]: time="2025-01-13T20:58:55.140456707Z" level=info msg="TearDown network for sandbox \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\" successfully"
Jan 13 20:58:55.144208 containerd[1537]: time="2025-01-13T20:58:55.144044733Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:58:55.144208 containerd[1537]: time="2025-01-13T20:58:55.144075315Z" level=info msg="RemovePodSandbox \"1623055bf360e3108a6ef596810b2597a1e2e681fe9394fc18d97623f847a9ba\" returns successfully"
Jan 13 20:58:55.144885 containerd[1537]: time="2025-01-13T20:58:55.144361522Z" level=info msg="StopPodSandbox for \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\""
Jan 13 20:58:55.144885 containerd[1537]: time="2025-01-13T20:58:55.144419339Z" level=info msg="TearDown network for sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" successfully"
Jan 13 20:58:55.144885 containerd[1537]: time="2025-01-13T20:58:55.144429523Z" level=info msg="StopPodSandbox for \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" returns successfully"
Jan 13 20:58:55.144885 containerd[1537]: time="2025-01-13T20:58:55.144618812Z" level=info msg="RemovePodSandbox for \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\""
Jan 13 20:58:55.144885 containerd[1537]: time="2025-01-13T20:58:55.144635791Z" level=info msg="Forcibly stopping sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\""
Jan 13 20:58:55.144885 containerd[1537]: time="2025-01-13T20:58:55.144668488Z" level=info msg="TearDown network for sandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" successfully"
Jan 13 20:58:55.151810 containerd[1537]: time="2025-01-13T20:58:55.151747521Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:58:55.151810 containerd[1537]: time="2025-01-13T20:58:55.151771731Z" level=info msg="RemovePodSandbox \"398254a1c4ef04b96566a48a26cbee333f42fb9b1e7bf4a06513b22d57c9c2df\" returns successfully"
Jan 13 20:58:56.810417 systemd[1]: Started sshd@18-139.178.70.103:22-147.75.109.163:58768.service - OpenSSH per-connection server daemon (147.75.109.163:58768).
Jan 13 20:58:57.093179 sshd[7209]: Accepted publickey for core from 147.75.109.163 port 58768 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:58:57.096992 sshd-session[7209]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:58:57.105247 systemd-logind[1519]: New session 21 of user core.
Jan 13 20:58:57.109212 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 13 20:58:57.548393 sshd[7211]: Connection closed by 147.75.109.163 port 58768
Jan 13 20:58:57.547514 sshd-session[7209]: pam_unix(sshd:session): session closed for user core
Jan 13 20:58:57.551213 systemd[1]: sshd@18-139.178.70.103:22-147.75.109.163:58768.service: Deactivated successfully.
Jan 13 20:58:57.552342 systemd[1]: session-21.scope: Deactivated successfully.
Jan 13 20:58:57.555659 systemd-logind[1519]: Session 21 logged out. Waiting for processes to exit.
Jan 13 20:58:57.557773 systemd-logind[1519]: Removed session 21.
Jan 13 20:59:02.556122 systemd[1]: Started sshd@19-139.178.70.103:22-147.75.109.163:55708.service - OpenSSH per-connection server daemon (147.75.109.163:55708).
Jan 13 20:59:02.759996 sshd[7287]: Accepted publickey for core from 147.75.109.163 port 55708 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:59:02.760763 sshd-session[7287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:59:02.766357 systemd-logind[1519]: New session 22 of user core.
Jan 13 20:59:02.769937 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 13 20:59:03.434723 sshd[7289]: Connection closed by 147.75.109.163 port 55708
Jan 13 20:59:03.435758 sshd-session[7287]: pam_unix(sshd:session): session closed for user core
Jan 13 20:59:03.439314 systemd[1]: sshd@19-139.178.70.103:22-147.75.109.163:55708.service: Deactivated successfully.
Jan 13 20:59:03.441513 systemd[1]: session-22.scope: Deactivated successfully.
Jan 13 20:59:03.442524 systemd-logind[1519]: Session 22 logged out. Waiting for processes to exit.
Jan 13 20:59:03.443250 systemd-logind[1519]: Removed session 22.
Jan 13 20:59:08.444975 systemd[1]: Started sshd@20-139.178.70.103:22-147.75.109.163:57636.service - OpenSSH per-connection server daemon (147.75.109.163:57636).
Jan 13 20:59:08.535849 sshd[7300]: Accepted publickey for core from 147.75.109.163 port 57636 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:59:08.536391 sshd-session[7300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:59:08.539465 systemd-logind[1519]: New session 23 of user core.
Jan 13 20:59:08.548201 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 13 20:59:08.808813 sshd[7302]: Connection closed by 147.75.109.163 port 57636
Jan 13 20:59:08.809385 sshd-session[7300]: pam_unix(sshd:session): session closed for user core
Jan 13 20:59:08.811175 systemd-logind[1519]: Session 23 logged out. Waiting for processes to exit.
Jan 13 20:59:08.811486 systemd[1]: sshd@20-139.178.70.103:22-147.75.109.163:57636.service: Deactivated successfully.
Jan 13 20:59:08.812981 systemd[1]: session-23.scope: Deactivated successfully.
Jan 13 20:59:08.814187 systemd-logind[1519]: Removed session 23.