Mar 17 17:53:02.734310 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 17 16:07:40 -00 2025
Mar 17 17:53:02.734327 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=d4b838cd9a6f58e8c4a6b615c32b0b28ee0df1660e34033a8fbd0429c6de5fd0
Mar 17 17:53:02.734333 kernel: Disabled fast string operations
Mar 17 17:53:02.736668 kernel: BIOS-provided physical RAM map:
Mar 17 17:53:02.736678 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Mar 17 17:53:02.736683 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Mar 17 17:53:02.736690 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Mar 17 17:53:02.736699 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Mar 17 17:53:02.736704 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Mar 17 17:53:02.736708 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Mar 17 17:53:02.736712 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Mar 17 17:53:02.736717 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Mar 17 17:53:02.736721 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Mar 17 17:53:02.736725 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Mar 17 17:53:02.736732 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Mar 17 17:53:02.736737 kernel: NX (Execute Disable) protection: active
Mar 17 17:53:02.736741 kernel: APIC: Static calls initialized
Mar 17 17:53:02.736747 kernel: SMBIOS 2.7 present.
Mar 17 17:53:02.736752 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Mar 17 17:53:02.736756 kernel: vmware: hypercall mode: 0x00
Mar 17 17:53:02.736761 kernel: Hypervisor detected: VMware
Mar 17 17:53:02.736766 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Mar 17 17:53:02.736772 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Mar 17 17:53:02.736777 kernel: vmware: using clock offset of 2738225074 ns
Mar 17 17:53:02.736782 kernel: tsc: Detected 3408.000 MHz processor
Mar 17 17:53:02.736787 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 17 17:53:02.736793 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 17 17:53:02.736797 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Mar 17 17:53:02.736802 kernel: total RAM covered: 3072M
Mar 17 17:53:02.736807 kernel: Found optimal setting for mtrr clean up
Mar 17 17:53:02.736813 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Mar 17 17:53:02.736819 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Mar 17 17:53:02.736824 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 17 17:53:02.736828 kernel: Using GB pages for direct mapping
Mar 17 17:53:02.736833 kernel: ACPI: Early table checksum verification disabled
Mar 17 17:53:02.736838 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Mar 17 17:53:02.736843 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Mar 17 17:53:02.736848 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Mar 17 17:53:02.736853 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Mar 17 17:53:02.736858 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Mar 17 17:53:02.736866 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Mar 17 17:53:02.736871 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Mar 17 17:53:02.736876 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Mar 17 17:53:02.736881 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Mar 17 17:53:02.736886 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Mar 17 17:53:02.736893 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Mar 17 17:53:02.736898 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Mar 17 17:53:02.736903 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Mar 17 17:53:02.736909 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Mar 17 17:53:02.736914 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Mar 17 17:53:02.736919 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Mar 17 17:53:02.736924 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Mar 17 17:53:02.736929 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Mar 17 17:53:02.736934 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Mar 17 17:53:02.736939 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Mar 17 17:53:02.736946 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Mar 17 17:53:02.736951 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Mar 17 17:53:02.736956 kernel: system APIC only can use physical flat
Mar 17 17:53:02.736961 kernel: APIC: Switched APIC routing to: physical flat
Mar 17 17:53:02.736966 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 17 17:53:02.736971 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Mar 17 17:53:02.736976 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Mar 17 17:53:02.736981 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Mar 17 17:53:02.736986 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Mar 17 17:53:02.736992 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Mar 17 17:53:02.736997 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Mar 17 17:53:02.737002 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Mar 17 17:53:02.737007 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Mar 17 17:53:02.737012 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Mar 17 17:53:02.737018 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Mar 17 17:53:02.737022 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Mar 17 17:53:02.737028 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Mar 17 17:53:02.737033 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Mar 17 17:53:02.737038 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Mar 17 17:53:02.737044 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Mar 17 17:53:02.737049 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Mar 17 17:53:02.737054 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Mar 17 17:53:02.737059 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Mar 17 17:53:02.737064 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Mar 17 17:53:02.737069 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Mar 17 17:53:02.737074 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Mar 17 17:53:02.737079 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Mar 17 17:53:02.737084 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Mar 17 17:53:02.737089 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Mar 17 17:53:02.737095 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Mar 17 17:53:02.737100 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Mar 17 17:53:02.737106 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Mar 17 17:53:02.737111 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Mar 17 17:53:02.737116 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Mar 17 17:53:02.737121 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Mar 17 17:53:02.737126 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Mar 17 17:53:02.737131 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Mar 17 17:53:02.737136 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Mar 17 17:53:02.737141 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Mar 17 17:53:02.737147 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Mar 17 17:53:02.737152 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Mar 17 17:53:02.737157 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Mar 17 17:53:02.737162 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Mar 17 17:53:02.737167 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Mar 17 17:53:02.737172 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Mar 17 17:53:02.737177 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Mar 17 17:53:02.737183 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Mar 17 17:53:02.737188 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Mar 17 17:53:02.737193 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Mar 17 17:53:02.737198 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Mar 17 17:53:02.737204 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Mar 17 17:53:02.737209 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Mar 17 17:53:02.737214 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Mar 17 17:53:02.737219 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Mar 17 17:53:02.737224 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Mar 17 17:53:02.737229 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Mar 17 17:53:02.737234 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Mar 17 17:53:02.737239 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Mar 17 17:53:02.737244 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Mar 17 17:53:02.737249 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Mar 17 17:53:02.737255 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Mar 17 17:53:02.737260 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Mar 17 17:53:02.737266 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Mar 17 17:53:02.737275 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Mar 17 17:53:02.737280 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Mar 17 17:53:02.737285 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Mar 17 17:53:02.737291 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Mar 17 17:53:02.737296 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Mar 17 17:53:02.737302 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Mar 17 17:53:02.737308 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Mar 17 17:53:02.737313 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Mar 17 17:53:02.737318 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Mar 17 17:53:02.737324 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Mar 17 17:53:02.737329 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Mar 17 17:53:02.737334 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Mar 17 17:53:02.737348 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Mar 17 17:53:02.737354 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Mar 17 17:53:02.737359 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Mar 17 17:53:02.737365 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Mar 17 17:53:02.737372 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Mar 17 17:53:02.737377 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Mar 17 17:53:02.737383 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Mar 17 17:53:02.737388 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Mar 17 17:53:02.737393 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Mar 17 17:53:02.737399 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Mar 17 17:53:02.737404 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Mar 17 17:53:02.737410 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Mar 17 17:53:02.737415 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Mar 17 17:53:02.737420 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Mar 17 17:53:02.737427 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Mar 17 17:53:02.737432 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Mar 17 17:53:02.737438 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Mar 17 17:53:02.737443 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Mar 17 17:53:02.737448 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Mar 17 17:53:02.737454 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Mar 17 17:53:02.737459 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Mar 17 17:53:02.737464 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Mar 17 17:53:02.737470 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Mar 17 17:53:02.737475 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Mar 17 17:53:02.737481 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Mar 17 17:53:02.737487 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Mar 17 17:53:02.737492 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Mar 17 17:53:02.737497 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Mar 17 17:53:02.737503 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Mar 17 17:53:02.737509 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Mar 17 17:53:02.737514 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Mar 17 17:53:02.737519 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Mar 17 17:53:02.737525 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Mar 17 17:53:02.737530 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Mar 17 17:53:02.737536 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Mar 17 17:53:02.737542 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Mar 17 17:53:02.737547 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Mar 17 17:53:02.737553 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Mar 17 17:53:02.737558 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Mar 17 17:53:02.737563 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Mar 17 17:53:02.737569 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Mar 17 17:53:02.737574 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Mar 17 17:53:02.737579 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Mar 17 17:53:02.737584 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Mar 17 17:53:02.737591 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Mar 17 17:53:02.737596 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Mar 17 17:53:02.737601 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Mar 17 17:53:02.737607 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Mar 17 17:53:02.737612 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Mar 17 17:53:02.737618 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Mar 17 17:53:02.737626 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Mar 17 17:53:02.737632 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Mar 17 17:53:02.737641 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Mar 17 17:53:02.737652 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Mar 17 17:53:02.737662 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Mar 17 17:53:02.737670 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Mar 17 17:53:02.737681 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Mar 17 17:53:02.737687 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 17 17:53:02.737693 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Mar 17 17:53:02.737698 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Mar 17 17:53:02.737704 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Mar 17 17:53:02.737710 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Mar 17 17:53:02.737715 kernel: Zone ranges:
Mar 17 17:53:02.737721 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 17 17:53:02.737728 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Mar 17 17:53:02.737734 kernel: Normal empty
Mar 17 17:53:02.737739 kernel: Movable zone start for each node
Mar 17 17:53:02.737745 kernel: Early memory node ranges
Mar 17 17:53:02.737750 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Mar 17 17:53:02.737756 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Mar 17 17:53:02.737761 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Mar 17 17:53:02.737766 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Mar 17 17:53:02.737772 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 17 17:53:02.737777 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Mar 17 17:53:02.737784 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Mar 17 17:53:02.737790 kernel: ACPI: PM-Timer IO Port: 0x1008
Mar 17 17:53:02.737795 kernel: system APIC only can use physical flat
Mar 17 17:53:02.737800 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Mar 17 17:53:02.737806 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Mar 17 17:53:02.737811 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Mar 17 17:53:02.737817 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Mar 17 17:53:02.737822 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Mar 17 17:53:02.737828 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Mar 17 17:53:02.737834 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Mar 17 17:53:02.737840 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Mar 17 17:53:02.737845 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Mar 17 17:53:02.737851 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Mar 17 17:53:02.737856 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Mar 17 17:53:02.737861 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Mar 17 17:53:02.737867 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Mar 17 17:53:02.737872 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Mar 17 17:53:02.737878 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Mar 17 17:53:02.737883 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Mar 17 17:53:02.737890 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Mar 17 17:53:02.737895 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Mar 17 17:53:02.737900 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Mar 17 17:53:02.737906 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Mar 17 17:53:02.737911 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Mar 17 17:53:02.737916 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Mar 17 17:53:02.737922 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Mar 17 17:53:02.737928 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Mar 17 17:53:02.737933 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Mar 17 17:53:02.737940 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Mar 17 17:53:02.737945 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Mar 17 17:53:02.737950 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Mar 17 17:53:02.737956 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Mar 17 17:53:02.737961 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Mar 17 17:53:02.737967 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Mar 17 17:53:02.737972 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Mar 17 17:53:02.737978 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Mar 17 17:53:02.737983 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Mar 17 17:53:02.737989 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Mar 17 17:53:02.737995 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Mar 17 17:53:02.738000 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Mar 17 17:53:02.738006 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Mar 17 17:53:02.738011 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Mar 17 17:53:02.738017 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Mar 17 17:53:02.738022 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Mar 17 17:53:02.738027 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Mar 17 17:53:02.738033 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Mar 17 17:53:02.738038 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Mar 17 17:53:02.738044 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Mar 17 17:53:02.738050 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Mar 17 17:53:02.738056 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Mar 17 17:53:02.738061 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Mar 17 17:53:02.738067 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Mar 17 17:53:02.738072 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Mar 17 17:53:02.738077 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Mar 17 17:53:02.738083 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Mar 17 17:53:02.738088 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Mar 17 17:53:02.738094 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Mar 17 17:53:02.738099 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Mar 17 17:53:02.738105 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Mar 17 17:53:02.738111 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Mar 17 17:53:02.738116 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Mar 17 17:53:02.738122 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Mar 17 17:53:02.738127 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Mar 17 17:53:02.738132 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Mar 17 17:53:02.738138 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Mar 17 17:53:02.738143 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Mar 17 17:53:02.738149 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Mar 17 17:53:02.738155 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Mar 17 17:53:02.738160 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Mar 17 17:53:02.738166 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Mar 17 17:53:02.738171 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Mar 17 17:53:02.738177 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Mar 17 17:53:02.738182 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Mar 17 17:53:02.738187 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Mar 17 17:53:02.738193 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Mar 17 17:53:02.738198 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Mar 17 17:53:02.738204 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Mar 17 17:53:02.738210 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Mar 17 17:53:02.738216 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Mar 17 17:53:02.738221 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Mar 17 17:53:02.738227 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Mar 17 17:53:02.738232 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Mar 17 17:53:02.738237 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Mar 17 17:53:02.738243 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Mar 17 17:53:02.738248 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Mar 17 17:53:02.738254 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Mar 17 17:53:02.738259 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Mar 17 17:53:02.738265 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Mar 17 17:53:02.738271 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Mar 17 17:53:02.738276 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Mar 17 17:53:02.738282 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Mar 17 17:53:02.738287 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Mar 17 17:53:02.738293 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Mar 17 17:53:02.738298 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Mar 17 17:53:02.738303 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Mar 17 17:53:02.738313 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Mar 17 17:53:02.738321 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Mar 17 17:53:02.738327 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Mar 17 17:53:02.738332 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Mar 17 17:53:02.739360 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Mar 17 17:53:02.739371 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Mar 17 17:53:02.739378 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Mar 17 17:53:02.739383 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Mar 17 17:53:02.739389 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Mar 17 17:53:02.739394 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Mar 17 17:53:02.739400 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Mar 17 17:53:02.739408 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Mar 17 17:53:02.739413 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Mar 17 17:53:02.739419 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Mar 17 17:53:02.739424 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Mar 17 17:53:02.739430 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Mar 17 17:53:02.739435 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Mar 17 17:53:02.739441 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Mar 17 17:53:02.739446 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Mar 17 17:53:02.739452 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Mar 17 17:53:02.739457 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Mar 17 17:53:02.739464 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Mar 17 17:53:02.739469 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Mar 17 17:53:02.739475 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Mar 17 17:53:02.739480 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Mar 17 17:53:02.739485 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Mar 17 17:53:02.739491 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Mar 17 17:53:02.739497 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Mar 17 17:53:02.739502 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Mar 17 17:53:02.739507 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Mar 17 17:53:02.739514 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Mar 17 17:53:02.739519 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Mar 17 17:53:02.739525 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Mar 17 17:53:02.739530 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Mar 17 17:53:02.739536 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Mar 17 17:53:02.739541 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Mar 17 17:53:02.739547 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Mar 17 17:53:02.739552 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Mar 17 17:53:02.739558 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 17 17:53:02.739564 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Mar 17 17:53:02.739571 kernel: TSC deadline timer available
Mar 17 17:53:02.739576 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Mar 17 17:53:02.739582 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Mar 17 17:53:02.739587 kernel: Booting paravirtualized kernel on VMware hypervisor
Mar 17 17:53:02.739593 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 17 17:53:02.739599 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Mar 17 17:53:02.739604 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Mar 17 17:53:02.739610 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Mar 17 17:53:02.739615 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Mar 17 17:53:02.739622 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Mar 17 17:53:02.739628 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Mar 17 17:53:02.739633 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Mar 17 17:53:02.739638 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Mar 17 17:53:02.739651 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Mar 17 17:53:02.739658 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Mar 17 17:53:02.739663 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Mar 17 17:53:02.739669 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Mar 17 17:53:02.739675 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Mar 17 17:53:02.739681 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Mar 17 17:53:02.739687 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Mar 17 17:53:02.739693 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Mar 17 17:53:02.739698 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Mar 17 17:53:02.739704 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Mar 17 17:53:02.739710 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Mar 17 17:53:02.739716 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=d4b838cd9a6f58e8c4a6b615c32b0b28ee0df1660e34033a8fbd0429c6de5fd0
Mar 17 17:53:02.739724 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 17 17:53:02.739729 kernel: random: crng init done
Mar 17 17:53:02.739735 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Mar 17 17:53:02.739741 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Mar 17 17:53:02.739747 kernel: printk: log_buf_len min size: 262144 bytes
Mar 17 17:53:02.739753 kernel: printk: log_buf_len: 1048576 bytes
Mar 17 17:53:02.739759 kernel: printk: early log buf free: 239648(91%)
Mar 17 17:53:02.739765 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 17 17:53:02.739771 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 17 17:53:02.739777 kernel: Fallback order for Node 0: 0
Mar 17 17:53:02.739784 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Mar 17 17:53:02.739789 kernel: Policy zone: DMA32
Mar 17 17:53:02.739795 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 17 17:53:02.739802 kernel: Memory: 1936384K/2096628K available (12288K kernel code, 2303K rwdata, 22744K rodata, 42992K init, 2196K bss, 159984K reserved, 0K cma-reserved)
Mar 17 17:53:02.739808 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Mar 17 17:53:02.739815 kernel: ftrace: allocating 37938 entries in 149 pages
Mar 17 17:53:02.739821 kernel: ftrace: allocated 149 pages with 4 groups
Mar 17 17:53:02.739827 kernel: Dynamic Preempt: voluntary
Mar 17 17:53:02.739833 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 17 17:53:02.739839 kernel: rcu: RCU event tracing is enabled.
Mar 17 17:53:02.739845 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Mar 17 17:53:02.739851 kernel: Trampoline variant of Tasks RCU enabled.
Mar 17 17:53:02.739857 kernel: Rude variant of Tasks RCU enabled.
Mar 17 17:53:02.739863 kernel: Tracing variant of Tasks RCU enabled.
Mar 17 17:53:02.739869 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 17 17:53:02.739876 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Mar 17 17:53:02.739881 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Mar 17 17:53:02.739887 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Mar 17 17:53:02.739893 kernel: Console: colour VGA+ 80x25
Mar 17 17:53:02.739899 kernel: printk: console [tty0] enabled
Mar 17 17:53:02.739905 kernel: printk: console [ttyS0] enabled
Mar 17 17:53:02.739912 kernel: ACPI: Core revision 20230628
Mar 17 17:53:02.739918 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Mar 17 17:53:02.739924 kernel: APIC: Switch to symmetric I/O mode setup
Mar 17 17:53:02.739931 kernel: x2apic enabled
Mar 17 17:53:02.739936 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 17 17:53:02.739942 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 17 17:53:02.739948 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Mar 17 17:53:02.739954 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Mar 17 17:53:02.739960 kernel: Disabled fast string operations
Mar 17 17:53:02.739966 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Mar 17 17:53:02.739972 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Mar 17 17:53:02.739978 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 17 17:53:02.739985 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Mar 17 17:53:02.739991 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Mar 17 17:53:02.739997 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 17 17:53:02.740003 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 17 17:53:02.740009 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Mar 17 17:53:02.740014 kernel: RETBleed: Mitigation: Enhanced IBRS
Mar 17 17:53:02.740020 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 17 17:53:02.740026 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 17 17:53:02.740032 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 17 17:53:02.740039 kernel: SRBDS: Unknown: Dependent on hypervisor status
Mar 17 17:53:02.740045 kernel: GDS: Unknown: Dependent on hypervisor status
Mar 17 17:53:02.740051 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 17 17:53:02.740057 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 17 17:53:02.740063 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 17 17:53:02.740068 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 17 17:53:02.740074 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Mar 17 17:53:02.740080 kernel: Freeing SMP alternatives memory: 32K
Mar 17 17:53:02.740087 kernel: pid_max: default: 131072 minimum: 1024
Mar 17 17:53:02.740093 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 17 17:53:02.740099 kernel: landlock: Up and running.
Mar 17 17:53:02.740105 kernel: SELinux: Initializing.
Mar 17 17:53:02.740111 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 17 17:53:02.740117 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 17 17:53:02.740123 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Mar 17 17:53:02.740129 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Mar 17 17:53:02.740135 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Mar 17 17:53:02.740142 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Mar 17 17:53:02.740148 kernel: Performance Events: Skylake events, core PMU driver.
Mar 17 17:53:02.740154 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Mar 17 17:53:02.740160 kernel: core: CPUID marked event: 'instructions' unavailable
Mar 17 17:53:02.740165 kernel: core: CPUID marked event: 'bus cycles' unavailable
Mar 17 17:53:02.740171 kernel: core: CPUID marked event: 'cache references' unavailable
Mar 17 17:53:02.740177 kernel: core: CPUID marked event: 'cache misses' unavailable
Mar 17 17:53:02.740182 kernel: core: CPUID marked event: 'branch instructions' unavailable
Mar 17 17:53:02.740188 kernel: core: CPUID marked event: 'branch misses' unavailable
Mar 17 17:53:02.740195 kernel: ... version: 1
Mar 17 17:53:02.740201 kernel: ... bit width: 48
Mar 17 17:53:02.740207 kernel: ... generic registers: 4
Mar 17 17:53:02.740213 kernel: ... value mask: 0000ffffffffffff
Mar 17 17:53:02.740219 kernel: ...
max period: 000000007fffffff Mar 17 17:53:02.740225 kernel: ... fixed-purpose events: 0 Mar 17 17:53:02.740230 kernel: ... event mask: 000000000000000f Mar 17 17:53:02.740236 kernel: signal: max sigframe size: 1776 Mar 17 17:53:02.740242 kernel: rcu: Hierarchical SRCU implementation. Mar 17 17:53:02.740249 kernel: rcu: Max phase no-delay instances is 400. Mar 17 17:53:02.740255 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Mar 17 17:53:02.740261 kernel: smp: Bringing up secondary CPUs ... Mar 17 17:53:02.740268 kernel: smpboot: x86: Booting SMP configuration: Mar 17 17:53:02.740274 kernel: .... node #0, CPUs: #1 Mar 17 17:53:02.740280 kernel: Disabled fast string operations Mar 17 17:53:02.740285 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Mar 17 17:53:02.740291 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Mar 17 17:53:02.740297 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 17:53:02.740303 kernel: smpboot: Max logical packages: 128 Mar 17 17:53:02.740310 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Mar 17 17:53:02.740316 kernel: devtmpfs: initialized Mar 17 17:53:02.740321 kernel: x86/mm: Memory block size: 128MB Mar 17 17:53:02.740327 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Mar 17 17:53:02.740333 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 17:53:02.741424 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Mar 17 17:53:02.741434 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 17:53:02.741440 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 17:53:02.741446 kernel: audit: initializing netlink subsys (disabled) Mar 17 17:53:02.741455 kernel: audit: type=2000 audit(1742233981.067:1): state=initialized audit_enabled=0 res=1 Mar 17 17:53:02.741461 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 17:53:02.741467 
kernel: thermal_sys: Registered thermal governor 'user_space' Mar 17 17:53:02.741473 kernel: cpuidle: using governor menu Mar 17 17:53:02.741479 kernel: Simple Boot Flag at 0x36 set to 0x80 Mar 17 17:53:02.741485 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 17:53:02.741491 kernel: dca service started, version 1.12.1 Mar 17 17:53:02.741497 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Mar 17 17:53:02.741503 kernel: PCI: Using configuration type 1 for base access Mar 17 17:53:02.741510 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Mar 17 17:53:02.741516 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 17:53:02.741522 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Mar 17 17:53:02.741528 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 17:53:02.741534 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Mar 17 17:53:02.741540 kernel: ACPI: Added _OSI(Module Device) Mar 17 17:53:02.741546 kernel: ACPI: Added _OSI(Processor Device) Mar 17 17:53:02.741551 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 17:53:02.741557 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 17:53:02.741565 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 17:53:02.741570 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Mar 17 17:53:02.741576 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Mar 17 17:53:02.741582 kernel: ACPI: Interpreter enabled Mar 17 17:53:02.741588 kernel: ACPI: PM: (supports S0 S1 S5) Mar 17 17:53:02.741594 kernel: ACPI: Using IOAPIC for interrupt routing Mar 17 17:53:02.741600 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Mar 17 17:53:02.741606 kernel: PCI: Using E820 reservations for host bridge windows Mar 17 17:53:02.741612 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Mar 17 17:53:02.741618 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Mar 17 17:53:02.741701 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 17 17:53:02.741756 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Mar 17 17:53:02.741804 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Mar 17 17:53:02.741813 kernel: PCI host bridge to bus 0000:00 Mar 17 17:53:02.741862 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Mar 17 17:53:02.741909 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Mar 17 17:53:02.741953 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Mar 17 17:53:02.741997 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Mar 17 17:53:02.742041 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Mar 17 17:53:02.742084 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Mar 17 17:53:02.742160 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Mar 17 17:53:02.742216 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Mar 17 17:53:02.742272 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Mar 17 17:53:02.742325 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Mar 17 17:53:02.746406 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Mar 17 17:53:02.746465 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Mar 17 17:53:02.746517 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Mar 17 17:53:02.746567 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Mar 17 17:53:02.746620 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Mar 17 17:53:02.746675 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Mar 17 17:53:02.746724 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Mar 17 17:53:02.746773 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Mar 17 17:53:02.746826 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Mar 17 17:53:02.746879 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Mar 17 17:53:02.746931 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Mar 17 17:53:02.746985 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Mar 17 17:53:02.747034 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Mar 17 17:53:02.747082 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Mar 17 17:53:02.747132 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Mar 17 17:53:02.747181 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Mar 17 17:53:02.747231 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Mar 17 17:53:02.747289 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Mar 17 17:53:02.747382 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747436 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747491 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747542 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747595 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747649 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747703 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747754 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747809 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747860 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747913 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747963 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Mar 17 17:53:02.748020 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.748070 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.748124 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.748175 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.748249 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.748301 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.749837 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.749894 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.749949 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.750000 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.750054 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.750109 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.750162 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.750213 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.750267 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.750323 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751415 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751471 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751525 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751575 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751643 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751695 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751748 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751798 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751854 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751904 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751957 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.752007 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.752059 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.752108 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.752164 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.752215 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.752268 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.752318 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755328 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755479 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755542 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755608 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755690 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755743 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755797 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755848 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755903 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755957 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756011 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.756062 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756126 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Mar 17 
17:53:02.756180 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756234 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.756288 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756348 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.756401 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756454 kernel: pci_bus 0000:01: extended config space not accessible Mar 17 17:53:02.756505 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Mar 17 17:53:02.756582 kernel: pci_bus 0000:02: extended config space not accessible Mar 17 17:53:02.756594 kernel: acpiphp: Slot [32] registered Mar 17 17:53:02.756601 kernel: acpiphp: Slot [33] registered Mar 17 17:53:02.756607 kernel: acpiphp: Slot [34] registered Mar 17 17:53:02.756613 kernel: acpiphp: Slot [35] registered Mar 17 17:53:02.756619 kernel: acpiphp: Slot [36] registered Mar 17 17:53:02.756625 kernel: acpiphp: Slot [37] registered Mar 17 17:53:02.756631 kernel: acpiphp: Slot [38] registered Mar 17 17:53:02.756637 kernel: acpiphp: Slot [39] registered Mar 17 17:53:02.756643 kernel: acpiphp: Slot [40] registered Mar 17 17:53:02.756650 kernel: acpiphp: Slot [41] registered Mar 17 17:53:02.756656 kernel: acpiphp: Slot [42] registered Mar 17 17:53:02.756661 kernel: acpiphp: Slot [43] registered Mar 17 17:53:02.756667 kernel: acpiphp: Slot [44] registered Mar 17 17:53:02.756673 kernel: acpiphp: Slot [45] registered Mar 17 17:53:02.756679 kernel: acpiphp: Slot [46] registered Mar 17 17:53:02.756691 kernel: acpiphp: Slot [47] registered Mar 17 17:53:02.756700 kernel: acpiphp: Slot [48] registered Mar 17 17:53:02.756706 kernel: acpiphp: Slot [49] registered Mar 17 17:53:02.756718 kernel: acpiphp: Slot [50] registered Mar 17 17:53:02.756726 kernel: acpiphp: Slot [51] registered Mar 17 17:53:02.756732 kernel: acpiphp: Slot [52] registered Mar 17 17:53:02.756738 kernel: acpiphp: Slot [53] registered 
Mar 17 17:53:02.756743 kernel: acpiphp: Slot [54] registered Mar 17 17:53:02.756749 kernel: acpiphp: Slot [55] registered Mar 17 17:53:02.756755 kernel: acpiphp: Slot [56] registered Mar 17 17:53:02.756761 kernel: acpiphp: Slot [57] registered Mar 17 17:53:02.756767 kernel: acpiphp: Slot [58] registered Mar 17 17:53:02.756772 kernel: acpiphp: Slot [59] registered Mar 17 17:53:02.756779 kernel: acpiphp: Slot [60] registered Mar 17 17:53:02.756789 kernel: acpiphp: Slot [61] registered Mar 17 17:53:02.756797 kernel: acpiphp: Slot [62] registered Mar 17 17:53:02.756807 kernel: acpiphp: Slot [63] registered Mar 17 17:53:02.756863 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Mar 17 17:53:02.756914 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Mar 17 17:53:02.756964 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Mar 17 17:53:02.757014 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Mar 17 17:53:02.757063 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Mar 17 17:53:02.757120 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Mar 17 17:53:02.757170 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Mar 17 17:53:02.757219 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Mar 17 17:53:02.757268 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Mar 17 17:53:02.757325 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Mar 17 17:53:02.757402 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Mar 17 17:53:02.757456 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Mar 17 17:53:02.757507 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Mar 17 17:53:02.757557 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Mar 17 
17:53:02.757607 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Mar 17 17:53:02.757657 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Mar 17 17:53:02.757706 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Mar 17 17:53:02.757755 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Mar 17 17:53:02.757805 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Mar 17 17:53:02.757858 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Mar 17 17:53:02.757907 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Mar 17 17:53:02.757957 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Mar 17 17:53:02.758007 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Mar 17 17:53:02.758056 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Mar 17 17:53:02.758107 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Mar 17 17:53:02.758156 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Mar 17 17:53:02.758207 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Mar 17 17:53:02.758266 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Mar 17 17:53:02.758317 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Mar 17 17:53:02.758383 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Mar 17 17:53:02.758434 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Mar 17 17:53:02.758482 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Mar 17 17:53:02.758535 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Mar 17 17:53:02.758584 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Mar 17 17:53:02.758633 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Mar 17 17:53:02.758683 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Mar 17 17:53:02.758732 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Mar 17 17:53:02.758781 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Mar 17 17:53:02.758830 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Mar 17 17:53:02.758883 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Mar 17 17:53:02.758932 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Mar 17 17:53:02.758988 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Mar 17 17:53:02.759040 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Mar 17 17:53:02.759091 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Mar 17 17:53:02.759141 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Mar 17 17:53:02.759198 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Mar 17 17:53:02.759257 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Mar 17 17:53:02.759307 kernel: pci 0000:0b:00.0: supports D1 D2 Mar 17 17:53:02.760972 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 17 17:53:02.761038 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Mar 17 17:53:02.761095 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Mar 17 17:53:02.761149 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Mar 17 17:53:02.761201 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Mar 17 17:53:02.761252 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Mar 17 17:53:02.761308 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Mar 17 17:53:02.761370 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Mar 17 17:53:02.761421 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Mar 17 17:53:02.762664 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Mar 17 17:53:02.762721 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Mar 17 17:53:02.762773 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Mar 17 17:53:02.762824 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Mar 17 17:53:02.762876 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Mar 17 17:53:02.762930 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Mar 17 17:53:02.762980 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Mar 17 17:53:02.763031 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Mar 17 17:53:02.763081 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Mar 17 17:53:02.763131 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Mar 17 17:53:02.763182 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Mar 17 17:53:02.763233 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Mar 17 17:53:02.763283 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Mar 17 17:53:02.763343 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Mar 17 17:53:02.763397 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Mar 17 17:53:02.763448 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Mar 17 17:53:02.763500 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Mar 17 17:53:02.763551 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Mar 17 17:53:02.763601 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Mar 17 17:53:02.763653 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Mar 17 17:53:02.763704 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Mar 17 17:53:02.763758 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Mar 17 17:53:02.763809 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Mar 17 17:53:02.763862 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Mar 17 17:53:02.763912 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Mar 17 17:53:02.763962 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Mar 17 17:53:02.764012 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Mar 17 17:53:02.764065 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Mar 17 17:53:02.764119 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Mar 17 17:53:02.764169 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Mar 17 17:53:02.764219 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Mar 17 17:53:02.764271 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Mar 17 17:53:02.764321 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Mar 17 17:53:02.764671 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Mar 17 17:53:02.764727 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Mar 17 17:53:02.764778 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Mar 17 17:53:02.764832 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Mar 17 17:53:02.764884 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Mar 17 17:53:02.764934 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Mar 17 17:53:02.764984 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Mar 17 17:53:02.765036 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Mar 17 17:53:02.765087 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Mar 17 17:53:02.765137 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Mar 17 17:53:02.765188 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Mar 17 17:53:02.765242 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Mar 17 17:53:02.765292 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Mar 17 17:53:02.765581 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Mar 17 17:53:02.765646 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Mar 17 17:53:02.765699 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Mar 17 17:53:02.765751 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Mar 17 17:53:02.765805 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Mar 17 17:53:02.765856 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Mar 17 17:53:02.765910 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Mar 17 17:53:02.765962 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Mar 17 17:53:02.766013 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Mar 17 17:53:02.766066 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Mar 17 17:53:02.766116 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Mar 17 17:53:02.766168 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Mar 17 17:53:02.766218 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Mar 17 17:53:02.766272 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Mar 17 17:53:02.766324 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Mar 17 
17:53:02.768567 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Mar 17 17:53:02.768624 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Mar 17 17:53:02.768678 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Mar 17 17:53:02.768730 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Mar 17 17:53:02.768781 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Mar 17 17:53:02.768833 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Mar 17 17:53:02.768884 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Mar 17 17:53:02.768939 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Mar 17 17:53:02.768991 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Mar 17 17:53:02.769041 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Mar 17 17:53:02.769092 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Mar 17 17:53:02.769101 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Mar 17 17:53:02.769107 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Mar 17 17:53:02.769114 kernel: ACPI: PCI: Interrupt link LNKB disabled
Mar 17 17:53:02.769120 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 17 17:53:02.769128 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Mar 17 17:53:02.769134 kernel: iommu: Default domain type: Translated
Mar 17 17:53:02.769140 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 17 17:53:02.769146 kernel: PCI: Using ACPI for IRQ routing
Mar 17 17:53:02.769152 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 17 17:53:02.769158 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Mar 17 17:53:02.769164 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Mar 17 17:53:02.769215 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Mar 17 17:53:02.769267 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Mar 17 17:53:02.769320 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 17 17:53:02.769329 kernel: vgaarb: loaded
Mar 17 17:53:02.769336 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Mar 17 17:53:02.769349 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Mar 17 17:53:02.769355 kernel: clocksource: Switched to clocksource tsc-early
Mar 17 17:53:02.769361 kernel: VFS: Disk quotas dquot_6.6.0
Mar 17 17:53:02.769367 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 17 17:53:02.769373 kernel: pnp: PnP ACPI init
Mar 17 17:53:02.769428 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Mar 17 17:53:02.769480 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Mar 17 17:53:02.769527 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Mar 17 17:53:02.769578 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Mar 17 17:53:02.769628 kernel: pnp 00:06: [dma 2]
Mar 17 17:53:02.769679 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Mar 17 17:53:02.769726 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Mar 17 17:53:02.769775 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Mar 17 17:53:02.769784 kernel: pnp: PnP ACPI: found 8 devices
Mar 17 17:53:02.769790 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 17 17:53:02.769796 kernel: NET: Registered PF_INET protocol family
Mar 17 17:53:02.769802 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 17 17:53:02.769808 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 17 17:53:02.769814 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 17 17:53:02.769820 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 17 17:53:02.769828 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 17 17:53:02.769834 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 17 17:53:02.769840 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 17 17:53:02.769846 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 17 17:53:02.769852 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 17 17:53:02.769858 kernel: NET: Registered PF_XDP protocol family
Mar 17 17:53:02.769910 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Mar 17 17:53:02.769964 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 17 17:53:02.770021 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 17 17:53:02.770075 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 17 17:53:02.770127 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 17 17:53:02.770180 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Mar 17 17:53:02.770234 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Mar 17 17:53:02.770288 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Mar 17 17:53:02.770425 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Mar 17 17:53:02.770482 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Mar 17 17:53:02.770536 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Mar 17 17:53:02.770596 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Mar 17 17:53:02.770650 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Mar 17 17:53:02.770714 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Mar 17 17:53:02.770777 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Mar 17 17:53:02.770829 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Mar 17 17:53:02.770881 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Mar 17 17:53:02.770933 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Mar 17 17:53:02.770985 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Mar 17 17:53:02.771037 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Mar 17 17:53:02.771091 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Mar 17 17:53:02.771142 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Mar 17 17:53:02.771194 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Mar 17 17:53:02.771246 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref]
Mar 17 17:53:02.771297 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref]
Mar 17 17:53:02.771370 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.771426 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.771477 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.771529 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.771582 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.771633 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.771684 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.771736 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.771787 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.771841 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.771893 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.771943 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.771995 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.772047 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.772099 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.772150 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.772202 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.772257 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.772308 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.772560 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.772615 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.772666 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.772717 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.772768 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.772818 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.772873 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.772923 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.772973 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773025 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773076 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773127 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773177 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773228 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773281 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773332 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773397 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773447 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773498 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773550 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773601 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773652 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773705 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773756 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773807 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773859 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.773910 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.773960 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774011 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774061 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774112 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774163 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774216 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774267 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774324 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774433 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774485 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774537 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774587 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774637 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774687 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774740 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774790 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774841 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774891 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.774941 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.774992 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775043 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775095 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775145 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775196 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775251 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775301 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775365 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775417 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775468 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775517 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775569 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775619 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775669 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775723 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775774 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775824 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775875 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Mar 17 17:53:02.775925 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Mar 17 17:53:02.775976 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Mar 17 17:53:02.776028 kernel: pci 0000:00:11.0: PCI bridge to [bus 02]
Mar 17 17:53:02.776079 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Mar 17 17:53:02.776129 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Mar 17 17:53:02.776179 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Mar 17 17:53:02.776237 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref]
Mar 17 17:53:02.776289 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Mar 17 17:53:02.776346 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Mar 17 17:53:02.776398 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Mar 17 17:53:02.776449 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]
Mar 17 17:53:02.776500 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Mar 17 17:53:02.776552 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Mar 17 17:53:02.776603 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Mar 17 17:53:02.776657 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Mar 17 17:53:02.776710 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Mar 17 17:53:02.776761 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Mar 17 17:53:02.776812 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Mar 17 17:53:02.776863 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Mar 17 17:53:02.776914 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Mar 17 17:53:02.776964 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Mar 17 17:53:02.777015 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Mar 17 17:53:02.777066 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Mar 17 17:53:02.777119 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Mar 17 17:53:02.777169 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Mar 17 17:53:02.777223 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Mar 17 17:53:02.777274 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Mar 17 17:53:02.777324 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Mar 17 17:53:02.777390 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Mar 17 17:53:02.777445 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Mar 17 17:53:02.777496 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Mar 17 17:53:02.777546 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Mar 17 17:53:02.777596 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Mar 17 17:53:02.777647 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Mar 17 17:53:02.777701 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref]
Mar 17 17:53:02.777753 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Mar 17 17:53:02.777804 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Mar 17 17:53:02.777855 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Mar 17 17:53:02.777908 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]
Mar 17 17:53:02.777959 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Mar 17 17:53:02.778010 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Mar 17 17:53:02.778060 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Mar 17 17:53:02.778110 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Mar 17 17:53:02.778162 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Mar 17 17:53:02.778213 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Mar 17 17:53:02.778263 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Mar 17 17:53:02.778318 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Mar 17 17:53:02.778498 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Mar 17 17:53:02.778549 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Mar 17 17:53:02.778598 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Mar 17 17:53:02.778649 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Mar 17 17:53:02.778699 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Mar 17 17:53:02.778748 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Mar 17 17:53:02.778799 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Mar 17 17:53:02.778849 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Mar 17 17:53:02.778898 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Mar 17 17:53:02.778949 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Mar 17 17:53:02.779001 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Mar 17 17:53:02.779051 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Mar 17 17:53:02.779102 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Mar 17 17:53:02.779151 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Mar 17 17:53:02.779201 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Mar 17 17:53:02.779254 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Mar 17 17:53:02.779304 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Mar 17 17:53:02.779363 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Mar 17 17:53:02.779413 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Mar 17 17:53:02.779467 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Mar 17 17:53:02.779517 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Mar 17 17:53:02.779568 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Mar 17 17:53:02.779619 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Mar 17 17:53:02.779670 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Mar 17 17:53:02.779721 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Mar 17 17:53:02.779772 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Mar 17 17:53:02.779822 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Mar 17 17:53:02.779873 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Mar 17 17:53:02.779923 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Mar 17 17:53:02.779976 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Mar 17 17:53:02.780026 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Mar 17 17:53:02.780076 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Mar 17 17:53:02.780126 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Mar 17 17:53:02.780176 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Mar 17 17:53:02.780226 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Mar 17 17:53:02.780276 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Mar 17 17:53:02.780326 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Mar 17 17:53:02.780423 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Mar 17 17:53:02.780478 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Mar 17 17:53:02.780529 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Mar 17 17:53:02.780579 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Mar 17 17:53:02.780630 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Mar 17 17:53:02.780681 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Mar 17 17:53:02.780732 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Mar 17 17:53:02.780782 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Mar 17 17:53:02.780831 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Mar 17 17:53:02.780883 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Mar 17 17:53:02.780933 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Mar 17 17:53:02.780985 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Mar 17 17:53:02.781036 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Mar 17 17:53:02.781086 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Mar 17 17:53:02.781137 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Mar 17 17:53:02.781188 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Mar 17 17:53:02.781239 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Mar 17 17:53:02.781289 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Mar 17 17:53:02.781354 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Mar 17 17:53:02.781410 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Mar 17 17:53:02.781465 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Mar 17 17:53:02.781515 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Mar 17 17:53:02.781566 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Mar 17 17:53:02.781616 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Mar 17 17:53:02.781674 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Mar 17 17:53:02.781726 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Mar 17 17:53:02.781777 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Mar 17 17:53:02.781828 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Mar 17 17:53:02.781879 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Mar 17 17:53:02.781929 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Mar 17 17:53:02.781984 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Mar 17 17:53:02.782036 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window]
Mar 17 17:53:02.782082 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window]
Mar 17 17:53:02.782127 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window]
Mar 17 17:53:02.782172 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window]
Mar 17 17:53:02.782217 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window]
Mar 17 17:53:02.782267 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff]
Mar 17 17:53:02.782317 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff]
Mar 17 17:53:02.782379 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref]
Mar 17 17:53:02.782428 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window]
Mar 17 17:53:02.782475 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window]
Mar 17 17:53:02.782522 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window]
Mar 17 17:53:02.782568 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window]
Mar 17 17:53:02.782614 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window]
Mar 17 17:53:02.782665 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff]
Mar 17 17:53:02.782716 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff]
Mar 17 17:53:02.782763 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref]
Mar 17 17:53:02.782814 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff]
Mar 17 17:53:02.782861 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff]
Mar 17 17:53:02.782908 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref]
Mar 17 17:53:02.782958 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff]
Mar 17 17:53:02.783005 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff]
Mar 17 17:53:02.783055 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref]
Mar 17 17:53:02.783105 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff]
Mar 17 17:53:02.783153 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref]
Mar 17 17:53:02.783205 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff]
Mar 17 17:53:02.783252 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref]
Mar 17 17:53:02.783302 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff]
Mar 17 17:53:02.783411 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref]
Mar 17 17:53:02.783463 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff]
Mar 17 17:53:02.783510 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref]
Mar 17 17:53:02.783563 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff]
Mar 17 17:53:02.783621 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref]
Mar 17 17:53:02.783676 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff]
Mar 17 17:53:02.783725 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff]
Mar 17 17:53:02.783772 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref]
Mar 17 17:53:02.783847 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff]
Mar 17 17:53:02.783895 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff]
Mar 17 17:53:02.783942 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref]
Mar 17 17:53:02.783993 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff]
Mar 17 17:53:02.784043 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff]
Mar 17 17:53:02.784093 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref]
Mar 17 17:53:02.784144 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff]
Mar 17 17:53:02.784192 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref]
Mar 17 17:53:02.784244 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff]
Mar 17 17:53:02.784292 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref]
Mar 17 17:53:02.784435 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff]
Mar 17 17:53:02.784489 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref]
Mar 17 17:53:02.784541 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff]
Mar 17 17:53:02.784588 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref]
Mar 17 17:53:02.784639 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff]
Mar 17 17:53:02.784686 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref]
Mar 17 17:53:02.784738 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff]
Mar 17 17:53:02.784789 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff]
Mar 17 17:53:02.784836 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref]
Mar 17 17:53:02.784904 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff]
Mar 17 17:53:02.784954 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff]
Mar 17 17:53:02.785001 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref]
Mar 17 17:53:02.785053 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff]
Mar 17 17:53:02.785101 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff]
Mar 17 17:53:02.785151 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref]
Mar 17 17:53:02.785202 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff]
Mar 17 17:53:02.785251 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref]
Mar 17 17:53:02.785303 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff]
Mar 17 17:53:02.785371 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref]
Mar 17 17:53:02.785425 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff]
Mar 17 17:53:02.785475 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref]
Mar 17 17:53:02.785526 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff]
Mar 17 17:53:02.785574 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref]
Mar 17 17:53:02.785625 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff]
Mar 17 17:53:02.785672 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref]
Mar 17 17:53:02.785728 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff]
Mar 17 17:53:02.785778 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff]
Mar 17 17:53:02.785824 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref]
Mar 17 17:53:02.785875 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff]
Mar 17 17:53:02.785922 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff]
Mar 17 17:53:02.785969 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref]
Mar 17 17:53:02.786043 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff]
Mar 17 17:53:02.786095 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref]
Mar 17 17:53:02.786147 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff]
Mar 17 17:53:02.786195 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref]
Mar 17 17:53:02.786246 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff]
Mar 17 17:53:02.786294 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref]
Mar 17 17:53:02.786636 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff]
Mar 17 17:53:02.786694 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref]
Mar 17 17:53:02.786751 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff]
Mar 17 17:53:02.786800 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref]
Mar 17 17:53:02.786851 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff]
Mar 17 17:53:02.786897 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref]
Mar 17 17:53:02.786954 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Mar 17 17:53:02.786965 kernel: PCI: CLS 32 bytes, default 64
Mar 17 17:53:02.786974 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Mar 17 17:53:02.786980 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Mar 17 17:53:02.786987 kernel: clocksource: Switched to clocksource tsc
Mar 17 17:53:02.786993 kernel: Initialise system trusted keyrings
Mar 17 17:53:02.787000 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Mar 17 17:53:02.787006 kernel: Key type asymmetric registered
Mar 17 17:53:02.787012 kernel: Asymmetric key parser 'x509' registered
Mar 17 17:53:02.787018 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 17 17:53:02.787025 kernel: io scheduler mq-deadline registered
Mar 17 17:53:02.787032 kernel: io scheduler kyber registered
Mar 17 17:53:02.787039 kernel: io scheduler bfq registered
Mar 17 17:53:02.787092 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24
Mar 17 17:53:02.787146 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.787199 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25
Mar 17 17:53:02.787251 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.787303 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26
Mar 17 17:53:02.787368 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.787426 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27
Mar 17 17:53:02.787480 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.787534 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28
Mar 17 17:53:02.787585 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.787644 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29
Mar 17 17:53:02.787700 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.787752 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30
Mar 17 17:53:02.787804 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.787859 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31
Mar 17 17:53:02.787911 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.787970 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32
Mar 17 17:53:02.788025 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.788078 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33
Mar 17 17:53:02.788130 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.788415 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34
Mar 17 17:53:02.788472 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.788526 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35
Mar 17 17:53:02.788579 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.788635 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36
Mar 17 17:53:02.788687 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.788739 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37
Mar 17 17:53:02.788791 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.788844 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38
Mar 17 17:53:02.788899 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.788950 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39
Mar 17 17:53:02.789003 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.789056 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40
Mar 17 17:53:02.789107 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.789160 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41
Mar 17 17:53:02.789214 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.789267 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42
Mar 17 17:53:02.789318 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.789377 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43
Mar 17 17:53:02.789430 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.789484 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44
Mar 17 17:53:02.789539 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.789591 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45
Mar 17 17:53:02.789643 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.789696 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46
Mar 17 17:53:02.790093 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.790154 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47
Mar 17 17:53:02.790208 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.790261 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48
Mar 17 17:53:02.790314 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.790416 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49
Mar 17 17:53:02.790470 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.790523 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50
Mar 17 17:53:02.790578 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.790631 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51
Mar 17 17:53:02.790685 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.790737 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52
Mar 17 17:53:02.790788 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.790841 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53
Mar 17 17:53:02.790892 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.790944 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54
Mar 17 17:53:02.790994 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.791045 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55
Mar 17 17:53:02.791099 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Mar 17 17:53:02.791109 kernel: ioatdma: Intel(R) QuickData Technology Driver
5.00 Mar 17 17:53:02.791116 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 17:53:02.791122 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 17 17:53:02.791129 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Mar 17 17:53:02.791135 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 17 17:53:02.791141 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 17 17:53:02.791192 kernel: rtc_cmos 00:01: registered as rtc0 Mar 17 17:53:02.791244 kernel: rtc_cmos 00:01: setting system clock to 2025-03-17T17:53:02 UTC (1742233982) Mar 17 17:53:02.791291 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Mar 17 17:53:02.791300 kernel: intel_pstate: CPU model not supported Mar 17 17:53:02.791306 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 17 17:53:02.791313 kernel: NET: Registered PF_INET6 protocol family Mar 17 17:53:02.791319 kernel: Segment Routing with IPv6 Mar 17 17:53:02.791325 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 17:53:02.791332 kernel: NET: Registered PF_PACKET protocol family Mar 17 17:53:02.791345 kernel: Key type dns_resolver registered Mar 17 17:53:02.791354 kernel: IPI shorthand broadcast: enabled Mar 17 17:53:02.791361 kernel: sched_clock: Marking stable (883114441, 223971283)->(1165808046, -58722322) Mar 17 17:53:02.791367 kernel: registered taskstats version 1 Mar 17 17:53:02.791373 kernel: Loading compiled-in X.509 certificates Mar 17 17:53:02.791380 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 608fb88224bc0ea76afefc598557abb0413f36c0' Mar 17 17:53:02.791386 kernel: Key type .fscrypt registered Mar 17 17:53:02.791393 kernel: Key type fscrypt-provisioning registered Mar 17 17:53:02.791399 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 17:53:02.791407 kernel: ima: Allocated hash algorithm: sha1 Mar 17 17:53:02.791413 kernel: ima: No architecture policies found Mar 17 17:53:02.791419 kernel: clk: Disabling unused clocks Mar 17 17:53:02.791426 kernel: Freeing unused kernel image (initmem) memory: 42992K Mar 17 17:53:02.791432 kernel: Write protecting the kernel read-only data: 36864k Mar 17 17:53:02.791438 kernel: Freeing unused kernel image (rodata/data gap) memory: 1832K Mar 17 17:53:02.791445 kernel: Run /init as init process Mar 17 17:53:02.791451 kernel: with arguments: Mar 17 17:53:02.791458 kernel: /init Mar 17 17:53:02.791464 kernel: with environment: Mar 17 17:53:02.791472 kernel: HOME=/ Mar 17 17:53:02.791477 kernel: TERM=linux Mar 17 17:53:02.791484 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 17:53:02.791491 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 17 17:53:02.791499 systemd[1]: Detected virtualization vmware. Mar 17 17:53:02.791506 systemd[1]: Detected architecture x86-64. Mar 17 17:53:02.791513 systemd[1]: Running in initrd. Mar 17 17:53:02.791521 systemd[1]: No hostname configured, using default hostname. Mar 17 17:53:02.791527 systemd[1]: Hostname set to . Mar 17 17:53:02.791534 systemd[1]: Initializing machine ID from random generator. Mar 17 17:53:02.791540 systemd[1]: Queued start job for default target initrd.target. Mar 17 17:53:02.791547 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:53:02.791553 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Mar 17 17:53:02.791560 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 17 17:53:02.791567 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 17:53:02.791575 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 17 17:53:02.791582 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 17 17:53:02.791589 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 17 17:53:02.791596 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 17 17:53:02.791602 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:53:02.791609 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:53:02.791615 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:53:02.791623 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:53:02.791629 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:53:02.791636 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:53:02.791643 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:53:02.791650 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:53:02.791656 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 17 17:53:02.791666 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 17 17:53:02.791673 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:53:02.791679 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:53:02.791687 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 17 17:53:02.791694 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:53:02.791700 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 17 17:53:02.791707 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:53:02.791714 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 17 17:53:02.791720 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 17:53:02.791727 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:53:02.791733 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:53:02.791741 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:53:02.791750 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 17 17:53:02.791768 systemd-journald[215]: Collecting audit messages is disabled. Mar 17 17:53:02.791784 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:53:02.791793 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 17:53:02.791800 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:53:02.791806 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 17:53:02.791813 kernel: Bridge firewalling registered Mar 17 17:53:02.791820 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:53:02.791837 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:53:02.791845 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:53:02.791852 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:53:02.791858 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Mar 17 17:53:02.791865 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:53:02.791872 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:53:02.791880 systemd-journald[215]: Journal started Mar 17 17:53:02.791897 systemd-journald[215]: Runtime Journal (/run/log/journal/5287df66223f4f369c27c9e56f451b31) is 4.8M, max 38.7M, 33.8M free. Mar 17 17:53:02.738719 systemd-modules-load[216]: Inserted module 'overlay' Mar 17 17:53:02.794496 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:53:02.756813 systemd-modules-load[216]: Inserted module 'br_netfilter' Mar 17 17:53:02.794741 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:53:02.800487 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 17 17:53:02.801077 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:53:02.802371 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:53:02.807516 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:53:02.810348 dracut-cmdline[246]: dracut-dracut-053 Mar 17 17:53:02.810348 dracut-cmdline[246]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=d4b838cd9a6f58e8c4a6b615c32b0b28ee0df1660e34033a8fbd0429c6de5fd0 Mar 17 17:53:02.813438 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Mar 17 17:53:02.830446 systemd-resolved[260]: Positive Trust Anchors: Mar 17 17:53:02.830626 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:53:02.830649 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:53:02.832266 systemd-resolved[260]: Defaulting to hostname 'linux'. Mar 17 17:53:02.833687 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:53:02.833853 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:53:02.856354 kernel: SCSI subsystem initialized Mar 17 17:53:02.862361 kernel: Loading iSCSI transport class v2.0-870. Mar 17 17:53:02.869356 kernel: iscsi: registered transport (tcp) Mar 17 17:53:02.882375 kernel: iscsi: registered transport (qla4xxx) Mar 17 17:53:02.882411 kernel: QLogic iSCSI HBA Driver Mar 17 17:53:02.902393 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 17 17:53:02.906441 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 17 17:53:02.922085 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 17 17:53:02.922132 kernel: device-mapper: uevent: version 1.0.3 Mar 17 17:53:02.922142 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 17 17:53:02.954372 kernel: raid6: avx2x4 gen() 51886 MB/s Mar 17 17:53:02.970359 kernel: raid6: avx2x2 gen() 52823 MB/s Mar 17 17:53:02.987564 kernel: raid6: avx2x1 gen() 44318 MB/s Mar 17 17:53:02.987613 kernel: raid6: using algorithm avx2x2 gen() 52823 MB/s Mar 17 17:53:03.005553 kernel: raid6: .... xor() 30920 MB/s, rmw enabled Mar 17 17:53:03.005597 kernel: raid6: using avx2x2 recovery algorithm Mar 17 17:53:03.019355 kernel: xor: automatically using best checksumming function avx Mar 17 17:53:03.120355 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 17 17:53:03.126348 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:53:03.130448 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:53:03.139002 systemd-udevd[436]: Using default interface naming scheme 'v255'. Mar 17 17:53:03.141528 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:53:03.144503 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 17 17:53:03.154470 dracut-pre-trigger[437]: rd.md=0: removing MD RAID activation Mar 17 17:53:03.171740 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 17 17:53:03.176440 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 17:53:03.248304 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:53:03.256448 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 17 17:53:03.265907 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 17 17:53:03.266327 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Mar 17 17:53:03.266725 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:53:03.267011 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 17:53:03.272459 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 17 17:53:03.278466 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 17 17:53:03.312373 kernel: VMware PVSCSI driver - version 1.0.7.0-k Mar 17 17:53:03.324353 kernel: vmw_pvscsi: using 64bit dma Mar 17 17:53:03.324383 kernel: vmw_pvscsi: max_id: 16 Mar 17 17:53:03.325348 kernel: vmw_pvscsi: setting ring_pages to 8 Mar 17 17:53:03.327348 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Mar 17 17:53:03.331367 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Mar 17 17:53:03.337385 kernel: cryptd: max_cpu_qlen set to 1000 Mar 17 17:53:03.337396 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Mar 17 17:53:03.341460 kernel: vmw_pvscsi: enabling reqCallThreshold Mar 17 17:53:03.341479 kernel: vmw_pvscsi: driver-based request coalescing enabled Mar 17 17:53:03.341487 kernel: vmw_pvscsi: using MSI-X Mar 17 17:53:03.341495 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Mar 17 17:53:03.347351 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Mar 17 17:53:03.347581 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:53:03.348911 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:53:03.351376 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Mar 17 17:53:03.351485 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Mar 17 17:53:03.351559 kernel: AVX2 version of gcm_enc/dec engaged. Mar 17 17:53:03.351739 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Mar 17 17:53:03.351843 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:53:03.351871 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:53:03.351982 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:53:03.354534 kernel: AES CTR mode by8 optimization enabled Mar 17 17:53:03.358745 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:53:03.374608 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:53:03.375954 kernel: libata version 3.00 loaded. Mar 17 17:53:03.378472 kernel: ata_piix 0000:00:07.1: version 2.13 Mar 17 17:53:03.382958 kernel: scsi host1: ata_piix Mar 17 17:53:03.383039 kernel: scsi host2: ata_piix Mar 17 17:53:03.383101 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Mar 17 17:53:03.383110 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Mar 17 17:53:03.381941 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:53:03.387931 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Mar 17 17:53:03.409556 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 17 17:53:03.409634 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Mar 17 17:53:03.409696 kernel: sd 0:0:0:0: [sda] Cache data unavailable Mar 17 17:53:03.409756 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Mar 17 17:53:03.409822 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 17:53:03.409832 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 17 17:53:03.393306 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Mar 17 17:53:03.550361 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Mar 17 17:53:03.556351 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Mar 17 17:53:03.575479 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Mar 17 17:53:03.593987 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 17 17:53:03.594005 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (491) Mar 17 17:53:03.594013 kernel: BTRFS: device fsid 2b8ebefd-e897-48f6-96d5-0893fbb7c64a devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (492) Mar 17 17:53:03.594021 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Mar 17 17:53:03.584503 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Mar 17 17:53:03.588002 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Mar 17 17:53:03.593634 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Mar 17 17:53:03.596711 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Mar 17 17:53:03.597006 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Mar 17 17:53:03.601483 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 17 17:53:03.623382 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 17:53:04.679376 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 17:53:04.679621 disk-uuid[594]: The operation has completed successfully. Mar 17 17:53:04.750584 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 17:53:04.750888 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 17 17:53:04.755454 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Mar 17 17:53:04.757312 sh[611]: Success Mar 17 17:53:04.766355 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Mar 17 17:53:04.864644 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 17 17:53:04.865550 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 17 17:53:04.865842 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 17 17:53:04.927789 kernel: BTRFS info (device dm-0): first mount of filesystem 2b8ebefd-e897-48f6-96d5-0893fbb7c64a Mar 17 17:53:04.927823 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 17 17:53:04.927832 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 17 17:53:04.928887 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 17 17:53:04.929706 kernel: BTRFS info (device dm-0): using free space tree Mar 17 17:53:04.951369 kernel: BTRFS info (device dm-0): enabling ssd optimizations Mar 17 17:53:04.954472 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 17 17:53:04.958488 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Mar 17 17:53:04.960044 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 17 17:53:05.033549 kernel: BTRFS info (device sda6): first mount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64 Mar 17 17:53:05.033593 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 17:53:05.033616 kernel: BTRFS info (device sda6): using free space tree Mar 17 17:53:05.039687 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 17 17:53:05.045984 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 17 17:53:05.048354 kernel: BTRFS info (device sda6): last unmount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64 Mar 17 17:53:05.051968 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Mar 17 17:53:05.058478 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 17 17:53:05.086319 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Mar 17 17:53:05.092482 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 17 17:53:05.153450 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:53:05.157469 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 17 17:53:05.168939 systemd-networkd[802]: lo: Link UP Mar 17 17:53:05.168943 systemd-networkd[802]: lo: Gained carrier Mar 17 17:53:05.169599 systemd-networkd[802]: Enumeration completed Mar 17 17:53:05.169772 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 17 17:53:05.169850 systemd-networkd[802]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Mar 17 17:53:05.169945 systemd[1]: Reached target network.target - Network. 
Mar 17 17:53:05.172824 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Mar 17 17:53:05.172953 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Mar 17 17:53:05.173058 systemd-networkd[802]: ens192: Link UP Mar 17 17:53:05.173062 systemd-networkd[802]: ens192: Gained carrier Mar 17 17:53:05.201164 ignition[671]: Ignition 2.20.0 Mar 17 17:53:05.201434 ignition[671]: Stage: fetch-offline Mar 17 17:53:05.201456 ignition[671]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:53:05.201461 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Mar 17 17:53:05.201516 ignition[671]: parsed url from cmdline: "" Mar 17 17:53:05.201518 ignition[671]: no config URL provided Mar 17 17:53:05.201521 ignition[671]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 17:53:05.201525 ignition[671]: no config at "/usr/lib/ignition/user.ign" Mar 17 17:53:05.201886 ignition[671]: config successfully fetched Mar 17 17:53:05.201903 ignition[671]: parsing config with SHA512: 97fcfb50f231721e06a511a1ef981e1eecc5efd0c6f1d1f5d04e157dbd255ce065d55825885a724263dae8204fc6d0afa66d1ed545935fb2395fe3aff242e97f Mar 17 17:53:05.204269 unknown[671]: fetched base config from "system" Mar 17 17:53:05.204275 unknown[671]: fetched user config from "vmware" Mar 17 17:53:05.204497 ignition[671]: fetch-offline: fetch-offline passed Mar 17 17:53:05.204539 ignition[671]: Ignition finished successfully Mar 17 17:53:05.205144 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 17 17:53:05.205562 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 17 17:53:05.208536 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Mar 17 17:53:05.216383 ignition[810]: Ignition 2.20.0 Mar 17 17:53:05.216392 ignition[810]: Stage: kargs Mar 17 17:53:05.216501 ignition[810]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:53:05.216507 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Mar 17 17:53:05.217010 ignition[810]: kargs: kargs passed Mar 17 17:53:05.217036 ignition[810]: Ignition finished successfully Mar 17 17:53:05.218345 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 17 17:53:05.221452 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 17 17:53:05.228537 ignition[816]: Ignition 2.20.0 Mar 17 17:53:05.228545 ignition[816]: Stage: disks Mar 17 17:53:05.228652 ignition[816]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:53:05.228657 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Mar 17 17:53:05.229177 ignition[816]: disks: disks passed Mar 17 17:53:05.229203 ignition[816]: Ignition finished successfully Mar 17 17:53:05.229929 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 17 17:53:05.230487 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 17 17:53:05.230725 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 17 17:53:05.230830 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 17 17:53:05.230916 systemd[1]: Reached target sysinit.target - System Initialization. Mar 17 17:53:05.231003 systemd[1]: Reached target basic.target - Basic System. Mar 17 17:53:05.234495 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 17 17:53:05.245057 systemd-fsck[824]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 17 17:53:05.246441 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 17 17:53:05.250506 systemd[1]: Mounting sysroot.mount - /sysroot... 
Mar 17 17:53:05.346272 kernel: EXT4-fs (sda9): mounted filesystem 345fc709-8965-4219-b368-16e508c3d632 r/w with ordered data mode. Quota mode: none. Mar 17 17:53:05.345812 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 17 17:53:05.346250 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 17 17:53:05.360467 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:53:05.362021 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 17 17:53:05.362424 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 17 17:53:05.362460 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 17 17:53:05.362478 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 17 17:53:05.366675 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 17 17:53:05.369358 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (832) Mar 17 17:53:05.372533 kernel: BTRFS info (device sda6): first mount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64 Mar 17 17:53:05.372555 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 17:53:05.372574 kernel: BTRFS info (device sda6): using free space tree Mar 17 17:53:05.372682 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 17 17:53:05.376352 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 17 17:53:05.378301 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 17 17:53:05.409363 initrd-setup-root[856]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 17:53:05.417881 initrd-setup-root[863]: cut: /sysroot/etc/group: No such file or directory Mar 17 17:53:05.422303 initrd-setup-root[870]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 17:53:05.424210 initrd-setup-root[877]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 17:53:05.496139 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 17 17:53:05.500473 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 17 17:53:05.502929 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 17 17:53:05.508368 kernel: BTRFS info (device sda6): last unmount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64 Mar 17 17:53:05.522330 ignition[945]: INFO : Ignition 2.20.0 Mar 17 17:53:05.522330 ignition[945]: INFO : Stage: mount Mar 17 17:53:05.523236 ignition[945]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:53:05.523236 ignition[945]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Mar 17 17:53:05.523838 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 17 17:53:05.524544 ignition[945]: INFO : mount: mount passed Mar 17 17:53:05.524544 ignition[945]: INFO : Ignition finished successfully Mar 17 17:53:05.524096 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 17 17:53:05.530463 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 17 17:53:05.922405 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 17 17:53:05.930507 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Mar 17 17:53:06.023368 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (956)
Mar 17 17:53:06.049297 kernel: BTRFS info (device sda6): first mount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64
Mar 17 17:53:06.049329 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 17 17:53:06.049358 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:53:06.107365 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 17 17:53:06.115834 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:53:06.134630 ignition[973]: INFO : Ignition 2.20.0
Mar 17 17:53:06.134630 ignition[973]: INFO : Stage: files
Mar 17 17:53:06.135064 ignition[973]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:53:06.135064 ignition[973]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 17 17:53:06.135480 ignition[973]: DEBUG : files: compiled without relabeling support, skipping
Mar 17 17:53:06.209856 ignition[973]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 17 17:53:06.209856 ignition[973]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 17 17:53:06.231495 systemd-networkd[802]: ens192: Gained IPv6LL
Mar 17 17:53:06.267792 ignition[973]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 17 17:53:06.268173 ignition[973]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 17 17:53:06.268494 unknown[973]: wrote ssh authorized keys file for user: core
Mar 17 17:53:06.268912 ignition[973]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 17 17:53:06.305425 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 17 17:53:06.305425 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 17 17:53:06.376000 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 17 17:53:06.457970 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
Mar 17 17:53:06.946131 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 17 17:53:07.089487 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 17 17:53:07.089487 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Mar 17 17:53:07.089487 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Mar 17 17:53:07.089487 ignition[973]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Mar 17 17:53:07.289204 ignition[973]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:53:07.294727 ignition[973]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:53:07.294727 ignition[973]: INFO : files: files passed
Mar 17 17:53:07.294727 ignition[973]: INFO : Ignition finished successfully
Mar 17 17:53:07.295050 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 17 17:53:07.299507 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 17 17:53:07.301446 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 17 17:53:07.302902 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 17 17:53:07.302976 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 17 17:53:07.311522 initrd-setup-root-after-ignition[1004]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:53:07.311522 initrd-setup-root-after-ignition[1004]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:53:07.312476 initrd-setup-root-after-ignition[1008]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:53:07.313549 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:53:07.314005 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 17 17:53:07.318462 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 17 17:53:07.332315 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 17 17:53:07.332395 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 17 17:53:07.332875 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 17 17:53:07.333020 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 17 17:53:07.333246 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 17 17:53:07.333768 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 17 17:53:07.344020 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:53:07.348450 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 17 17:53:07.355544 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:53:07.355821 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:53:07.356038 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:53:07.356211 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:53:07.356309 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:53:07.356651 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:53:07.356845 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:53:07.357014 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:53:07.357217 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:53:07.357420 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 17 17:53:07.357794 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 17 17:53:07.357982 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:53:07.358154 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 17 17:53:07.358448 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 17 17:53:07.358649 systemd[1]: Stopped target swap.target - Swaps.
Mar 17 17:53:07.358801 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 17 17:53:07.358893 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:53:07.359240 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:53:07.359426 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:53:07.359618 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 17 17:53:07.359690 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:53:07.359942 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 17 17:53:07.360006 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:53:07.360249 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 17 17:53:07.360369 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:53:07.360697 systemd[1]: Stopped target paths.target - Path Units.
Mar 17 17:53:07.360860 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 17 17:53:07.366373 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:53:07.366659 systemd[1]: Stopped target slices.target - Slice Units.
Mar 17 17:53:07.366928 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 17 17:53:07.367135 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 17 17:53:07.367205 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:53:07.367531 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 17 17:53:07.367596 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:53:07.367922 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 17 17:53:07.368018 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:53:07.368252 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 17 17:53:07.368346 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 17 17:53:07.380441 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 17 17:53:07.380540 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 17 17:53:07.380608 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:53:07.382405 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 17 17:53:07.382606 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 17 17:53:07.382693 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:53:07.382898 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 17 17:53:07.382962 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:53:07.386532 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 17 17:53:07.389637 ignition[1028]: INFO : Ignition 2.20.0
Mar 17 17:53:07.390696 ignition[1028]: INFO : Stage: umount
Mar 17 17:53:07.390696 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:53:07.390696 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 17 17:53:07.390696 ignition[1028]: INFO : umount: umount passed
Mar 17 17:53:07.390696 ignition[1028]: INFO : Ignition finished successfully
Mar 17 17:53:07.390555 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 17 17:53:07.392804 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 17 17:53:07.392858 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 17 17:53:07.393499 systemd[1]: Stopped target network.target - Network.
Mar 17 17:53:07.393886 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 17 17:53:07.393917 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 17 17:53:07.394165 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 17 17:53:07.394188 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 17 17:53:07.394482 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 17 17:53:07.394505 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 17 17:53:07.394862 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 17 17:53:07.394885 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 17 17:53:07.395186 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 17 17:53:07.395614 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 17 17:53:07.399064 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 17 17:53:07.399132 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 17 17:53:07.399360 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 17 17:53:07.399381 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:53:07.409536 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 17 17:53:07.409679 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 17 17:53:07.409720 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:53:07.409871 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Mar 17 17:53:07.409894 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Mar 17 17:53:07.410077 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:53:07.414429 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 17 17:53:07.414803 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 17 17:53:07.414883 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 17 17:53:07.416886 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 17 17:53:07.416919 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:53:07.417563 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 17 17:53:07.417589 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:53:07.418609 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 17 17:53:07.418635 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:53:07.419001 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 17 17:53:07.419109 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:53:07.420679 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 17 17:53:07.420723 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:53:07.420870 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 17 17:53:07.420891 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:53:07.421066 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 17 17:53:07.421096 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:53:07.421549 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 17 17:53:07.421581 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:53:07.422178 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:53:07.422204 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:53:07.427480 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 17 17:53:07.427677 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 17 17:53:07.427722 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:53:07.427996 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:53:07.428025 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:53:07.429301 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 17 17:53:07.429497 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 17 17:53:07.434470 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 17 17:53:07.434543 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 17 17:53:07.456909 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 17 17:53:07.456983 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 17 17:53:07.457295 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 17 17:53:07.457434 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 17 17:53:07.457466 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 17 17:53:07.462449 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 17 17:53:07.479430 systemd[1]: Switching root.
Mar 17 17:53:07.508417 systemd-journald[215]: Journal stopped
Mar 17 17:53:02.734310 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 17 16:07:40 -00 2025
Mar 17 17:53:02.734327 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=d4b838cd9a6f58e8c4a6b615c32b0b28ee0df1660e34033a8fbd0429c6de5fd0
Mar 17 17:53:02.734333 kernel: Disabled fast string operations
Mar 17 17:53:02.736668 kernel: BIOS-provided physical RAM map:
Mar 17 17:53:02.736678 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Mar 17 17:53:02.736683 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Mar 17 17:53:02.736690 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Mar 17 17:53:02.736699 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Mar 17 17:53:02.736704 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Mar 17 17:53:02.736708 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Mar 17 17:53:02.736712 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Mar 17 17:53:02.736717 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Mar 17 17:53:02.736721 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Mar 17 17:53:02.736725 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Mar 17 17:53:02.736732 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Mar 17 17:53:02.736737 kernel: NX (Execute Disable) protection: active
Mar 17 17:53:02.736741 kernel: APIC: Static calls initialized
Mar 17 17:53:02.736747 kernel: SMBIOS 2.7 present.
Mar 17 17:53:02.736752 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Mar 17 17:53:02.736756 kernel: vmware: hypercall mode: 0x00
Mar 17 17:53:02.736761 kernel: Hypervisor detected: VMware
Mar 17 17:53:02.736766 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Mar 17 17:53:02.736772 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Mar 17 17:53:02.736777 kernel: vmware: using clock offset of 2738225074 ns
Mar 17 17:53:02.736782 kernel: tsc: Detected 3408.000 MHz processor
Mar 17 17:53:02.736787 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 17 17:53:02.736793 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 17 17:53:02.736797 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Mar 17 17:53:02.736802 kernel: total RAM covered: 3072M
Mar 17 17:53:02.736807 kernel: Found optimal setting for mtrr clean up
Mar 17 17:53:02.736813 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Mar 17 17:53:02.736819 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Mar 17 17:53:02.736824 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 17 17:53:02.736828 kernel: Using GB pages for direct mapping
Mar 17 17:53:02.736833 kernel: ACPI: Early table checksum verification disabled
Mar 17 17:53:02.736838 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Mar 17 17:53:02.736843 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Mar 17 17:53:02.736848 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Mar 17 17:53:02.736853 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Mar 17 17:53:02.736858 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Mar 17 17:53:02.736866 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Mar 17 17:53:02.736871 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Mar 17 17:53:02.736876 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Mar 17 17:53:02.736881 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Mar 17 17:53:02.736886 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Mar 17 17:53:02.736893 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Mar 17 17:53:02.736898 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Mar 17 17:53:02.736903 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Mar 17 17:53:02.736909 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Mar 17 17:53:02.736914 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Mar 17 17:53:02.736919 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Mar 17 17:53:02.736924 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Mar 17 17:53:02.736929 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Mar 17 17:53:02.736934 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Mar 17 17:53:02.736939 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Mar 17 17:53:02.736946 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Mar 17 17:53:02.736951 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Mar 17 17:53:02.736956 kernel: system APIC only can use physical flat
Mar 17 17:53:02.736961 kernel: APIC: Switched APIC routing to: physical flat
Mar 17 17:53:02.736966 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 17 17:53:02.736971 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Mar 17 17:53:02.736976 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Mar 17 17:53:02.736981 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Mar 17 17:53:02.736986 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Mar 17 17:53:02.736992 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Mar 17 17:53:02.736997 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Mar 17 17:53:02.737002 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Mar 17 17:53:02.737007 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Mar 17 17:53:02.737012 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Mar 17 17:53:02.737018 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Mar 17 17:53:02.737022 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Mar 17 17:53:02.737028 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Mar 17 17:53:02.737033 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Mar 17 17:53:02.737038 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Mar 17 17:53:02.737044 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Mar 17 17:53:02.737049 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Mar 17 17:53:02.737054 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Mar 17 17:53:02.737059 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Mar 17 17:53:02.737064 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Mar 17 17:53:02.737069 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Mar 17 17:53:02.737074 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Mar 17 17:53:02.737079 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Mar 17 17:53:02.737084 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Mar 17 17:53:02.737089 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Mar 17 17:53:02.737095 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Mar 17 17:53:02.737100 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Mar 17 17:53:02.737106 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Mar 17 17:53:02.737111 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Mar 17 17:53:02.737116 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Mar 17 17:53:02.737121 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Mar 17 17:53:02.737126 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Mar 17 17:53:02.737131 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Mar 17 17:53:02.737136 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Mar 17 17:53:02.737141 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Mar 17 17:53:02.737147 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Mar 17 17:53:02.737152 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Mar 17 17:53:02.737157 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Mar 17 17:53:02.737162 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Mar 17 17:53:02.737167 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Mar 17 17:53:02.737172 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Mar 17 17:53:02.737177 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Mar 17 17:53:02.737183 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Mar 17 17:53:02.737188 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Mar 17 17:53:02.737193 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Mar 17 17:53:02.737198 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Mar 17 17:53:02.737204 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Mar 17 17:53:02.737209 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Mar 17 17:53:02.737214 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Mar 17 17:53:02.737219 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Mar 17 17:53:02.737224 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Mar 17 17:53:02.737229 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Mar 17 17:53:02.737234 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Mar 17 17:53:02.737239 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Mar 17 17:53:02.737244 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Mar 17 17:53:02.737249 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Mar 17 17:53:02.737255 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Mar 17 17:53:02.737260 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Mar 17 17:53:02.737266 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Mar 17 17:53:02.737275 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Mar 17 17:53:02.737280 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Mar 17 17:53:02.737285 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Mar 17 17:53:02.737291 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Mar 17 17:53:02.737296 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Mar 17 17:53:02.737302 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Mar 17 17:53:02.737308 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Mar 17 17:53:02.737313 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Mar 17 17:53:02.737318 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Mar 17 17:53:02.737324 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Mar 17 17:53:02.737329 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Mar 17 17:53:02.737334 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Mar 17 17:53:02.737348 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Mar 17 17:53:02.737354 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Mar 17 17:53:02.737359 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Mar 17 17:53:02.737365 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Mar 17 17:53:02.737372 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Mar 17 17:53:02.737377 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Mar 17 17:53:02.737383 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Mar 17 17:53:02.737388 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Mar 17 17:53:02.737393 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Mar 17 17:53:02.737399 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Mar 17 17:53:02.737404 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Mar 17 17:53:02.737410 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Mar 17 17:53:02.737415 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Mar 17 17:53:02.737420 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Mar 17 17:53:02.737427 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Mar 17 17:53:02.737432 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Mar 17 17:53:02.737438 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Mar 17 17:53:02.737443 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Mar 17 17:53:02.737448 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Mar 17 17:53:02.737454 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Mar 17 17:53:02.737459 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Mar 17 17:53:02.737464 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Mar 17 17:53:02.737470 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Mar 17 17:53:02.737475 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Mar 17 17:53:02.737481 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Mar 17 17:53:02.737487 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Mar 17 17:53:02.737492 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Mar 17 17:53:02.737497 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Mar 17 17:53:02.737503 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Mar 17 17:53:02.737509 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Mar 17 17:53:02.737514 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Mar 17 17:53:02.737519 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Mar 17 17:53:02.737525 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Mar 17 17:53:02.737530 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Mar 17 17:53:02.737536 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Mar 17 17:53:02.737542 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Mar 17 17:53:02.737547 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Mar 17 17:53:02.737553 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Mar 17 17:53:02.737558 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Mar 17 17:53:02.737563 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Mar 17 17:53:02.737569 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Mar 17 17:53:02.737574 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Mar 17 17:53:02.737579 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Mar 17 17:53:02.737584 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Mar 17 17:53:02.737591 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Mar 17 17:53:02.737596 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Mar 17 17:53:02.737601 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Mar 17 17:53:02.737607 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Mar 17 17:53:02.737612 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Mar 17 17:53:02.737618 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Mar 17 17:53:02.737626 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Mar 17 17:53:02.737632 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Mar 17 17:53:02.737641 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Mar 17 17:53:02.737652 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Mar 17 17:53:02.737662 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Mar 17 17:53:02.737670 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Mar 17 17:53:02.737681 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Mar 17 17:53:02.737687 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 17 17:53:02.737693 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Mar 17 17:53:02.737698 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Mar 17 17:53:02.737704 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Mar 17 17:53:02.737710 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Mar 17 17:53:02.737715 kernel: Zone ranges:
Mar 17 17:53:02.737721 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 17 17:53:02.737728 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Mar 17 17:53:02.737734 kernel: Normal empty
Mar 17 17:53:02.737739 kernel: Movable zone start for each node
Mar 17 17:53:02.737745 kernel: Early memory node ranges
Mar 17 17:53:02.737750 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Mar 17 17:53:02.737756 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Mar 17 17:53:02.737761 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Mar 17 17:53:02.737766 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Mar 17 17:53:02.737772 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 17 17:53:02.737777 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Mar 17 17:53:02.737784 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Mar 17 17:53:02.737790 kernel: ACPI: PM-Timer IO Port: 0x1008
Mar 17 17:53:02.737795 kernel: system APIC only can use physical flat
Mar 17 17:53:02.737800 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Mar 17 17:53:02.737806 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Mar 17 17:53:02.737811 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Mar 17 17:53:02.737817 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Mar 17 17:53:02.737822 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Mar 17 17:53:02.737828 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Mar 17 17:53:02.737834 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Mar 17 17:53:02.737840 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Mar 17 17:53:02.737845 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Mar 17 17:53:02.737851 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Mar 17 17:53:02.737856 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Mar 17 17:53:02.737861 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Mar 17 17:53:02.737867 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Mar 17 17:53:02.737872 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Mar 17 17:53:02.737878 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Mar 17 17:53:02.737883 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Mar 17 17:53:02.737890 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Mar 17 17:53:02.737895 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge
lint[0x1]) Mar 17 17:53:02.737900 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Mar 17 17:53:02.737906 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Mar 17 17:53:02.737911 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Mar 17 17:53:02.737916 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Mar 17 17:53:02.737922 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Mar 17 17:53:02.737928 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Mar 17 17:53:02.737933 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Mar 17 17:53:02.737940 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Mar 17 17:53:02.737945 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Mar 17 17:53:02.737950 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Mar 17 17:53:02.737956 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Mar 17 17:53:02.737961 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Mar 17 17:53:02.737967 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Mar 17 17:53:02.737972 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Mar 17 17:53:02.737978 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Mar 17 17:53:02.737983 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Mar 17 17:53:02.737989 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Mar 17 17:53:02.737995 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Mar 17 17:53:02.738000 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Mar 17 17:53:02.738006 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Mar 17 17:53:02.738011 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Mar 17 17:53:02.738017 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Mar 17 17:53:02.738022 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Mar 17 17:53:02.738027 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge 
lint[0x1]) Mar 17 17:53:02.738033 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Mar 17 17:53:02.738038 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Mar 17 17:53:02.738044 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Mar 17 17:53:02.738050 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Mar 17 17:53:02.738056 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Mar 17 17:53:02.738061 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Mar 17 17:53:02.738067 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Mar 17 17:53:02.738072 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Mar 17 17:53:02.738077 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Mar 17 17:53:02.738083 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Mar 17 17:53:02.738088 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Mar 17 17:53:02.738094 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Mar 17 17:53:02.738099 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Mar 17 17:53:02.738105 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Mar 17 17:53:02.738111 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Mar 17 17:53:02.738116 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Mar 17 17:53:02.738122 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Mar 17 17:53:02.738127 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Mar 17 17:53:02.738132 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Mar 17 17:53:02.738138 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Mar 17 17:53:02.738143 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Mar 17 17:53:02.738149 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Mar 17 17:53:02.738155 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Mar 17 17:53:02.738160 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge 
lint[0x1]) Mar 17 17:53:02.738166 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Mar 17 17:53:02.738171 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Mar 17 17:53:02.738177 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Mar 17 17:53:02.738182 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Mar 17 17:53:02.738187 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Mar 17 17:53:02.738193 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Mar 17 17:53:02.738198 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Mar 17 17:53:02.738204 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Mar 17 17:53:02.738210 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Mar 17 17:53:02.738216 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Mar 17 17:53:02.738221 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Mar 17 17:53:02.738227 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Mar 17 17:53:02.738232 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Mar 17 17:53:02.738237 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Mar 17 17:53:02.738243 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Mar 17 17:53:02.738248 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Mar 17 17:53:02.738254 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Mar 17 17:53:02.738259 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Mar 17 17:53:02.738265 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Mar 17 17:53:02.738271 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Mar 17 17:53:02.738276 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Mar 17 17:53:02.738282 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Mar 17 17:53:02.738287 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Mar 17 17:53:02.738293 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge 
lint[0x1]) Mar 17 17:53:02.738298 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Mar 17 17:53:02.738303 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Mar 17 17:53:02.738313 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Mar 17 17:53:02.738321 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Mar 17 17:53:02.738327 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Mar 17 17:53:02.738332 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Mar 17 17:53:02.739360 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Mar 17 17:53:02.739371 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Mar 17 17:53:02.739378 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Mar 17 17:53:02.739383 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Mar 17 17:53:02.739389 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Mar 17 17:53:02.739394 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Mar 17 17:53:02.739400 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Mar 17 17:53:02.739408 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Mar 17 17:53:02.739413 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Mar 17 17:53:02.739419 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Mar 17 17:53:02.739424 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Mar 17 17:53:02.739430 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Mar 17 17:53:02.739435 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Mar 17 17:53:02.739441 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Mar 17 17:53:02.739446 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Mar 17 17:53:02.739452 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Mar 17 17:53:02.739457 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Mar 17 17:53:02.739464 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge 
lint[0x1]) Mar 17 17:53:02.739469 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Mar 17 17:53:02.739475 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Mar 17 17:53:02.739480 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Mar 17 17:53:02.739485 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Mar 17 17:53:02.739491 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Mar 17 17:53:02.739497 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Mar 17 17:53:02.739502 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Mar 17 17:53:02.739507 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Mar 17 17:53:02.739514 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Mar 17 17:53:02.739519 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Mar 17 17:53:02.739525 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Mar 17 17:53:02.739530 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Mar 17 17:53:02.739536 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Mar 17 17:53:02.739541 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Mar 17 17:53:02.739547 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Mar 17 17:53:02.739552 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Mar 17 17:53:02.739558 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Mar 17 17:53:02.739564 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Mar 17 17:53:02.739571 kernel: TSC deadline timer available Mar 17 17:53:02.739576 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Mar 17 17:53:02.739582 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Mar 17 17:53:02.739587 kernel: Booting paravirtualized kernel on VMware hypervisor Mar 17 17:53:02.739593 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Mar 17 17:53:02.739599 kernel: 
setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Mar 17 17:53:02.739604 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Mar 17 17:53:02.739610 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Mar 17 17:53:02.739615 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Mar 17 17:53:02.739622 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Mar 17 17:53:02.739628 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Mar 17 17:53:02.739633 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Mar 17 17:53:02.739638 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Mar 17 17:53:02.739651 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Mar 17 17:53:02.739658 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Mar 17 17:53:02.739663 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Mar 17 17:53:02.739669 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Mar 17 17:53:02.739675 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Mar 17 17:53:02.739681 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Mar 17 17:53:02.739687 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Mar 17 17:53:02.739693 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Mar 17 17:53:02.739698 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Mar 17 17:53:02.739704 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Mar 17 17:53:02.739710 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Mar 17 17:53:02.739716 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=d4b838cd9a6f58e8c4a6b615c32b0b28ee0df1660e34033a8fbd0429c6de5fd0 Mar 17 17:53:02.739724 kernel: Unknown 
kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 17 17:53:02.739729 kernel: random: crng init done Mar 17 17:53:02.739735 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Mar 17 17:53:02.739741 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Mar 17 17:53:02.739747 kernel: printk: log_buf_len min size: 262144 bytes Mar 17 17:53:02.739753 kernel: printk: log_buf_len: 1048576 bytes Mar 17 17:53:02.739759 kernel: printk: early log buf free: 239648(91%) Mar 17 17:53:02.739765 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 17:53:02.739771 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Mar 17 17:53:02.739777 kernel: Fallback order for Node 0: 0 Mar 17 17:53:02.739784 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Mar 17 17:53:02.739789 kernel: Policy zone: DMA32 Mar 17 17:53:02.739795 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 17:53:02.739802 kernel: Memory: 1936384K/2096628K available (12288K kernel code, 2303K rwdata, 22744K rodata, 42992K init, 2196K bss, 159984K reserved, 0K cma-reserved) Mar 17 17:53:02.739808 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Mar 17 17:53:02.739815 kernel: ftrace: allocating 37938 entries in 149 pages Mar 17 17:53:02.739821 kernel: ftrace: allocated 149 pages with 4 groups Mar 17 17:53:02.739827 kernel: Dynamic Preempt: voluntary Mar 17 17:53:02.739833 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 17 17:53:02.739839 kernel: rcu: RCU event tracing is enabled. Mar 17 17:53:02.739845 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Mar 17 17:53:02.739851 kernel: Trampoline variant of Tasks RCU enabled. Mar 17 17:53:02.739857 kernel: Rude variant of Tasks RCU enabled. Mar 17 17:53:02.739863 kernel: Tracing variant of Tasks RCU enabled. 
Mar 17 17:53:02.739869 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Mar 17 17:53:02.739876 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Mar 17 17:53:02.739881 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Mar 17 17:53:02.739887 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Mar 17 17:53:02.739893 kernel: Console: colour VGA+ 80x25 Mar 17 17:53:02.739899 kernel: printk: console [tty0] enabled Mar 17 17:53:02.739905 kernel: printk: console [ttyS0] enabled Mar 17 17:53:02.739912 kernel: ACPI: Core revision 20230628 Mar 17 17:53:02.739918 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Mar 17 17:53:02.739924 kernel: APIC: Switch to symmetric I/O mode setup Mar 17 17:53:02.739931 kernel: x2apic enabled Mar 17 17:53:02.739936 kernel: APIC: Switched APIC routing to: physical x2apic Mar 17 17:53:02.739942 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Mar 17 17:53:02.739948 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Mar 17 17:53:02.739954 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Mar 17 17:53:02.739960 kernel: Disabled fast string operations Mar 17 17:53:02.739966 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Mar 17 17:53:02.739972 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Mar 17 17:53:02.739978 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Mar 17 17:53:02.739985 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Mar 17 17:53:02.739991 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Mar 17 17:53:02.739997 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Mar 17 17:53:02.740003 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Mar 17 17:53:02.740009 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Mar 17 17:53:02.740014 kernel: RETBleed: Mitigation: Enhanced IBRS Mar 17 17:53:02.740020 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Mar 17 17:53:02.740026 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Mar 17 17:53:02.740032 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Mar 17 17:53:02.740039 kernel: SRBDS: Unknown: Dependent on hypervisor status Mar 17 17:53:02.740045 kernel: GDS: Unknown: Dependent on hypervisor status Mar 17 17:53:02.740051 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Mar 17 17:53:02.740057 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Mar 17 17:53:02.740063 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Mar 17 17:53:02.740068 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Mar 17 17:53:02.740074 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
Mar 17 17:53:02.740080 kernel: Freeing SMP alternatives memory: 32K Mar 17 17:53:02.740087 kernel: pid_max: default: 131072 minimum: 1024 Mar 17 17:53:02.740093 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 17 17:53:02.740099 kernel: landlock: Up and running. Mar 17 17:53:02.740105 kernel: SELinux: Initializing. Mar 17 17:53:02.740111 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Mar 17 17:53:02.740117 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Mar 17 17:53:02.740123 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Mar 17 17:53:02.740129 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Mar 17 17:53:02.740135 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Mar 17 17:53:02.740142 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Mar 17 17:53:02.740148 kernel: Performance Events: Skylake events, core PMU driver. Mar 17 17:53:02.740154 kernel: core: CPUID marked event: 'cpu cycles' unavailable Mar 17 17:53:02.740160 kernel: core: CPUID marked event: 'instructions' unavailable Mar 17 17:53:02.740165 kernel: core: CPUID marked event: 'bus cycles' unavailable Mar 17 17:53:02.740171 kernel: core: CPUID marked event: 'cache references' unavailable Mar 17 17:53:02.740177 kernel: core: CPUID marked event: 'cache misses' unavailable Mar 17 17:53:02.740182 kernel: core: CPUID marked event: 'branch instructions' unavailable Mar 17 17:53:02.740188 kernel: core: CPUID marked event: 'branch misses' unavailable Mar 17 17:53:02.740195 kernel: ... version: 1 Mar 17 17:53:02.740201 kernel: ... bit width: 48 Mar 17 17:53:02.740207 kernel: ... generic registers: 4 Mar 17 17:53:02.740213 kernel: ... value mask: 0000ffffffffffff Mar 17 17:53:02.740219 kernel: ... 
max period: 000000007fffffff Mar 17 17:53:02.740225 kernel: ... fixed-purpose events: 0 Mar 17 17:53:02.740230 kernel: ... event mask: 000000000000000f Mar 17 17:53:02.740236 kernel: signal: max sigframe size: 1776 Mar 17 17:53:02.740242 kernel: rcu: Hierarchical SRCU implementation. Mar 17 17:53:02.740249 kernel: rcu: Max phase no-delay instances is 400. Mar 17 17:53:02.740255 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Mar 17 17:53:02.740261 kernel: smp: Bringing up secondary CPUs ... Mar 17 17:53:02.740268 kernel: smpboot: x86: Booting SMP configuration: Mar 17 17:53:02.740274 kernel: .... node #0, CPUs: #1 Mar 17 17:53:02.740280 kernel: Disabled fast string operations Mar 17 17:53:02.740285 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Mar 17 17:53:02.740291 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Mar 17 17:53:02.740297 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 17:53:02.740303 kernel: smpboot: Max logical packages: 128 Mar 17 17:53:02.740310 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Mar 17 17:53:02.740316 kernel: devtmpfs: initialized Mar 17 17:53:02.740321 kernel: x86/mm: Memory block size: 128MB Mar 17 17:53:02.740327 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Mar 17 17:53:02.740333 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 17:53:02.741424 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Mar 17 17:53:02.741434 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 17:53:02.741440 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 17:53:02.741446 kernel: audit: initializing netlink subsys (disabled) Mar 17 17:53:02.741455 kernel: audit: type=2000 audit(1742233981.067:1): state=initialized audit_enabled=0 res=1 Mar 17 17:53:02.741461 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 17:53:02.741467 
kernel: thermal_sys: Registered thermal governor 'user_space' Mar 17 17:53:02.741473 kernel: cpuidle: using governor menu Mar 17 17:53:02.741479 kernel: Simple Boot Flag at 0x36 set to 0x80 Mar 17 17:53:02.741485 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 17:53:02.741491 kernel: dca service started, version 1.12.1 Mar 17 17:53:02.741497 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Mar 17 17:53:02.741503 kernel: PCI: Using configuration type 1 for base access Mar 17 17:53:02.741510 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Mar 17 17:53:02.741516 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 17:53:02.741522 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Mar 17 17:53:02.741528 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 17:53:02.741534 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Mar 17 17:53:02.741540 kernel: ACPI: Added _OSI(Module Device) Mar 17 17:53:02.741546 kernel: ACPI: Added _OSI(Processor Device) Mar 17 17:53:02.741551 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 17:53:02.741557 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 17:53:02.741565 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 17:53:02.741570 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Mar 17 17:53:02.741576 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Mar 17 17:53:02.741582 kernel: ACPI: Interpreter enabled Mar 17 17:53:02.741588 kernel: ACPI: PM: (supports S0 S1 S5) Mar 17 17:53:02.741594 kernel: ACPI: Using IOAPIC for interrupt routing Mar 17 17:53:02.741600 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Mar 17 17:53:02.741606 kernel: PCI: Using E820 reservations for host bridge windows Mar 17 17:53:02.741612 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Mar 17 17:53:02.741618 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Mar 17 17:53:02.741701 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 17 17:53:02.741756 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Mar 17 17:53:02.741804 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Mar 17 17:53:02.741813 kernel: PCI host bridge to bus 0000:00 Mar 17 17:53:02.741862 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Mar 17 17:53:02.741909 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Mar 17 17:53:02.741953 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Mar 17 17:53:02.741997 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Mar 17 17:53:02.742041 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Mar 17 17:53:02.742084 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Mar 17 17:53:02.742160 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Mar 17 17:53:02.742216 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Mar 17 17:53:02.742272 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Mar 17 17:53:02.742325 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Mar 17 17:53:02.746406 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Mar 17 17:53:02.746465 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Mar 17 17:53:02.746517 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Mar 17 17:53:02.746567 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Mar 17 17:53:02.746620 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Mar 17 17:53:02.746675 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Mar 17 17:53:02.746724 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Mar 17 17:53:02.746773 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Mar 17 17:53:02.746826 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Mar 17 17:53:02.746879 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Mar 17 17:53:02.746931 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Mar 17 17:53:02.746985 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Mar 17 17:53:02.747034 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Mar 17 17:53:02.747082 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Mar 17 17:53:02.747132 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Mar 17 17:53:02.747181 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Mar 17 17:53:02.747231 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Mar 17 17:53:02.747289 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Mar 17 17:53:02.747382 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747436 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747491 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747542 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747595 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747649 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747703 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747754 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747809 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747860 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.747913 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.747963 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Mar 17 17:53:02.748020 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.748070 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.748124 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.748175 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.748249 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.748301 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.749837 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.749894 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.749949 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.750000 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.750054 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.750109 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.750162 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.750213 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.750267 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.750323 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751415 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751471 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751525 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751575 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751643 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751695 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751748 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751798 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751854 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.751904 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.751957 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.752007 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.752059 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.752108 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.752164 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.752215 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.752268 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.752318 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755328 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755479 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755542 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755608 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755690 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755743 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755797 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755848 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.755903 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.755957 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756011 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.756062 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756126 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Mar 17 
17:53:02.756180 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756234 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.756288 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756348 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Mar 17 17:53:02.756401 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Mar 17 17:53:02.756454 kernel: pci_bus 0000:01: extended config space not accessible Mar 17 17:53:02.756505 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Mar 17 17:53:02.756582 kernel: pci_bus 0000:02: extended config space not accessible Mar 17 17:53:02.756594 kernel: acpiphp: Slot [32] registered Mar 17 17:53:02.756601 kernel: acpiphp: Slot [33] registered Mar 17 17:53:02.756607 kernel: acpiphp: Slot [34] registered Mar 17 17:53:02.756613 kernel: acpiphp: Slot [35] registered Mar 17 17:53:02.756619 kernel: acpiphp: Slot [36] registered Mar 17 17:53:02.756625 kernel: acpiphp: Slot [37] registered Mar 17 17:53:02.756631 kernel: acpiphp: Slot [38] registered Mar 17 17:53:02.756637 kernel: acpiphp: Slot [39] registered Mar 17 17:53:02.756643 kernel: acpiphp: Slot [40] registered Mar 17 17:53:02.756650 kernel: acpiphp: Slot [41] registered Mar 17 17:53:02.756656 kernel: acpiphp: Slot [42] registered Mar 17 17:53:02.756661 kernel: acpiphp: Slot [43] registered Mar 17 17:53:02.756667 kernel: acpiphp: Slot [44] registered Mar 17 17:53:02.756673 kernel: acpiphp: Slot [45] registered Mar 17 17:53:02.756679 kernel: acpiphp: Slot [46] registered Mar 17 17:53:02.756691 kernel: acpiphp: Slot [47] registered Mar 17 17:53:02.756700 kernel: acpiphp: Slot [48] registered Mar 17 17:53:02.756706 kernel: acpiphp: Slot [49] registered Mar 17 17:53:02.756718 kernel: acpiphp: Slot [50] registered Mar 17 17:53:02.756726 kernel: acpiphp: Slot [51] registered Mar 17 17:53:02.756732 kernel: acpiphp: Slot [52] registered Mar 17 17:53:02.756738 kernel: acpiphp: Slot [53] registered 
Mar 17 17:53:02.756743 kernel: acpiphp: Slot [54] registered Mar 17 17:53:02.756749 kernel: acpiphp: Slot [55] registered Mar 17 17:53:02.756755 kernel: acpiphp: Slot [56] registered Mar 17 17:53:02.756761 kernel: acpiphp: Slot [57] registered Mar 17 17:53:02.756767 kernel: acpiphp: Slot [58] registered Mar 17 17:53:02.756772 kernel: acpiphp: Slot [59] registered Mar 17 17:53:02.756779 kernel: acpiphp: Slot [60] registered Mar 17 17:53:02.756789 kernel: acpiphp: Slot [61] registered Mar 17 17:53:02.756797 kernel: acpiphp: Slot [62] registered Mar 17 17:53:02.756807 kernel: acpiphp: Slot [63] registered Mar 17 17:53:02.756863 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Mar 17 17:53:02.756914 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Mar 17 17:53:02.756964 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Mar 17 17:53:02.757014 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Mar 17 17:53:02.757063 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Mar 17 17:53:02.757120 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Mar 17 17:53:02.757170 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Mar 17 17:53:02.757219 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Mar 17 17:53:02.757268 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Mar 17 17:53:02.757325 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Mar 17 17:53:02.757402 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Mar 17 17:53:02.757456 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Mar 17 17:53:02.757507 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Mar 17 17:53:02.757557 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Mar 17 
17:53:02.757607 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Mar 17 17:53:02.757657 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Mar 17 17:53:02.757706 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Mar 17 17:53:02.757755 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Mar 17 17:53:02.757805 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Mar 17 17:53:02.757858 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Mar 17 17:53:02.757907 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Mar 17 17:53:02.757957 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Mar 17 17:53:02.758007 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Mar 17 17:53:02.758056 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Mar 17 17:53:02.758107 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Mar 17 17:53:02.758156 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Mar 17 17:53:02.758207 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Mar 17 17:53:02.758266 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Mar 17 17:53:02.758317 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Mar 17 17:53:02.758383 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Mar 17 17:53:02.758434 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Mar 17 17:53:02.758482 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Mar 17 17:53:02.758535 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Mar 17 17:53:02.758584 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Mar 17 17:53:02.758633 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Mar 17 17:53:02.758683 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Mar 17 17:53:02.758732 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Mar 17 17:53:02.758781 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Mar 17 17:53:02.758830 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Mar 17 17:53:02.758883 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Mar 17 17:53:02.758932 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Mar 17 17:53:02.758988 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Mar 17 17:53:02.759040 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Mar 17 17:53:02.759091 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Mar 17 17:53:02.759141 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Mar 17 17:53:02.759198 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Mar 17 17:53:02.759257 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Mar 17 17:53:02.759307 kernel: pci 0000:0b:00.0: supports D1 D2 Mar 17 17:53:02.760972 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 17 17:53:02.761038 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Mar 17 17:53:02.761095 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Mar 17 17:53:02.761149 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Mar 17 17:53:02.761201 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Mar 17 17:53:02.761252 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Mar 17 17:53:02.761308 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Mar 17 17:53:02.761370 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Mar 17 17:53:02.761421 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Mar 17 17:53:02.762664 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Mar 17 17:53:02.762721 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Mar 17 17:53:02.762773 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Mar 17 17:53:02.762824 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Mar 17 17:53:02.762876 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Mar 17 17:53:02.762930 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Mar 17 17:53:02.762980 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Mar 17 17:53:02.763031 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Mar 17 17:53:02.763081 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Mar 17 17:53:02.763131 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Mar 17 17:53:02.763182 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Mar 17 17:53:02.763233 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Mar 17 17:53:02.763283 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Mar 17 17:53:02.763343 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Mar 17 17:53:02.763397 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Mar 17 17:53:02.763448 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Mar 17 17:53:02.763500 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Mar 17 17:53:02.763551 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Mar 17 17:53:02.763601 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Mar 17 17:53:02.763653 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Mar 17 17:53:02.763704 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Mar 17 17:53:02.763758 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Mar 17 17:53:02.763809 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Mar 17 17:53:02.763862 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Mar 17 17:53:02.763912 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Mar 17 17:53:02.763962 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Mar 17 17:53:02.764012 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Mar 17 17:53:02.764065 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Mar 17 17:53:02.764119 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Mar 17 17:53:02.764169 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Mar 17 17:53:02.764219 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Mar 17 17:53:02.764271 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Mar 17 17:53:02.764321 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Mar 17 17:53:02.764671 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Mar 17 17:53:02.764727 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Mar 17 17:53:02.764778 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Mar 17 17:53:02.764832 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Mar 17 17:53:02.764884 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Mar 17 17:53:02.764934 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Mar 17 17:53:02.764984 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Mar 17 17:53:02.765036 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Mar 17 17:53:02.765087 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Mar 17 17:53:02.765137 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Mar 17 17:53:02.765188 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Mar 17 17:53:02.765242 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Mar 17 17:53:02.765292 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Mar 17 17:53:02.765581 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Mar 17 17:53:02.765646 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Mar 17 17:53:02.765699 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Mar 17 17:53:02.765751 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Mar 17 17:53:02.765805 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Mar 17 17:53:02.765856 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Mar 17 17:53:02.765910 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Mar 17 17:53:02.765962 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Mar 17 17:53:02.766013 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Mar 17 17:53:02.766066 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Mar 17 17:53:02.766116 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Mar 17 17:53:02.766168 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Mar 17 17:53:02.766218 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Mar 17 17:53:02.766272 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Mar 17 17:53:02.766324 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Mar 17 
17:53:02.768567 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Mar 17 17:53:02.768624 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Mar 17 17:53:02.768678 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Mar 17 17:53:02.768730 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Mar 17 17:53:02.768781 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Mar 17 17:53:02.768833 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Mar 17 17:53:02.768884 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Mar 17 17:53:02.768939 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Mar 17 17:53:02.768991 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Mar 17 17:53:02.769041 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Mar 17 17:53:02.769092 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Mar 17 17:53:02.769101 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Mar 17 17:53:02.769107 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Mar 17 17:53:02.769114 kernel: ACPI: PCI: Interrupt link LNKB disabled Mar 17 17:53:02.769120 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Mar 17 17:53:02.769128 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Mar 17 17:53:02.769134 kernel: iommu: Default domain type: Translated Mar 17 17:53:02.769140 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Mar 17 17:53:02.769146 kernel: PCI: Using ACPI for IRQ routing Mar 17 17:53:02.769152 kernel: PCI: pci_cache_line_size set to 64 bytes Mar 17 17:53:02.769158 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Mar 17 17:53:02.769164 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Mar 17 17:53:02.769215 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Mar 17 17:53:02.769267 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Mar 17 17:53:02.769320 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Mar 17 17:53:02.769329 kernel: vgaarb: loaded Mar 17 17:53:02.769336 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Mar 17 17:53:02.769349 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Mar 17 17:53:02.769355 kernel: clocksource: Switched to clocksource tsc-early Mar 17 17:53:02.769361 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 17:53:02.769367 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 17:53:02.769373 kernel: pnp: PnP ACPI init Mar 17 17:53:02.769428 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Mar 17 17:53:02.769480 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Mar 17 17:53:02.769527 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Mar 17 17:53:02.769578 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Mar 17 17:53:02.769628 kernel: pnp 00:06: [dma 2] Mar 17 17:53:02.769679 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Mar 17 17:53:02.769726 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Mar 17 17:53:02.769775 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Mar 17 17:53:02.769784 kernel: pnp: PnP ACPI: found 8 devices Mar 17 17:53:02.769790 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 17 17:53:02.769796 kernel: NET: Registered PF_INET protocol family Mar 17 17:53:02.769802 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 17 17:53:02.769808 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Mar 17 17:53:02.769814 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 17:53:02.769820 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Mar 17 17:53:02.769828 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Mar 17 17:53:02.769834 kernel: TCP: Hash tables configured (established 16384 bind 16384) Mar 17 17:53:02.769840 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Mar 17 17:53:02.769846 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Mar 17 17:53:02.769852 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 17:53:02.769858 kernel: NET: Registered PF_XDP protocol family Mar 17 17:53:02.769910 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Mar 17 17:53:02.769964 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Mar 17 17:53:02.770021 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Mar 17 17:53:02.770075 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Mar 17 17:53:02.770127 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Mar 17 17:53:02.770180 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Mar 17 17:53:02.770234 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Mar 17 17:53:02.770288 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Mar 17 17:53:02.770425 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Mar 17 17:53:02.770482 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Mar 17 17:53:02.770536 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Mar 17 17:53:02.770596 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Mar 17 17:53:02.770650 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Mar 17 
17:53:02.770714 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Mar 17 17:53:02.770777 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Mar 17 17:53:02.770829 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Mar 17 17:53:02.770881 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Mar 17 17:53:02.770933 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Mar 17 17:53:02.770985 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Mar 17 17:53:02.771037 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Mar 17 17:53:02.771091 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Mar 17 17:53:02.771142 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Mar 17 17:53:02.771194 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Mar 17 17:53:02.771246 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Mar 17 17:53:02.771297 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Mar 17 17:53:02.771370 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.771426 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.771477 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.771529 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.771582 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.771633 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.771684 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.771736 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Mar 
17 17:53:02.771787 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.771841 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.771893 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.771943 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.771995 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.772047 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.772099 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.772150 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.772202 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.772257 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.772308 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.772560 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.772615 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.772666 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.772717 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.772768 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.772818 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.772873 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.772923 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.772973 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773025 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773076 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773127 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773177 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773228 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773281 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773332 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773397 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773447 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773498 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773550 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773601 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773652 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773705 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773756 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773807 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773859 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.773910 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.773960 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774011 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774061 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774112 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774163 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774216 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774267 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Mar 17 17:53:02.774324 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774433 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774485 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774537 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774587 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774637 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774687 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774740 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774790 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774841 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774891 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.774941 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.774992 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775043 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775095 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775145 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775196 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775251 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775301 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775365 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775417 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775468 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775517 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775569 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775619 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775669 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775723 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775774 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775824 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775875 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Mar 17 17:53:02.775925 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Mar 17 17:53:02.775976 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Mar 17 17:53:02.776028 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Mar 17 17:53:02.776079 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Mar 17 17:53:02.776129 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Mar 17 17:53:02.776179 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Mar 17 17:53:02.776237 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Mar 17 17:53:02.776289 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Mar 17 17:53:02.776346 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Mar 17 17:53:02.776398 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Mar 17 17:53:02.776449 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Mar 17 17:53:02.776500 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Mar 17 17:53:02.776552 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Mar 17 17:53:02.776603 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Mar 17 17:53:02.776657 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Mar 17 
17:53:02.776710 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Mar 17 17:53:02.776761 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Mar 17 17:53:02.776812 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Mar 17 17:53:02.776863 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Mar 17 17:53:02.776914 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Mar 17 17:53:02.776964 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Mar 17 17:53:02.777015 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Mar 17 17:53:02.777066 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Mar 17 17:53:02.777119 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Mar 17 17:53:02.777169 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Mar 17 17:53:02.777223 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Mar 17 17:53:02.777274 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Mar 17 17:53:02.777324 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Mar 17 17:53:02.777390 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Mar 17 17:53:02.777445 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Mar 17 17:53:02.777496 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Mar 17 17:53:02.777546 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Mar 17 17:53:02.777596 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Mar 17 17:53:02.777647 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Mar 17 17:53:02.777701 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Mar 17 17:53:02.777753 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Mar 17 17:53:02.777804 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Mar 17 17:53:02.777855 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Mar 17 17:53:02.777908 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Mar 17 17:53:02.777959 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Mar 17 17:53:02.778010 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Mar 17 17:53:02.778060 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Mar 17 17:53:02.778110 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Mar 17 17:53:02.778162 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Mar 17 17:53:02.778213 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Mar 17 17:53:02.778263 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Mar 17 17:53:02.778318 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Mar 17 17:53:02.778498 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Mar 17 17:53:02.778549 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Mar 17 17:53:02.778598 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Mar 17 17:53:02.778649 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Mar 17 17:53:02.778699 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Mar 17 17:53:02.778748 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Mar 17 17:53:02.778799 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Mar 17 17:53:02.778849 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Mar 17 17:53:02.778898 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Mar 17 17:53:02.778949 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Mar 17 17:53:02.779001 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Mar 17 17:53:02.779051 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Mar 17 17:53:02.779102 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Mar 17 17:53:02.779151 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Mar 17 17:53:02.779201 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Mar 17 17:53:02.779254 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Mar 17 17:53:02.779304 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Mar 17 17:53:02.779363 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Mar 17 17:53:02.779413 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Mar 17 17:53:02.779467 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Mar 17 17:53:02.779517 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Mar 17 17:53:02.779568 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Mar 17 17:53:02.779619 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Mar 17 17:53:02.779670 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Mar 17 17:53:02.779721 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Mar 17 17:53:02.779772 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Mar 17 17:53:02.779822 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Mar 17 17:53:02.779873 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Mar 17 17:53:02.779923 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Mar 17 17:53:02.779976 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Mar 17 17:53:02.780026 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Mar 17 17:53:02.780076 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Mar 17 17:53:02.780126 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Mar 17 17:53:02.780176 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Mar 17 17:53:02.780226 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Mar 17 17:53:02.780276 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Mar 17 
17:53:02.780326 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Mar 17 17:53:02.780423 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Mar 17 17:53:02.780478 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Mar 17 17:53:02.780529 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Mar 17 17:53:02.780579 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Mar 17 17:53:02.780630 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Mar 17 17:53:02.780681 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Mar 17 17:53:02.780732 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Mar 17 17:53:02.780782 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Mar 17 17:53:02.780831 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Mar 17 17:53:02.780883 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Mar 17 17:53:02.780933 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Mar 17 17:53:02.780985 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Mar 17 17:53:02.781036 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Mar 17 17:53:02.781086 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Mar 17 17:53:02.781137 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Mar 17 17:53:02.781188 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Mar 17 17:53:02.781239 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Mar 17 17:53:02.781289 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Mar 17 17:53:02.781354 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Mar 17 17:53:02.781410 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Mar 17 17:53:02.781465 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Mar 17 17:53:02.781515 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Mar 17 17:53:02.781566 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Mar 17 17:53:02.781616 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Mar 17 17:53:02.781674 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Mar 17 17:53:02.781726 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Mar 17 17:53:02.781777 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Mar 17 17:53:02.781828 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Mar 17 17:53:02.781879 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Mar 17 17:53:02.781929 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Mar 17 17:53:02.781984 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Mar 17 17:53:02.782036 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Mar 17 17:53:02.782082 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Mar 17 17:53:02.782127 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Mar 17 17:53:02.782172 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Mar 17 17:53:02.782217 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Mar 17 17:53:02.782267 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Mar 17 17:53:02.782317 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Mar 17 17:53:02.782379 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Mar 17 17:53:02.782428 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Mar 17 17:53:02.782475 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Mar 17 17:53:02.782522 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Mar 17 17:53:02.782568 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Mar 17 17:53:02.782614 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Mar 17 17:53:02.782665 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Mar 17 17:53:02.782716 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Mar 17 17:53:02.782763 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Mar 17 17:53:02.782814 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Mar 17 17:53:02.782861 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Mar 17 17:53:02.782908 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Mar 17 17:53:02.782958 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Mar 17 17:53:02.783005 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Mar 17 17:53:02.783055 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Mar 17 17:53:02.783105 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Mar 17 17:53:02.783153 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Mar 17 17:53:02.783205 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Mar 17 17:53:02.783252 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Mar 17 17:53:02.783302 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Mar 17 17:53:02.783411 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Mar 17 17:53:02.783463 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Mar 17 17:53:02.783510 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Mar 17 17:53:02.783563 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Mar 17 17:53:02.783621 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Mar 17 17:53:02.783676 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Mar 17 17:53:02.783725 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Mar 17 17:53:02.783772 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Mar 17 17:53:02.783847 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Mar 17 17:53:02.783895 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Mar 17 17:53:02.783942 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Mar 17 17:53:02.783993 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Mar 17 17:53:02.784043 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Mar 17 17:53:02.784093 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Mar 17 17:53:02.784144 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Mar 17 17:53:02.784192 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Mar 17 17:53:02.784244 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Mar 17 17:53:02.784292 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Mar 17 17:53:02.784435 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Mar 17 17:53:02.784489 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Mar 17 17:53:02.784541 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Mar 17 17:53:02.784588 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Mar 17 17:53:02.784639 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Mar 17 17:53:02.784686 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Mar 17 17:53:02.784738 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Mar 17 17:53:02.784789 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Mar 17 17:53:02.784836 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Mar 17 17:53:02.784904 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Mar 17 17:53:02.784954 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Mar 17 17:53:02.785001 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Mar 17 17:53:02.785053 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Mar 17 17:53:02.785101 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Mar 17 17:53:02.785151 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Mar 17 17:53:02.785202 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Mar 17 17:53:02.785251 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Mar 17 17:53:02.785303 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Mar 17 17:53:02.785371 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Mar 17 17:53:02.785425 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Mar 17 17:53:02.785475 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Mar 17 17:53:02.785526 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Mar 17 17:53:02.785574 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Mar 17 17:53:02.785625 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Mar 17 17:53:02.785672 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Mar 17 17:53:02.785728 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Mar 17 17:53:02.785778 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Mar 17 17:53:02.785824 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Mar 17 17:53:02.785875 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Mar 17 17:53:02.785922 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Mar 17 17:53:02.785969 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Mar 17 17:53:02.786043 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Mar 17 17:53:02.786095 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Mar 17 17:53:02.786147 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Mar 17 17:53:02.786195 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Mar 17 17:53:02.786246 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Mar 17 17:53:02.786294 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Mar 17 17:53:02.786636 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Mar 17 17:53:02.786694 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Mar 17 17:53:02.786751 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Mar 17 17:53:02.786800 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Mar 17 17:53:02.786851 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Mar 17 17:53:02.786897 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Mar 17 17:53:02.786954 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Mar 17 17:53:02.786965 kernel: PCI: CLS 32 bytes, default 64 Mar 17 17:53:02.786974 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 17 17:53:02.786980 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Mar 17 17:53:02.786987 kernel: clocksource: Switched to clocksource tsc Mar 17 17:53:02.786993 kernel: Initialise system trusted keyrings Mar 17 17:53:02.787000 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 17 17:53:02.787006 kernel: Key type asymmetric registered Mar 17 17:53:02.787012 kernel: Asymmetric key parser 'x509' registered Mar 17 17:53:02.787018 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 17 17:53:02.787025 kernel: io scheduler mq-deadline registered Mar 17 17:53:02.787032 kernel: io scheduler kyber registered Mar 17 17:53:02.787039 kernel: io scheduler bfq registered Mar 17 17:53:02.787092 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Mar 17 17:53:02.787146 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.787199 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Mar 17 17:53:02.787251 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.787303 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Mar 17 17:53:02.787368 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.787426 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Mar 17 17:53:02.787480 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.787534 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Mar 17 17:53:02.787585 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.787644 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Mar 17 17:53:02.787700 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.787752 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Mar 17 17:53:02.787804 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.787859 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Mar 17 17:53:02.787911 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.787970 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Mar 17 17:53:02.788025 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.788078 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Mar 17 17:53:02.788130 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.788415 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Mar 17 17:53:02.788472 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.788526 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Mar 17 17:53:02.788579 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.788635 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Mar 17 17:53:02.788687 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.788739 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Mar 17 17:53:02.788791 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.788844 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Mar 17 17:53:02.788899 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.788950 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Mar 17 17:53:02.789003 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.789056 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Mar 17 17:53:02.789107 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.789160 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Mar 17 17:53:02.789214 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.789267 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Mar 17 17:53:02.789318 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.789377 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Mar 17 17:53:02.789430 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.789484 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Mar 17 17:53:02.789539 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.789591 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Mar 17 17:53:02.789643 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.789696 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Mar 17 17:53:02.790093 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.790154 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Mar 17 17:53:02.790208 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.790261 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Mar 17 17:53:02.790314 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.790416 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Mar 17 17:53:02.790470 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.790523 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Mar 17 17:53:02.790578 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.790631 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Mar 17 17:53:02.790685 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.790737 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Mar 17 17:53:02.790788 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.790841 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Mar 17 17:53:02.790892 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.790944 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Mar 17 17:53:02.790994 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.791045 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Mar 17 17:53:02.791099 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 17 17:53:02.791109 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Mar 17 17:53:02.791116 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 17:53:02.791122 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 17 17:53:02.791129 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Mar 17 17:53:02.791135 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 17 17:53:02.791141 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 17 17:53:02.791192 kernel: rtc_cmos 00:01: registered as rtc0 Mar 17 17:53:02.791244 kernel: rtc_cmos 00:01: setting system clock to 2025-03-17T17:53:02 UTC (1742233982) Mar 17 17:53:02.791291 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Mar 17 17:53:02.791300 kernel: intel_pstate: CPU model not supported Mar 17 17:53:02.791306 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 17 17:53:02.791313 kernel: NET: Registered PF_INET6 protocol family Mar 17 17:53:02.791319 kernel: Segment Routing with IPv6 Mar 17 17:53:02.791325 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 17:53:02.791332 kernel: NET: Registered PF_PACKET protocol family Mar 17 17:53:02.791345 kernel: Key type dns_resolver registered Mar 17 17:53:02.791354 kernel: IPI shorthand broadcast: enabled Mar 17 17:53:02.791361 kernel: sched_clock: Marking stable (883114441, 223971283)->(1165808046, -58722322) Mar 17 17:53:02.791367 kernel: registered taskstats version 1 Mar 17 17:53:02.791373 kernel: Loading compiled-in X.509 certificates Mar 17 17:53:02.791380 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 608fb88224bc0ea76afefc598557abb0413f36c0' Mar 17 17:53:02.791386 kernel: Key type .fscrypt registered Mar 17 17:53:02.791393 kernel: Key type fscrypt-provisioning registered Mar 17 17:53:02.791399 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 17:53:02.791407 kernel: ima: Allocated hash algorithm: sha1 Mar 17 17:53:02.791413 kernel: ima: No architecture policies found Mar 17 17:53:02.791419 kernel: clk: Disabling unused clocks Mar 17 17:53:02.791426 kernel: Freeing unused kernel image (initmem) memory: 42992K Mar 17 17:53:02.791432 kernel: Write protecting the kernel read-only data: 36864k Mar 17 17:53:02.791438 kernel: Freeing unused kernel image (rodata/data gap) memory: 1832K Mar 17 17:53:02.791445 kernel: Run /init as init process Mar 17 17:53:02.791451 kernel: with arguments: Mar 17 17:53:02.791458 kernel: /init Mar 17 17:53:02.791464 kernel: with environment: Mar 17 17:53:02.791472 kernel: HOME=/ Mar 17 17:53:02.791477 kernel: TERM=linux Mar 17 17:53:02.791484 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 17:53:02.791491 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 17 17:53:02.791499 systemd[1]: Detected virtualization vmware. Mar 17 17:53:02.791506 systemd[1]: Detected architecture x86-64. Mar 17 17:53:02.791513 systemd[1]: Running in initrd. Mar 17 17:53:02.791521 systemd[1]: No hostname configured, using default hostname. Mar 17 17:53:02.791527 systemd[1]: Hostname set to <localhost>. Mar 17 17:53:02.791534 systemd[1]: Initializing machine ID from random generator. Mar 17 17:53:02.791540 systemd[1]: Queued start job for default target initrd.target. Mar 17 17:53:02.791547 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:53:02.791553 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Mar 17 17:53:02.791560 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 17 17:53:02.791567 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 17:53:02.791575 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 17 17:53:02.791582 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 17 17:53:02.791589 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 17 17:53:02.791596 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 17 17:53:02.791602 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:53:02.791609 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:53:02.791615 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:53:02.791623 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:53:02.791629 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:53:02.791636 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:53:02.791643 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:53:02.791650 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:53:02.791656 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 17 17:53:02.791666 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 17 17:53:02.791673 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:53:02.791679 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:53:02.791687 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 17 17:53:02.791694 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:53:02.791700 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 17 17:53:02.791707 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:53:02.791714 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 17 17:53:02.791720 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 17:53:02.791727 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:53:02.791733 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:53:02.791741 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:53:02.791750 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 17 17:53:02.791768 systemd-journald[215]: Collecting audit messages is disabled. Mar 17 17:53:02.791784 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:53:02.791793 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 17:53:02.791800 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:53:02.791806 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 17:53:02.791813 kernel: Bridge firewalling registered Mar 17 17:53:02.791820 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:53:02.791837 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:53:02.791845 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:53:02.791852 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:53:02.791858 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Mar 17 17:53:02.791865 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:53:02.791872 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:53:02.791880 systemd-journald[215]: Journal started Mar 17 17:53:02.791897 systemd-journald[215]: Runtime Journal (/run/log/journal/5287df66223f4f369c27c9e56f451b31) is 4.8M, max 38.7M, 33.8M free. Mar 17 17:53:02.738719 systemd-modules-load[216]: Inserted module 'overlay' Mar 17 17:53:02.794496 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:53:02.756813 systemd-modules-load[216]: Inserted module 'br_netfilter' Mar 17 17:53:02.794741 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:53:02.800487 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 17 17:53:02.801077 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:53:02.802371 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:53:02.807516 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:53:02.810348 dracut-cmdline[246]: dracut-dracut-053 Mar 17 17:53:02.810348 dracut-cmdline[246]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=d4b838cd9a6f58e8c4a6b615c32b0b28ee0df1660e34033a8fbd0429c6de5fd0 Mar 17 17:53:02.813438 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Mar 17 17:53:02.830446 systemd-resolved[260]: Positive Trust Anchors:
Mar 17 17:53:02.830626 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 17:53:02.830649 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 17:53:02.832266 systemd-resolved[260]: Defaulting to hostname 'linux'.
Mar 17 17:53:02.833687 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 17:53:02.833853 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:53:02.856354 kernel: SCSI subsystem initialized
Mar 17 17:53:02.862361 kernel: Loading iSCSI transport class v2.0-870.
Mar 17 17:53:02.869356 kernel: iscsi: registered transport (tcp)
Mar 17 17:53:02.882375 kernel: iscsi: registered transport (qla4xxx)
Mar 17 17:53:02.882411 kernel: QLogic iSCSI HBA Driver
Mar 17 17:53:02.902393 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:53:02.906441 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 17 17:53:02.922085 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 17 17:53:02.922132 kernel: device-mapper: uevent: version 1.0.3
Mar 17 17:53:02.922142 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 17 17:53:02.954372 kernel: raid6: avx2x4 gen() 51886 MB/s
Mar 17 17:53:02.970359 kernel: raid6: avx2x2 gen() 52823 MB/s
Mar 17 17:53:02.987564 kernel: raid6: avx2x1 gen() 44318 MB/s
Mar 17 17:53:02.987613 kernel: raid6: using algorithm avx2x2 gen() 52823 MB/s
Mar 17 17:53:03.005553 kernel: raid6: .... xor() 30920 MB/s, rmw enabled
Mar 17 17:53:03.005597 kernel: raid6: using avx2x2 recovery algorithm
Mar 17 17:53:03.019355 kernel: xor: automatically using best checksumming function   avx
Mar 17 17:53:03.120355 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 17 17:53:03.126348 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:53:03.130448 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:53:03.139002 systemd-udevd[436]: Using default interface naming scheme 'v255'.
Mar 17 17:53:03.141528 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:53:03.144503 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 17 17:53:03.154470 dracut-pre-trigger[437]: rd.md=0: removing MD RAID activation
Mar 17 17:53:03.171740 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:53:03.176440 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:53:03.248304 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:53:03.256448 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 17 17:53:03.265907 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:53:03.266327 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:53:03.266725 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:53:03.267011 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:53:03.272459 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 17 17:53:03.278466 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:53:03.312373 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Mar 17 17:53:03.324353 kernel: vmw_pvscsi: using 64bit dma
Mar 17 17:53:03.324383 kernel: vmw_pvscsi: max_id: 16
Mar 17 17:53:03.325348 kernel: vmw_pvscsi: setting ring_pages to 8
Mar 17 17:53:03.327348 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI
Mar 17 17:53:03.331367 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Mar 17 17:53:03.337385 kernel: cryptd: max_cpu_qlen set to 1000
Mar 17 17:53:03.337396 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Mar 17 17:53:03.341460 kernel: vmw_pvscsi: enabling reqCallThreshold
Mar 17 17:53:03.341479 kernel: vmw_pvscsi: driver-based request coalescing enabled
Mar 17 17:53:03.341487 kernel: vmw_pvscsi: using MSI-X
Mar 17 17:53:03.341495 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Mar 17 17:53:03.347351 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Mar 17 17:53:03.347581 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:53:03.348911 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:53:03.351376 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Mar 17 17:53:03.351485 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Mar 17 17:53:03.351559 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 17 17:53:03.351739 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:53:03.351843 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:53:03.351871 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:53:03.351982 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:53:03.354534 kernel: AES CTR mode by8 optimization enabled
Mar 17 17:53:03.358745 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:53:03.374608 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:53:03.375954 kernel: libata version 3.00 loaded.
Mar 17 17:53:03.378472 kernel: ata_piix 0000:00:07.1: version 2.13
Mar 17 17:53:03.382958 kernel: scsi host1: ata_piix
Mar 17 17:53:03.383039 kernel: scsi host2: ata_piix
Mar 17 17:53:03.383101 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14
Mar 17 17:53:03.383110 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15
Mar 17 17:53:03.381941 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:53:03.387931 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Mar 17 17:53:03.409556 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 17 17:53:03.409634 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Mar 17 17:53:03.409696 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Mar 17 17:53:03.409756 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Mar 17 17:53:03.409822 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:53:03.409832 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 17 17:53:03.393306 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:53:03.550361 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Mar 17 17:53:03.556351 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Mar 17 17:53:03.575479 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Mar 17 17:53:03.593987 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 17 17:53:03.594005 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (491)
Mar 17 17:53:03.594013 kernel: BTRFS: device fsid 2b8ebefd-e897-48f6-96d5-0893fbb7c64a devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (492)
Mar 17 17:53:03.594021 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Mar 17 17:53:03.584503 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Mar 17 17:53:03.588002 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Mar 17 17:53:03.593634 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Mar 17 17:53:03.596711 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Mar 17 17:53:03.597006 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Mar 17 17:53:03.601483 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 17 17:53:03.623382 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:53:04.679376 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:53:04.679621 disk-uuid[594]: The operation has completed successfully.
Mar 17 17:53:04.750584 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 17 17:53:04.750888 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 17 17:53:04.755454 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 17 17:53:04.757312 sh[611]: Success
Mar 17 17:53:04.766355 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Mar 17 17:53:04.864644 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 17 17:53:04.865550 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 17 17:53:04.865842 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 17 17:53:04.927789 kernel: BTRFS info (device dm-0): first mount of filesystem 2b8ebefd-e897-48f6-96d5-0893fbb7c64a
Mar 17 17:53:04.927823 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 17 17:53:04.927832 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 17 17:53:04.928887 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 17 17:53:04.929706 kernel: BTRFS info (device dm-0): using free space tree
Mar 17 17:53:04.951369 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Mar 17 17:53:04.954472 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 17 17:53:04.958488 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Mar 17 17:53:04.960044 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 17 17:53:05.033549 kernel: BTRFS info (device sda6): first mount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64
Mar 17 17:53:05.033593 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 17 17:53:05.033616 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:53:05.039687 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 17 17:53:05.045984 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 17 17:53:05.048354 kernel: BTRFS info (device sda6): last unmount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64
Mar 17 17:53:05.051968 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 17 17:53:05.058478 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 17 17:53:05.086319 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Mar 17 17:53:05.092482 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 17 17:53:05.153450 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:53:05.157469 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:53:05.168939 systemd-networkd[802]: lo: Link UP
Mar 17 17:53:05.168943 systemd-networkd[802]: lo: Gained carrier
Mar 17 17:53:05.169599 systemd-networkd[802]: Enumeration completed
Mar 17 17:53:05.169772 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:53:05.169850 systemd-networkd[802]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Mar 17 17:53:05.169945 systemd[1]: Reached target network.target - Network.
Mar 17 17:53:05.172824 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Mar 17 17:53:05.172953 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Mar 17 17:53:05.173058 systemd-networkd[802]: ens192: Link UP
Mar 17 17:53:05.173062 systemd-networkd[802]: ens192: Gained carrier
Mar 17 17:53:05.201164 ignition[671]: Ignition 2.20.0
Mar 17 17:53:05.201434 ignition[671]: Stage: fetch-offline
Mar 17 17:53:05.201456 ignition[671]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:53:05.201461 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 17 17:53:05.201516 ignition[671]: parsed url from cmdline: ""
Mar 17 17:53:05.201518 ignition[671]: no config URL provided
Mar 17 17:53:05.201521 ignition[671]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 17:53:05.201525 ignition[671]: no config at "/usr/lib/ignition/user.ign"
Mar 17 17:53:05.201886 ignition[671]: config successfully fetched
Mar 17 17:53:05.201903 ignition[671]: parsing config with SHA512: 97fcfb50f231721e06a511a1ef981e1eecc5efd0c6f1d1f5d04e157dbd255ce065d55825885a724263dae8204fc6d0afa66d1ed545935fb2395fe3aff242e97f
Mar 17 17:53:05.204269 unknown[671]: fetched base config from "system"
Mar 17 17:53:05.204275 unknown[671]: fetched user config from "vmware"
Mar 17 17:53:05.204497 ignition[671]: fetch-offline: fetch-offline passed
Mar 17 17:53:05.204539 ignition[671]: Ignition finished successfully
Mar 17 17:53:05.205144 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:53:05.205562 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Mar 17 17:53:05.208536 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 17 17:53:05.216383 ignition[810]: Ignition 2.20.0
Mar 17 17:53:05.216392 ignition[810]: Stage: kargs
Mar 17 17:53:05.216501 ignition[810]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:53:05.216507 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 17 17:53:05.217010 ignition[810]: kargs: kargs passed
Mar 17 17:53:05.217036 ignition[810]: Ignition finished successfully
Mar 17 17:53:05.218345 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 17 17:53:05.221452 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 17 17:53:05.228537 ignition[816]: Ignition 2.20.0
Mar 17 17:53:05.228545 ignition[816]: Stage: disks
Mar 17 17:53:05.228652 ignition[816]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:53:05.228657 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 17 17:53:05.229177 ignition[816]: disks: disks passed
Mar 17 17:53:05.229203 ignition[816]: Ignition finished successfully
Mar 17 17:53:05.229929 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 17 17:53:05.230487 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 17 17:53:05.230725 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 17 17:53:05.230830 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:53:05.230916 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 17:53:05.231003 systemd[1]: Reached target basic.target - Basic System.
Mar 17 17:53:05.234495 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 17 17:53:05.245057 systemd-fsck[824]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 17 17:53:05.246441 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 17 17:53:05.250506 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 17 17:53:05.346272 kernel: EXT4-fs (sda9): mounted filesystem 345fc709-8965-4219-b368-16e508c3d632 r/w with ordered data mode. Quota mode: none.
Mar 17 17:53:05.345812 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 17 17:53:05.346250 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:53:05.360467 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:53:05.362021 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 17 17:53:05.362424 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 17 17:53:05.362460 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 17 17:53:05.362478 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:53:05.366675 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 17 17:53:05.369358 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (832)
Mar 17 17:53:05.372533 kernel: BTRFS info (device sda6): first mount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64
Mar 17 17:53:05.372555 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 17 17:53:05.372574 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:53:05.372682 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 17 17:53:05.376352 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 17 17:53:05.378301 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:53:05.409363 initrd-setup-root[856]: cut: /sysroot/etc/passwd: No such file or directory
Mar 17 17:53:05.417881 initrd-setup-root[863]: cut: /sysroot/etc/group: No such file or directory
Mar 17 17:53:05.422303 initrd-setup-root[870]: cut: /sysroot/etc/shadow: No such file or directory
Mar 17 17:53:05.424210 initrd-setup-root[877]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 17 17:53:05.496139 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 17 17:53:05.500473 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 17 17:53:05.502929 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 17 17:53:05.508368 kernel: BTRFS info (device sda6): last unmount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64
Mar 17 17:53:05.522330 ignition[945]: INFO : Ignition 2.20.0
Mar 17 17:53:05.522330 ignition[945]: INFO : Stage: mount
Mar 17 17:53:05.523236 ignition[945]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:53:05.523236 ignition[945]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 17 17:53:05.523838 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 17 17:53:05.524544 ignition[945]: INFO : mount: mount passed
Mar 17 17:53:05.524544 ignition[945]: INFO : Ignition finished successfully
Mar 17 17:53:05.524096 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 17 17:53:05.530463 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 17 17:53:05.922405 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 17 17:53:05.930507 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:53:06.023368 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (956)
Mar 17 17:53:06.049297 kernel: BTRFS info (device sda6): first mount of filesystem 7b241d32-136b-4fe3-b105-cecff2b2cf64
Mar 17 17:53:06.049329 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 17 17:53:06.049358 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:53:06.107365 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 17 17:53:06.115834 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:53:06.134630 ignition[973]: INFO : Ignition 2.20.0
Mar 17 17:53:06.134630 ignition[973]: INFO : Stage: files
Mar 17 17:53:06.135064 ignition[973]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:53:06.135064 ignition[973]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 17 17:53:06.135480 ignition[973]: DEBUG : files: compiled without relabeling support, skipping
Mar 17 17:53:06.209856 ignition[973]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 17 17:53:06.209856 ignition[973]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 17 17:53:06.231495 systemd-networkd[802]: ens192: Gained IPv6LL
Mar 17 17:53:06.267792 ignition[973]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 17 17:53:06.268173 ignition[973]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 17 17:53:06.268494 unknown[973]: wrote ssh authorized keys file for user: core
Mar 17 17:53:06.268912 ignition[973]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 17 17:53:06.305425 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 17 17:53:06.305425 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 17 17:53:06.376000 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 17 17:53:06.457970 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:53:06.458276 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 17 17:53:06.459410 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
Mar 17 17:53:06.946131 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 17 17:53:07.089487 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 17 17:53:07.089487 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Mar 17 17:53:07.089487 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Mar 17 17:53:07.089487 ignition[973]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Mar 17 17:53:07.090733 ignition[973]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Mar 17 17:53:07.289204 ignition[973]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Mar 17 17:53:07.293668 ignition[973]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:53:07.294727 ignition[973]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:53:07.294727 ignition[973]: INFO : files: files passed
Mar 17 17:53:07.294727 ignition[973]: INFO : Ignition finished successfully
Mar 17 17:53:07.295050 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 17 17:53:07.299507 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 17 17:53:07.301446 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 17 17:53:07.302902 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 17 17:53:07.302976 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 17 17:53:07.311522 initrd-setup-root-after-ignition[1004]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:53:07.311522 initrd-setup-root-after-ignition[1004]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:53:07.312476 initrd-setup-root-after-ignition[1008]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:53:07.313549 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:53:07.314005 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 17 17:53:07.318462 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 17 17:53:07.332315 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 17 17:53:07.332395 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 17 17:53:07.332875 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 17 17:53:07.333020 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 17 17:53:07.333246 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 17 17:53:07.333768 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 17 17:53:07.344020 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:53:07.348450 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 17 17:53:07.355544 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:53:07.355821 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:53:07.356038 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:53:07.356211 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:53:07.356309 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:53:07.356651 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:53:07.356845 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:53:07.357014 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:53:07.357217 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:53:07.357420 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 17 17:53:07.357794 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 17 17:53:07.357982 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:53:07.358154 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 17 17:53:07.358448 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 17 17:53:07.358649 systemd[1]: Stopped target swap.target - Swaps.
Mar 17 17:53:07.358801 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 17 17:53:07.358893 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:53:07.359240 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:53:07.359426 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:53:07.359618 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 17 17:53:07.359690 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:53:07.359942 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 17 17:53:07.360006 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:53:07.360249 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 17 17:53:07.360369 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:53:07.360697 systemd[1]: Stopped target paths.target - Path Units.
Mar 17 17:53:07.360860 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 17 17:53:07.366373 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:53:07.366659 systemd[1]: Stopped target slices.target - Slice Units.
Mar 17 17:53:07.366928 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 17 17:53:07.367135 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 17 17:53:07.367205 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:53:07.367531 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 17 17:53:07.367596 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:53:07.367922 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 17 17:53:07.368018 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:53:07.368252 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 17 17:53:07.368346 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 17 17:53:07.380441 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 17 17:53:07.380540 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 17 17:53:07.380608 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:53:07.382405 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 17 17:53:07.382606 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 17 17:53:07.382693 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:53:07.382898 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 17 17:53:07.382962 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:53:07.386532 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 17 17:53:07.389637 ignition[1028]: INFO : Ignition 2.20.0
Mar 17 17:53:07.390696 ignition[1028]: INFO : Stage: umount
Mar 17 17:53:07.390696 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:53:07.390696 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 17 17:53:07.390696 ignition[1028]: INFO : umount: umount passed
Mar 17 17:53:07.390696 ignition[1028]: INFO : Ignition finished successfully
Mar 17 17:53:07.390555 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 17 17:53:07.392804 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 17 17:53:07.392858 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 17 17:53:07.393499 systemd[1]: Stopped target network.target - Network.
Mar 17 17:53:07.393886 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 17 17:53:07.393917 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 17 17:53:07.394165 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 17 17:53:07.394188 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 17 17:53:07.394482 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 17 17:53:07.394505 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 17 17:53:07.394862 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 17 17:53:07.394885 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 17 17:53:07.395186 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 17 17:53:07.395614 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 17 17:53:07.399064 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 17 17:53:07.399132 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 17 17:53:07.399360 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 17 17:53:07.399381 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:53:07.409536 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 17 17:53:07.409679 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 17 17:53:07.409720 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:53:07.409871 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Mar 17 17:53:07.409894 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Mar 17 17:53:07.410077 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:53:07.414429 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 17 17:53:07.414803 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 17 17:53:07.414883 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 17 17:53:07.416886 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 17 17:53:07.416919 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:53:07.417563 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 17 17:53:07.417589 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:53:07.418609 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 17 17:53:07.418635 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:53:07.419001 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 17 17:53:07.419109 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:53:07.420679 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 17 17:53:07.420723 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:53:07.420870 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 17 17:53:07.420891 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:53:07.421066 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 17 17:53:07.421096 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:53:07.421549 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 17 17:53:07.421581 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:53:07.422178 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:53:07.422204 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:53:07.427480 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 17 17:53:07.427677 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 17 17:53:07.427722 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:53:07.427996 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:53:07.428025 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:53:07.429301 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 17 17:53:07.429497 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 17 17:53:07.434470 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 17 17:53:07.434543 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 17 17:53:07.456909 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 17 17:53:07.456983 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 17 17:53:07.457295 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 17 17:53:07.457434 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 17 17:53:07.457466 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 17 17:53:07.462449 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 17 17:53:07.479430 systemd[1]: Switching root.
Mar 17 17:53:07.508417 systemd-journald[215]: Journal stopped
Mar 17 17:53:09.190502 systemd-journald[215]: Received SIGTERM from PID 1 (systemd).
Mar 17 17:53:09.190537 kernel: SELinux: policy capability network_peer_controls=1
Mar 17 17:53:09.190546 kernel: SELinux: policy capability open_perms=1
Mar 17 17:53:09.190556 kernel: SELinux: policy capability extended_socket_class=1
Mar 17 17:53:09.190563 kernel: SELinux: policy capability always_check_network=0
Mar 17 17:53:09.190569 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 17 17:53:09.190577 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 17 17:53:09.190583 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 17 17:53:09.190588 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 17 17:53:09.190594 kernel: audit: type=1403 audit(1742233988.300:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 17 17:53:09.190601 systemd[1]: Successfully loaded SELinux policy in 32.662ms.
Mar 17 17:53:09.190607 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.097ms.
Mar 17 17:53:09.190614 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 17 17:53:09.190622 systemd[1]: Detected virtualization vmware.
Mar 17 17:53:09.190629 systemd[1]: Detected architecture x86-64.
Mar 17 17:53:09.190635 systemd[1]: Detected first boot.
Mar 17 17:53:09.190641 systemd[1]: Initializing machine ID from random generator.
Mar 17 17:53:09.190650 zram_generator::config[1070]: No configuration found.
Mar 17 17:53:09.190657 systemd[1]: Populated /etc with preset unit settings.
Mar 17 17:53:09.190664 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 17 17:53:09.190671 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Mar 17 17:53:09.190678 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 17 17:53:09.190684 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 17 17:53:09.190690 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:53:09.190698 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 17 17:53:09.190705 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 17 17:53:09.190712 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 17 17:53:09.190718 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 17 17:53:09.190725 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 17 17:53:09.190731 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 17 17:53:09.190738 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 17 17:53:09.190745 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 17 17:53:09.190752 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:53:09.190759 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:53:09.190765 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 17 17:53:09.190771 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 17 17:53:09.190778 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 17 17:53:09.190785 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 17:53:09.190791 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 17 17:53:09.190800 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:53:09.190807 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 17 17:53:09.190815 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 17 17:53:09.190821 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:53:09.190828 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 17 17:53:09.190835 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:53:09.190841 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:53:09.190848 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 17:53:09.190855 systemd[1]: Reached target swap.target - Swaps.
Mar 17 17:53:09.190862 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 17 17:53:09.190869 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 17 17:53:09.190876 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:53:09.190883 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:53:09.190890 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:53:09.190897 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 17 17:53:09.190907 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 17 17:53:09.190918 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 17 17:53:09.190925 systemd[1]: Mounting media.mount - External Media Directory...
Mar 17 17:53:09.190932 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:53:09.190939 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 17 17:53:09.190946 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 17 17:53:09.190954 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 17 17:53:09.190961 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 17 17:53:09.190968 systemd[1]: Reached target machines.target - Containers.
Mar 17 17:53:09.190975 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 17 17:53:09.190982 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Mar 17 17:53:09.190989 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 17 17:53:09.190996 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 17 17:53:09.191002 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:53:09.191011 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:53:09.191018 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:53:09.191024 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 17 17:53:09.191031 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:53:09.191038 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 17 17:53:09.191044 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 17 17:53:09.191051 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 17 17:53:09.191058 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 17 17:53:09.191065 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 17 17:53:09.191073 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 17 17:53:09.191080 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 17 17:53:09.191087 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 17 17:53:09.191094 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 17 17:53:09.191100 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:53:09.191107 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 17 17:53:09.191114 systemd[1]: Stopped verity-setup.service.
Mar 17 17:53:09.191121 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:53:09.191129 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 17 17:53:09.191136 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 17 17:53:09.191142 systemd[1]: Mounted media.mount - External Media Directory.
Mar 17 17:53:09.191149 kernel: fuse: init (API version 7.39)
Mar 17 17:53:09.191156 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 17 17:53:09.191182 systemd-journald[1154]: Collecting audit messages is disabled.
Mar 17 17:53:09.191199 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 17 17:53:09.191206 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 17 17:53:09.191213 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:53:09.191220 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 17 17:53:09.191227 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 17 17:53:09.191234 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:53:09.191240 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:53:09.191249 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:53:09.191255 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:53:09.191262 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 17 17:53:09.191270 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 17 17:53:09.191277 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:53:09.191284 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 17 17:53:09.191292 systemd-journald[1154]: Journal started
Mar 17 17:53:09.191308 systemd-journald[1154]: Runtime Journal (/run/log/journal/54b358d74aea442ab232598c85bc12ec) is 4.8M, max 38.7M, 33.8M free.
Mar 17 17:53:08.966284 systemd[1]: Queued start job for default target multi-user.target.
Mar 17 17:53:09.012670 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 17 17:53:09.012897 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 17 17:53:09.191843 jq[1138]: true
Mar 17 17:53:09.206361 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 17 17:53:09.206401 kernel: ACPI: bus type drm_connector registered
Mar 17 17:53:09.200562 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 17 17:53:09.204500 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:53:09.204660 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:53:09.209122 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 17 17:53:09.214381 jq[1170]: true
Mar 17 17:53:09.217448 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 17 17:53:09.219223 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 17 17:53:09.221053 kernel: loop: module loaded
Mar 17 17:53:09.219365 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 17 17:53:09.219388 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:53:09.220082 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 17 17:53:09.223506 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 17 17:53:09.227419 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 17 17:53:09.227593 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:53:09.246553 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 17 17:53:09.249064 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 17 17:53:09.249214 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:53:09.251833 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 17 17:53:09.255376 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 17 17:53:09.257480 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 17 17:53:09.258689 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 17 17:53:09.258979 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:53:09.259073 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:53:09.261766 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 17 17:53:09.261951 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 17 17:53:09.263383 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 17 17:53:09.265915 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:53:09.267643 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 17 17:53:09.278466 systemd-journald[1154]: Time spent on flushing to /var/log/journal/54b358d74aea442ab232598c85bc12ec is 69.214ms for 1832 entries.
Mar 17 17:53:09.278466 systemd-journald[1154]: System Journal (/var/log/journal/54b358d74aea442ab232598c85bc12ec) is 8.0M, max 584.8M, 576.8M free.
Mar 17 17:53:09.403963 systemd-journald[1154]: Received client request to flush runtime journal.
Mar 17 17:53:09.404016 kernel: loop0: detected capacity change from 0 to 138184
Mar 17 17:53:09.283237 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 17 17:53:09.283811 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 17 17:53:09.286213 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 17 17:53:09.296464 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:53:09.386532 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:53:09.394486 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 17 17:53:09.394779 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 17 17:53:09.396432 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 17 17:53:09.408883 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 17 17:53:09.410810 ignition[1206]: Ignition 2.20.0
Mar 17 17:53:09.411019 ignition[1206]: deleting config from guestinfo properties
Mar 17 17:53:09.470259 ignition[1206]: Successfully deleted config
Mar 17 17:53:09.471695 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 17 17:53:09.472585 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Mar 17 17:53:09.474903 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 17 17:53:09.475934 udevadm[1226]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Mar 17 17:53:09.491235 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 17 17:53:09.493115 systemd-tmpfiles[1227]: ACLs are not supported, ignoring.
Mar 17 17:53:09.493127 systemd-tmpfiles[1227]: ACLs are not supported, ignoring.
Mar 17 17:53:09.498312 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:53:09.517355 kernel: loop1: detected capacity change from 0 to 2944
Mar 17 17:53:09.632367 kernel: loop2: detected capacity change from 0 to 140992
Mar 17 17:53:09.679754 kernel: loop3: detected capacity change from 0 to 210664
Mar 17 17:53:09.787359 kernel: loop4: detected capacity change from 0 to 138184
Mar 17 17:53:09.814489 kernel: loop5: detected capacity change from 0 to 2944
Mar 17 17:53:09.825365 kernel: loop6: detected capacity change from 0 to 140992
Mar 17 17:53:09.884395 kernel: loop7: detected capacity change from 0 to 210664
Mar 17 17:53:09.940144 (sd-merge)[1240]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Mar 17 17:53:09.940698 (sd-merge)[1240]: Merged extensions into '/usr'.
Mar 17 17:53:09.948473 systemd[1]: Reloading requested from client PID 1205 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 17 17:53:09.948560 systemd[1]: Reloading...
Mar 17 17:53:10.005689 zram_generator::config[1265]: No configuration found.
Mar 17 17:53:10.085633 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 17 17:53:10.101880 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:53:10.130907 systemd[1]: Reloading finished in 181 ms.
Mar 17 17:53:10.154460 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 17 17:53:10.166525 systemd[1]: Starting ensure-sysext.service...
Mar 17 17:53:10.170534 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 17 17:53:10.170929 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 17 17:53:10.177525 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:53:10.180466 systemd[1]: Reloading requested from client PID 1321 ('systemctl') (unit ensure-sysext.service)...
Mar 17 17:53:10.180479 systemd[1]: Reloading...
Mar 17 17:53:10.194715 systemd-tmpfiles[1322]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 17 17:53:10.194976 systemd-tmpfiles[1322]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 17 17:53:10.196485 systemd-tmpfiles[1322]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 17 17:53:10.197062 systemd-tmpfiles[1322]: ACLs are not supported, ignoring.
Mar 17 17:53:10.197106 systemd-tmpfiles[1322]: ACLs are not supported, ignoring.
Mar 17 17:53:10.201093 systemd-tmpfiles[1322]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:53:10.201101 systemd-tmpfiles[1322]: Skipping /boot
Mar 17 17:53:10.212072 systemd-tmpfiles[1322]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:53:10.212079 systemd-tmpfiles[1322]: Skipping /boot
Mar 17 17:53:10.227953 systemd-udevd[1324]: Using default interface naming scheme 'v255'.
Mar 17 17:53:10.240357 zram_generator::config[1348]: No configuration found.
Mar 17 17:53:10.305836 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 17 17:53:10.321255 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:53:10.349490 systemd[1]: Reloading finished in 167 ms.
Mar 17 17:53:10.363662 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:53:10.367022 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:53:10.371510 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 17 17:53:10.373712 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 17 17:53:10.377751 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 17 17:53:10.379491 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 17 17:53:10.382973 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:53:10.383739 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:53:10.385976 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:53:10.387812 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:53:10.387968 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:53:10.388036 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:53:10.389212 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:53:10.389299 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:53:10.390510 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 17 17:53:10.390733 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:53:10.393571 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:53:10.393660 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:53:10.393975 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:53:10.395492 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:53:10.395664 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:53:10.395734 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:53:10.395800 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:53:10.396752 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:53:10.396855 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:53:10.399663 systemd[1]: Finished ensure-sysext.service.
Mar 17 17:53:10.408136 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 17 17:53:10.409336 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 17 17:53:10.409721 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:53:10.409864 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:53:10.415835 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:53:10.416159 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:53:10.417104 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:53:10.432142 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:53:10.441241 ldconfig[1200]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 17 17:53:10.442516 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:53:10.449498 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 17 17:53:10.450042 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 17 17:53:10.456532 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 17 17:53:10.471942 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 17 17:53:10.474545 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 17 17:53:10.492573 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 17 17:53:10.493935 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 17 17:53:10.502806 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 17 17:53:10.513926 augenrules[1472]: No rules
Mar 17 17:53:10.518585 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:53:10.518726 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:53:10.561619 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 17 17:53:10.561831 systemd[1]: Reached target time-set.target - System Time Set.
Mar 17 17:53:10.581763 systemd-networkd[1435]: lo: Link UP Mar 17 17:53:10.581767 systemd-networkd[1435]: lo: Gained carrier Mar 17 17:53:10.582498 systemd-networkd[1435]: Enumeration completed Mar 17 17:53:10.582771 systemd-networkd[1435]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Mar 17 17:53:10.583609 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 17 17:53:10.587582 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Mar 17 17:53:10.587735 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Mar 17 17:53:10.588530 systemd-networkd[1435]: ens192: Link UP Mar 17 17:53:10.588666 systemd-networkd[1435]: ens192: Gained carrier Mar 17 17:53:10.590475 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 17 17:53:10.592758 systemd-timesyncd[1427]: Network configuration changed, trying to establish connection. Mar 17 17:53:10.593670 systemd-resolved[1411]: Positive Trust Anchors: Mar 17 17:53:10.593827 systemd-resolved[1411]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:53:10.593883 systemd-resolved[1411]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:53:10.632629 systemd-resolved[1411]: Defaulting to hostname 'linux'. 
Mar 17 17:53:10.635350 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1467) Mar 17 17:53:10.635692 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:53:10.635923 systemd[1]: Reached target network.target - Network. Mar 17 17:53:10.636083 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:53:10.637348 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Mar 17 17:53:10.650380 kernel: ACPI: button: Power Button [PWRF] Mar 17 17:53:10.664352 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Mar 17 17:53:10.689369 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Mar 17 17:53:10.690691 kernel: Guest personality initialized and is active Mar 17 17:53:10.690719 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Mar 17 17:53:10.691554 kernel: Initialized host personality Mar 17 17:53:10.691496 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:53:10.696242 (udev-worker)[1454]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Mar 17 17:53:10.701354 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input4 Mar 17 17:53:10.712354 kernel: mousedev: PS/2 mouse device common for all mice Mar 17 17:53:10.747821 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Mar 17 17:53:10.757521 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 17 17:53:10.793523 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 17 17:53:10.798652 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 17 17:53:10.803538 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... 
Mar 17 17:53:10.831700 lvm[1503]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 17:53:10.865695 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 17 17:53:10.866006 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:53:10.869495 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 17 17:53:10.874361 lvm[1505]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 17:53:10.908812 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 17 17:53:10.993876 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:53:10.994125 systemd[1]: Reached target sysinit.target - System Initialization. Mar 17 17:53:10.994290 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 17 17:53:10.994424 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 17 17:53:10.994631 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 17 17:53:10.994831 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 17 17:53:10.994952 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 17 17:53:10.995073 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 17:53:10.995102 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:53:10.995191 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:53:11.002425 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 17 17:53:11.003413 systemd[1]: Starting docker.socket - Docker Socket for the API... 
Mar 17 17:53:11.026737 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 17 17:53:11.027204 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 17 17:53:11.027359 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:53:11.027449 systemd[1]: Reached target basic.target - Basic System. Mar 17 17:53:11.027558 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 17 17:53:11.027575 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 17 17:53:11.028386 systemd[1]: Starting containerd.service - containerd container runtime... Mar 17 17:53:11.031431 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 17 17:53:11.032891 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 17 17:53:11.034498 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 17 17:53:11.036371 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 17 17:53:11.037073 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 17 17:53:11.039568 jq[1514]: false Mar 17 17:53:11.039510 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 17 17:53:11.041327 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 17 17:53:11.042866 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 17 17:53:11.046278 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 17 17:53:11.046585 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Mar 17 17:53:11.047168 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 17 17:53:11.052976 systemd[1]: Starting update-engine.service - Update Engine... Mar 17 17:53:11.053959 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 17 17:53:11.055108 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Mar 17 17:53:11.057584 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 17:53:11.057711 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 17 17:53:11.058691 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 17:53:11.058791 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 17 17:53:11.084021 extend-filesystems[1515]: Found loop4 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found loop5 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found loop6 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found loop7 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found sda Mar 17 17:53:11.084378 extend-filesystems[1515]: Found sda1 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found sda2 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found sda3 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found usr Mar 17 17:53:11.084378 extend-filesystems[1515]: Found sda4 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found sda6 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found sda7 Mar 17 17:53:11.084378 extend-filesystems[1515]: Found sda9 Mar 17 17:53:11.084378 extend-filesystems[1515]: Checking size of /dev/sda9 Mar 17 17:53:11.097773 update_engine[1522]: I20250317 17:53:11.088433 1522 main.cc:92] Flatcar Update Engine starting Mar 17 17:53:11.085943 systemd[1]: motdgen.service: Deactivated successfully. 
Mar 17 17:53:11.086072 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 17 17:53:11.087186 (ntainerd)[1540]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 17 17:53:11.099356 jq[1524]: true Mar 17 17:53:11.105662 extend-filesystems[1515]: Old size kept for /dev/sda9 Mar 17 17:53:11.105662 extend-filesystems[1515]: Found sr0 Mar 17 17:53:11.106981 dbus-daemon[1513]: [system] SELinux support is enabled Mar 17 17:53:11.108559 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 17 17:53:11.121254 update_engine[1522]: I20250317 17:53:11.111524 1522 update_check_scheduler.cc:74] Next update check in 8m44s Mar 17 17:53:11.110075 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 17:53:11.110980 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 17 17:53:11.118537 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Mar 17 17:53:11.122629 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 17:53:11.122657 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 17 17:53:11.122810 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 17 17:53:11.122825 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 17 17:53:11.131355 tar[1528]: linux-amd64/helm Mar 17 17:53:11.132441 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Mar 17 17:53:11.132814 systemd[1]: Started update-engine.service - Update Engine. 
Mar 17 17:53:11.137454 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 17 17:53:11.139908 jq[1547]: true Mar 17 17:53:11.160427 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Mar 17 17:53:11.178381 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1454) Mar 17 17:53:11.183961 unknown[1550]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Mar 17 17:53:11.187790 unknown[1550]: Core dump limit set to -1 Mar 17 17:53:11.206354 kernel: NET: Registered PF_VSOCK protocol family Mar 17 17:53:11.207156 systemd-logind[1520]: Watching system buttons on /dev/input/event1 (Power Button) Mar 17 17:53:11.207172 systemd-logind[1520]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 17 17:53:11.210811 systemd-logind[1520]: New seat seat0. Mar 17 17:53:11.211467 systemd[1]: Started systemd-logind.service - User Login Management. Mar 17 17:53:11.258522 bash[1576]: Updated "/home/core/.ssh/authorized_keys" Mar 17 17:53:11.260330 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 17 17:53:11.261742 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Mar 17 17:53:11.313415 locksmithd[1556]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 17:53:11.540377 containerd[1540]: time="2025-03-17T17:53:11.537613658Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Mar 17 17:53:11.578108 containerd[1540]: time="2025-03-17T17:53:11.578077650Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:53:11.581682 containerd[1540]: time="2025-03-17T17:53:11.581652872Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." 
error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.83-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:53:11.581765 containerd[1540]: time="2025-03-17T17:53:11.581755542Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 17:53:11.581831 containerd[1540]: time="2025-03-17T17:53:11.581822130Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 17:53:11.581966 containerd[1540]: time="2025-03-17T17:53:11.581956489Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583362915Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583417991Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583430356Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583539853Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583550176Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." 
type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583557329Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583562792Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583610974Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583752754Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583821522Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:53:11.583929 containerd[1540]: time="2025-03-17T17:53:11.583834342Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 17 17:53:11.584115 containerd[1540]: time="2025-03-17T17:53:11.583881809Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 17 17:53:11.584115 containerd[1540]: time="2025-03-17T17:53:11.583909278Z" level=info msg="metadata content store policy set" policy=shared Mar 17 17:53:11.776920 sshd_keygen[1541]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 17:53:11.796302 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 17 17:53:11.807882 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Mar 17 17:53:11.810886 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 17:53:11.811086 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 17 17:53:11.816269 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 17 17:53:11.817410 containerd[1540]: time="2025-03-17T17:53:11.817383872Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 17:53:11.817440 containerd[1540]: time="2025-03-17T17:53:11.817425036Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 17:53:11.817440 containerd[1540]: time="2025-03-17T17:53:11.817436058Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 17 17:53:11.817474 containerd[1540]: time="2025-03-17T17:53:11.817447851Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 17 17:53:11.817474 containerd[1540]: time="2025-03-17T17:53:11.817456229Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 17 17:53:11.817577 containerd[1540]: time="2025-03-17T17:53:11.817557674Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817742710Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817812355Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817822924Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." 
type=io.containerd.sandbox.store.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817832560Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817841024Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817849039Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817855987Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817863749Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817871649Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817879242Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817886856Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817893260Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817904922Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." 
type=io.containerd.grpc.v1 Mar 17 17:53:11.818627 containerd[1540]: time="2025-03-17T17:53:11.817912396Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817919788Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817927023Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817933979Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817944022Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817950974Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817957582Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817966064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817974318Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817980460Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817986646Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." 
type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.817992833Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.818000368Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.818013800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.818022120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.818817 containerd[1540]: time="2025-03-17T17:53:11.818028253Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 17:53:11.819007 containerd[1540]: time="2025-03-17T17:53:11.818052534Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 17:53:11.819007 containerd[1540]: time="2025-03-17T17:53:11.818063541Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 17 17:53:11.819007 containerd[1540]: time="2025-03-17T17:53:11.818069513Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 17 17:53:11.819007 containerd[1540]: time="2025-03-17T17:53:11.818075891Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 17 17:53:11.819007 containerd[1540]: time="2025-03-17T17:53:11.818080689Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." 
type=io.containerd.grpc.v1 Mar 17 17:53:11.819007 containerd[1540]: time="2025-03-17T17:53:11.818087423Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 17 17:53:11.819007 containerd[1540]: time="2025-03-17T17:53:11.818092943Z" level=info msg="NRI interface is disabled by configuration." Mar 17 17:53:11.819007 containerd[1540]: time="2025-03-17T17:53:11.818098969Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Mar 17 17:53:11.819120 containerd[1540]: time="2025-03-17T17:53:11.818260293Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] 
Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 17:53:11.819120 containerd[1540]: time="2025-03-17T17:53:11.818287702Z" level=info msg="Connect containerd service" Mar 17 17:53:11.819120 containerd[1540]: time="2025-03-17T17:53:11.818308764Z" level=info msg="using legacy CRI server" Mar 17 17:53:11.819120 containerd[1540]: time="2025-03-17T17:53:11.818313111Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 17 17:53:11.819120 containerd[1540]: time="2025-03-17T17:53:11.818398158Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 17:53:11.819420 containerd[1540]: time="2025-03-17T17:53:11.819398517Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: 
failed to load cni config" Mar 17 17:53:11.819504 containerd[1540]: time="2025-03-17T17:53:11.819490069Z" level=info msg="Start subscribing containerd event" Mar 17 17:53:11.819547 containerd[1540]: time="2025-03-17T17:53:11.819539956Z" level=info msg="Start recovering state" Mar 17 17:53:11.819602 containerd[1540]: time="2025-03-17T17:53:11.819595065Z" level=info msg="Start event monitor" Mar 17 17:53:11.819637 containerd[1540]: time="2025-03-17T17:53:11.819631780Z" level=info msg="Start snapshots syncer" Mar 17 17:53:11.819667 containerd[1540]: time="2025-03-17T17:53:11.819661480Z" level=info msg="Start cni network conf syncer for default" Mar 17 17:53:11.819698 containerd[1540]: time="2025-03-17T17:53:11.819691996Z" level=info msg="Start streaming server" Mar 17 17:53:11.820034 containerd[1540]: time="2025-03-17T17:53:11.820024812Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 17:53:11.820102 containerd[1540]: time="2025-03-17T17:53:11.820085568Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 17 17:53:11.820195 systemd[1]: Started containerd.service - containerd container runtime. Mar 17 17:53:11.821744 containerd[1540]: time="2025-03-17T17:53:11.820991568Z" level=info msg="containerd successfully booted in 0.284655s" Mar 17 17:53:11.825770 tar[1528]: linux-amd64/LICENSE Mar 17 17:53:11.825806 tar[1528]: linux-amd64/README.md Mar 17 17:53:11.830791 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 17 17:53:11.832939 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 17 17:53:11.834493 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 17 17:53:11.835948 systemd[1]: Reached target getty.target - Login Prompts. Mar 17 17:53:11.836393 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Mar 17 17:53:12.439501 systemd-networkd[1435]: ens192: Gained IPv6LL Mar 17 17:53:12.439917 systemd-timesyncd[1427]: Network configuration changed, trying to establish connection. Mar 17 17:53:12.441529 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 17 17:53:12.442102 systemd[1]: Reached target network-online.target - Network is Online. Mar 17 17:53:12.446505 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Mar 17 17:53:12.456070 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:53:12.459525 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 17 17:53:12.503987 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 17 17:53:12.515228 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 17 17:53:12.515383 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Mar 17 17:53:12.516106 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 17 17:53:14.825333 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:53:14.825884 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 17 17:53:14.828127 systemd[1]: Startup finished in 964ms (kernel) + 5.673s (initrd) + 6.559s (userspace) = 13.197s. Mar 17 17:53:14.831988 (kubelet)[1692]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 17:53:14.864228 login[1610]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 17 17:53:14.865582 login[1611]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 17 17:53:14.872040 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 17 17:53:14.876557 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Mar 17 17:53:14.879058 systemd-logind[1520]: New session 1 of user core.
Mar 17 17:53:14.881910 systemd-logind[1520]: New session 2 of user core.
Mar 17 17:53:14.886282 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 17 17:53:14.890557 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 17 17:53:14.893156 (systemd)[1699]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 17 17:53:14.956774 systemd[1699]: Queued start job for default target default.target.
Mar 17 17:53:14.964106 systemd[1699]: Created slice app.slice - User Application Slice.
Mar 17 17:53:14.964121 systemd[1699]: Reached target paths.target - Paths.
Mar 17 17:53:14.964130 systemd[1699]: Reached target timers.target - Timers.
Mar 17 17:53:14.965206 systemd[1699]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 17 17:53:14.973005 systemd[1699]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 17 17:53:14.973108 systemd[1699]: Reached target sockets.target - Sockets.
Mar 17 17:53:14.973155 systemd[1699]: Reached target basic.target - Basic System.
Mar 17 17:53:14.973229 systemd[1699]: Reached target default.target - Main User Target.
Mar 17 17:53:14.973249 systemd[1699]: Startup finished in 75ms.
Mar 17 17:53:14.973374 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 17 17:53:14.979475 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 17 17:53:14.980261 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 17 17:53:15.538678 kubelet[1692]: E0317 17:53:15.538622    1692 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:53:15.540176 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:53:15.540369 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:53:25.790579 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:53:25.801538 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:53:26.135726 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:53:26.138319 (kubelet)[1742]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:53:26.214761 kubelet[1742]: E0317 17:53:26.214724    1742 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:53:26.216922 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:53:26.217001 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:53:36.467288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 17 17:53:36.473469 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:53:36.765560 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:53:36.768153 (kubelet)[1758]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:53:36.798274 kubelet[1758]: E0317 17:53:36.798238    1758 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:53:36.800142 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:53:36.800226 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:54:59.253398 systemd-resolved[1411]: Clock change detected. Flushing caches.
Mar 17 17:54:59.253728 systemd-timesyncd[1427]: Contacted time server 142.202.190.19:123 (2.flatcar.pool.ntp.org).
Mar 17 17:54:59.253761 systemd-timesyncd[1427]: Initial clock synchronization to Mon 2025-03-17 17:54:59.253367 UTC.
Mar 17 17:55:03.588883 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 17 17:55:03.599745 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:55:03.879920 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:55:03.883621 (kubelet)[1774]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:55:03.906865 kubelet[1774]: E0317 17:55:03.906840    1774 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:55:03.908415 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:55:03.908500 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:55:07.870507 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 17 17:55:07.871433 systemd[1]: Started sshd@0-139.178.70.104:22-147.75.109.163:42200.service - OpenSSH per-connection server daemon (147.75.109.163:42200).
Mar 17 17:55:07.945929 sshd[1784]: Accepted publickey for core from 147.75.109.163 port 42200 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:55:07.947057 sshd-session[1784]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:55:07.950973 systemd-logind[1520]: New session 3 of user core.
Mar 17 17:55:07.959733 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 17 17:55:08.021689 systemd[1]: Started sshd@1-139.178.70.104:22-147.75.109.163:42204.service - OpenSSH per-connection server daemon (147.75.109.163:42204).
Mar 17 17:55:08.053912 sshd[1789]: Accepted publickey for core from 147.75.109.163 port 42204 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:55:08.054831 sshd-session[1789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:55:08.057779 systemd-logind[1520]: New session 4 of user core.
Mar 17 17:55:08.063699 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 17 17:55:08.112067 sshd[1791]: Connection closed by 147.75.109.163 port 42204
Mar 17 17:55:08.112406 sshd-session[1789]: pam_unix(sshd:session): session closed for user core
Mar 17 17:55:08.121627 systemd[1]: sshd@1-139.178.70.104:22-147.75.109.163:42204.service: Deactivated successfully.
Mar 17 17:55:08.122591 systemd[1]: session-4.scope: Deactivated successfully.
Mar 17 17:55:08.123332 systemd-logind[1520]: Session 4 logged out. Waiting for processes to exit.
Mar 17 17:55:08.127892 systemd[1]: Started sshd@2-139.178.70.104:22-147.75.109.163:42214.service - OpenSSH per-connection server daemon (147.75.109.163:42214).
Mar 17 17:55:08.129719 systemd-logind[1520]: Removed session 4.
Mar 17 17:55:08.158507 sshd[1796]: Accepted publickey for core from 147.75.109.163 port 42214 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:55:08.159222 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:55:08.162619 systemd-logind[1520]: New session 5 of user core.
Mar 17 17:55:08.167704 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 17 17:55:08.215058 sshd[1798]: Connection closed by 147.75.109.163 port 42214
Mar 17 17:55:08.214980 sshd-session[1796]: pam_unix(sshd:session): session closed for user core
Mar 17 17:55:08.223974 systemd[1]: sshd@2-139.178.70.104:22-147.75.109.163:42214.service: Deactivated successfully.
Mar 17 17:55:08.225184 systemd[1]: session-5.scope: Deactivated successfully.
Mar 17 17:55:08.225695 systemd-logind[1520]: Session 5 logged out. Waiting for processes to exit.
Mar 17 17:55:08.227058 systemd[1]: Started sshd@3-139.178.70.104:22-147.75.109.163:42216.service - OpenSSH per-connection server daemon (147.75.109.163:42216).
Mar 17 17:55:08.228904 systemd-logind[1520]: Removed session 5.
Mar 17 17:55:08.263296 sshd[1803]: Accepted publickey for core from 147.75.109.163 port 42216 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:55:08.263970 sshd-session[1803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:55:08.266199 systemd-logind[1520]: New session 6 of user core.
Mar 17 17:55:08.272736 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 17 17:55:08.321003 sshd[1805]: Connection closed by 147.75.109.163 port 42216
Mar 17 17:55:08.321282 sshd-session[1803]: pam_unix(sshd:session): session closed for user core
Mar 17 17:55:08.334563 systemd[1]: sshd@3-139.178.70.104:22-147.75.109.163:42216.service: Deactivated successfully.
Mar 17 17:55:08.335412 systemd[1]: session-6.scope: Deactivated successfully.
Mar 17 17:55:08.336183 systemd-logind[1520]: Session 6 logged out. Waiting for processes to exit.
Mar 17 17:55:08.339692 systemd[1]: Started sshd@4-139.178.70.104:22-147.75.109.163:42218.service - OpenSSH per-connection server daemon (147.75.109.163:42218).
Mar 17 17:55:08.341738 systemd-logind[1520]: Removed session 6.
Mar 17 17:55:08.370534 sshd[1810]: Accepted publickey for core from 147.75.109.163 port 42218 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:55:08.371240 sshd-session[1810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:55:08.374221 systemd-logind[1520]: New session 7 of user core.
Mar 17 17:55:08.377652 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 17 17:55:08.436600 sudo[1813]:     core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 17 17:55:08.437042 sudo[1813]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:55:08.447174 sudo[1813]: pam_unix(sudo:session): session closed for user root
Mar 17 17:55:08.448038 sshd[1812]: Connection closed by 147.75.109.163 port 42218
Mar 17 17:55:08.448491 sshd-session[1810]: pam_unix(sshd:session): session closed for user core
Mar 17 17:55:08.456026 systemd[1]: sshd@4-139.178.70.104:22-147.75.109.163:42218.service: Deactivated successfully.
Mar 17 17:55:08.456871 systemd[1]: session-7.scope: Deactivated successfully.
Mar 17 17:55:08.457260 systemd-logind[1520]: Session 7 logged out. Waiting for processes to exit.
Mar 17 17:55:08.458429 systemd[1]: Started sshd@5-139.178.70.104:22-147.75.109.163:42232.service - OpenSSH per-connection server daemon (147.75.109.163:42232).
Mar 17 17:55:08.459889 systemd-logind[1520]: Removed session 7.
Mar 17 17:55:08.498506 sshd[1818]: Accepted publickey for core from 147.75.109.163 port 42232 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:55:08.499355 sshd-session[1818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:55:08.502192 systemd-logind[1520]: New session 8 of user core.
Mar 17 17:55:08.513750 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 17 17:55:08.562155 sudo[1822]:     core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 17 17:55:08.562315 sudo[1822]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:55:08.564133 sudo[1822]: pam_unix(sudo:session): session closed for user root
Mar 17 17:55:08.566938 sudo[1821]:     core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 17 17:55:08.567085 sudo[1821]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:55:08.574743 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:55:08.589820 augenrules[1844]: No rules
Mar 17 17:55:08.590109 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:55:08.590234 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:55:08.591161 sudo[1821]: pam_unix(sudo:session): session closed for user root
Mar 17 17:55:08.591815 sshd[1820]: Connection closed by 147.75.109.163 port 42232
Mar 17 17:55:08.591977 sshd-session[1818]: pam_unix(sshd:session): session closed for user core
Mar 17 17:55:08.594811 systemd[1]: sshd@5-139.178.70.104:22-147.75.109.163:42232.service: Deactivated successfully.
Mar 17 17:55:08.595555 systemd[1]: session-8.scope: Deactivated successfully.
Mar 17 17:55:08.596273 systemd-logind[1520]: Session 8 logged out. Waiting for processes to exit.
Mar 17 17:55:08.596888 systemd[1]: Started sshd@6-139.178.70.104:22-147.75.109.163:42242.service - OpenSSH per-connection server daemon (147.75.109.163:42242).
Mar 17 17:55:08.597879 systemd-logind[1520]: Removed session 8.
Mar 17 17:55:08.639748 sshd[1852]: Accepted publickey for core from 147.75.109.163 port 42242 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:55:08.640793 sshd-session[1852]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:55:08.642970 systemd-logind[1520]: New session 9 of user core.
Mar 17 17:55:08.653656 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 17 17:55:08.704262 sudo[1855]:     core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 17 17:55:08.704772 sudo[1855]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:55:09.000704 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 17 17:55:09.000786 (dockerd)[1872]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 17 17:55:09.245005 dockerd[1872]: time="2025-03-17T17:55:09.244971367Z" level=info msg="Starting up"
Mar 17 17:55:09.337164 dockerd[1872]: time="2025-03-17T17:55:09.337112494Z" level=info msg="Loading containers: start."
Mar 17 17:55:09.456568 kernel: Initializing XFRM netlink socket
Mar 17 17:55:09.520171 systemd-networkd[1435]: docker0: Link UP
Mar 17 17:55:09.540402 dockerd[1872]: time="2025-03-17T17:55:09.540373470Z" level=info msg="Loading containers: done."
Mar 17 17:55:09.549573 dockerd[1872]: time="2025-03-17T17:55:09.549540904Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 17 17:55:09.549646 dockerd[1872]: time="2025-03-17T17:55:09.549604839Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1
Mar 17 17:55:09.549664 dockerd[1872]: time="2025-03-17T17:55:09.549659622Z" level=info msg="Daemon has completed initialization"
Mar 17 17:55:09.565829 dockerd[1872]: time="2025-03-17T17:55:09.565805984Z" level=info msg="API listen on /run/docker.sock"
Mar 17 17:55:09.565863 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 17 17:55:10.452969 containerd[1540]: time="2025-03-17T17:55:10.452753950Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\""
Mar 17 17:55:11.062501 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3864520817.mount: Deactivated successfully.
Mar 17 17:55:12.274685 containerd[1540]: time="2025-03-17T17:55:12.274639179Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:12.277170 containerd[1540]: time="2025-03-17T17:55:12.277136668Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=32674573"
Mar 17 17:55:12.279468 containerd[1540]: time="2025-03-17T17:55:12.279436214Z" level=info msg="ImageCreate event name:\"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:12.283772 containerd[1540]: time="2025-03-17T17:55:12.283738400Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:12.284616 containerd[1540]: time="2025-03-17T17:55:12.284465696Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"32671373\" in 1.83168288s"
Mar 17 17:55:12.284616 containerd[1540]: time="2025-03-17T17:55:12.284490516Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\""
Mar 17 17:55:12.301655 containerd[1540]: time="2025-03-17T17:55:12.301616076Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\""
Mar 17 17:55:13.172745 update_engine[1522]: I20250317 17:55:13.172688  1522 update_attempter.cc:509] Updating boot flags...
Mar 17 17:55:13.216589 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2132)
Mar 17 17:55:13.320552 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2136)
Mar 17 17:55:13.790115 containerd[1540]: time="2025-03-17T17:55:13.790033240Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:13.790642 containerd[1540]: time="2025-03-17T17:55:13.790617178Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=29619772"
Mar 17 17:55:13.791103 containerd[1540]: time="2025-03-17T17:55:13.790906691Z" level=info msg="ImageCreate event name:\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:13.792492 containerd[1540]: time="2025-03-17T17:55:13.792463456Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:13.793163 containerd[1540]: time="2025-03-17T17:55:13.793096520Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"31107380\" in 1.491447809s"
Mar 17 17:55:13.793163 containerd[1540]: time="2025-03-17T17:55:13.793113431Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\""
Mar 17 17:55:13.805957 containerd[1540]: time="2025-03-17T17:55:13.805938234Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\""
Mar 17 17:55:14.123862 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 17 17:55:14.133664 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:55:14.195855 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:55:14.198250 (kubelet)[2158]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:55:14.269358 kubelet[2158]: E0317 17:55:14.269317    2158 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:55:14.271024 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:55:14.271146 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:55:15.373752 containerd[1540]: time="2025-03-17T17:55:15.373724597Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:15.374588 containerd[1540]: time="2025-03-17T17:55:15.374532776Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=17903309"
Mar 17 17:55:15.374948 containerd[1540]: time="2025-03-17T17:55:15.374932844Z" level=info msg="ImageCreate event name:\"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:15.376341 containerd[1540]: time="2025-03-17T17:55:15.376318658Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:15.376995 containerd[1540]: time="2025-03-17T17:55:15.376922633Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"19390935\" in 1.570866714s"
Mar 17 17:55:15.376995 containerd[1540]: time="2025-03-17T17:55:15.376940173Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\""
Mar 17 17:55:15.390181 containerd[1540]: time="2025-03-17T17:55:15.390096657Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\""
Mar 17 17:55:16.358631 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1376290387.mount: Deactivated successfully.
Mar 17 17:55:17.184955 containerd[1540]: time="2025-03-17T17:55:17.184750769Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:17.185237 containerd[1540]: time="2025-03-17T17:55:17.185170977Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=29185372"
Mar 17 17:55:17.185485 containerd[1540]: time="2025-03-17T17:55:17.185462357Z" level=info msg="ImageCreate event name:\"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:17.186594 containerd[1540]: time="2025-03-17T17:55:17.186579595Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:17.187078 containerd[1540]: time="2025-03-17T17:55:17.187062165Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"29184391\" in 1.796942463s"
Mar 17 17:55:17.187190 containerd[1540]: time="2025-03-17T17:55:17.187125189Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\""
Mar 17 17:55:17.201678 containerd[1540]: time="2025-03-17T17:55:17.201649390Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 17 17:55:17.764529 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2411413704.mount: Deactivated successfully.
Mar 17 17:55:18.467910 containerd[1540]: time="2025-03-17T17:55:18.467863304Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:18.475626 containerd[1540]: time="2025-03-17T17:55:18.475579342Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761"
Mar 17 17:55:18.484190 containerd[1540]: time="2025-03-17T17:55:18.484140101Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:18.493884 containerd[1540]: time="2025-03-17T17:55:18.493853808Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:18.494458 containerd[1540]: time="2025-03-17T17:55:18.494444211Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.292769681s"
Mar 17 17:55:18.494618 containerd[1540]: time="2025-03-17T17:55:18.494502059Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Mar 17 17:55:18.508938 containerd[1540]: time="2025-03-17T17:55:18.508889647Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Mar 17 17:55:19.022268 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4238287721.mount: Deactivated successfully.
Mar 17 17:55:19.023819 containerd[1540]: time="2025-03-17T17:55:19.023797671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:19.024341 containerd[1540]: time="2025-03-17T17:55:19.024317616Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290"
Mar 17 17:55:19.025015 containerd[1540]: time="2025-03-17T17:55:19.024836191Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:19.026088 containerd[1540]: time="2025-03-17T17:55:19.026075429Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:19.026417 containerd[1540]: time="2025-03-17T17:55:19.026401872Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 517.388483ms"
Mar 17 17:55:19.026452 containerd[1540]: time="2025-03-17T17:55:19.026417002Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Mar 17 17:55:19.041258 containerd[1540]: time="2025-03-17T17:55:19.041237323Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Mar 17 17:55:19.564945 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2196831627.mount: Deactivated successfully.
Mar 17 17:55:23.642698 containerd[1540]: time="2025-03-17T17:55:23.642661112Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:23.643358 containerd[1540]: time="2025-03-17T17:55:23.643039933Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238571"
Mar 17 17:55:23.643787 containerd[1540]: time="2025-03-17T17:55:23.643765921Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:23.645475 containerd[1540]: time="2025-03-17T17:55:23.645459394Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:23.646449 containerd[1540]: time="2025-03-17T17:55:23.646193547Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 4.604842861s"
Mar 17 17:55:23.646449 containerd[1540]: time="2025-03-17T17:55:23.646211639Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\""
Mar 17 17:55:24.317941 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Mar 17 17:55:24.325690 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:55:24.511726 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:55:24.514696 (kubelet)[2354]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:55:24.578087 kubelet[2354]: E0317 17:55:24.577900    2354 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:55:24.579476 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:55:24.579562 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:55:25.132673 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:55:25.140722 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:55:25.154755 systemd[1]: Reloading requested from client PID 2368 ('systemctl') (unit session-9.scope)...
Mar 17 17:55:25.154774 systemd[1]: Reloading...
Mar 17 17:55:25.213585 zram_generator::config[2405]: No configuration found.
Mar 17 17:55:25.266056 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 17 17:55:25.281035 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:55:25.324285 systemd[1]: Reloading finished in 169 ms.
Mar 17 17:55:25.362605 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 17 17:55:25.362673 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 17 17:55:25.362885 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:55:25.368754 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:55:25.651186 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:55:25.658867 (kubelet)[2473]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:55:25.721785 kubelet[2473]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:55:25.721785 kubelet[2473]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 17:55:25.721785 kubelet[2473]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 17 17:55:25.728332 kubelet[2473]: I0317 17:55:25.728003 2473 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 17:55:26.082585 kubelet[2473]: I0317 17:55:26.082263 2473 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 17:55:26.082585 kubelet[2473]: I0317 17:55:26.082282 2473 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 17:55:26.082585 kubelet[2473]: I0317 17:55:26.082429 2473 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 17:55:26.103593 kubelet[2473]: I0317 17:55:26.103575 2473 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:55:26.107234 kubelet[2473]: E0317 17:55:26.107146 2473 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.104:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.117930 kubelet[2473]: I0317 17:55:26.117905 2473 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 17:55:26.121158 kubelet[2473]: I0317 17:55:26.121120 2473 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 17:55:26.122502 kubelet[2473]: I0317 17:55:26.121161 2473 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 17:55:26.122973 kubelet[2473]: I0317 17:55:26.122957 2473 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 
17:55:26.123003 kubelet[2473]: I0317 17:55:26.122977 2473 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 17:55:26.123096 kubelet[2473]: I0317 17:55:26.123085 2473 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:55:26.123775 kubelet[2473]: I0317 17:55:26.123765 2473 kubelet.go:400] "Attempting to sync node with API server" Mar 17 17:55:26.123801 kubelet[2473]: I0317 17:55:26.123779 2473 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:55:26.124418 kubelet[2473]: I0317 17:55:26.124404 2473 kubelet.go:312] "Adding apiserver pod source" Mar 17 17:55:26.125466 kubelet[2473]: I0317 17:55:26.125319 2473 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:55:26.125873 kubelet[2473]: W0317 17:55:26.125836 2473 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.125911 kubelet[2473]: E0317 17:55:26.125877 2473 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.128580 kubelet[2473]: W0317 17:55:26.128292 2473 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.128580 kubelet[2473]: E0317 17:55:26.128328 2473 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 
139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.132068 kubelet[2473]: I0317 17:55:26.132048 2473 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:55:26.133545 kubelet[2473]: I0317 17:55:26.133419 2473 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 17:55:26.135530 kubelet[2473]: W0317 17:55:26.135150 2473 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 17 17:55:26.137394 kubelet[2473]: I0317 17:55:26.137372 2473 server.go:1264] "Started kubelet" Mar 17 17:55:26.143152 kubelet[2473]: E0317 17:55:26.143077 2473 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.104:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.104:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182da8b4fbfab1d4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-17 17:55:26.137340372 +0000 UTC m=+0.476183843,LastTimestamp:2025-03-17 17:55:26.137340372 +0000 UTC m=+0.476183843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 17 17:55:26.143255 kubelet[2473]: I0317 17:55:26.143198 2473 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:55:26.143541 kubelet[2473]: I0317 17:55:26.143384 2473 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 17:55:26.143541 kubelet[2473]: I0317 17:55:26.143410 2473 
server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:55:26.143840 kubelet[2473]: I0317 17:55:26.143826 2473 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:55:26.145800 kubelet[2473]: I0317 17:55:26.145539 2473 server.go:455] "Adding debug handlers to kubelet server" Mar 17 17:55:26.147920 kubelet[2473]: I0317 17:55:26.147628 2473 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 17:55:26.147920 kubelet[2473]: I0317 17:55:26.147683 2473 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 17:55:26.147920 kubelet[2473]: I0317 17:55:26.147717 2473 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:55:26.147920 kubelet[2473]: W0317 17:55:26.147893 2473 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.147920 kubelet[2473]: E0317 17:55:26.147918 2473 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.149241 kubelet[2473]: E0317 17:55:26.149216 2473 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="200ms" Mar 17 17:55:26.150106 kubelet[2473]: I0317 17:55:26.150093 2473 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:55:26.150138 kubelet[2473]: I0317 17:55:26.150129 2473 factory.go:219] Registration of the crio container factory failed: Get 
"http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:55:26.150328 kubelet[2473]: E0317 17:55:26.150315 2473 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:55:26.150750 kubelet[2473]: I0317 17:55:26.150739 2473 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:55:26.158078 kubelet[2473]: I0317 17:55:26.158052 2473 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:55:26.158659 kubelet[2473]: I0317 17:55:26.158646 2473 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 17:55:26.158659 kubelet[2473]: I0317 17:55:26.158660 2473 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:55:26.158711 kubelet[2473]: I0317 17:55:26.158673 2473 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 17:55:26.158711 kubelet[2473]: E0317 17:55:26.158695 2473 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:55:26.162426 kubelet[2473]: W0317 17:55:26.162397 2473 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.162460 kubelet[2473]: E0317 17:55:26.162430 2473 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:26.172480 kubelet[2473]: I0317 17:55:26.172433 2473 cpu_manager.go:214] "Starting CPU 
manager" policy="none" Mar 17 17:55:26.172480 kubelet[2473]: I0317 17:55:26.172442 2473 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:55:26.172480 kubelet[2473]: I0317 17:55:26.172452 2473 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:55:26.173344 kubelet[2473]: I0317 17:55:26.173329 2473 policy_none.go:49] "None policy: Start" Mar 17 17:55:26.173611 kubelet[2473]: I0317 17:55:26.173601 2473 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:55:26.173611 kubelet[2473]: I0317 17:55:26.173612 2473 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:55:26.176791 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 17 17:55:26.184993 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 17 17:55:26.187532 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 17 17:55:26.194005 kubelet[2473]: I0317 17:55:26.193983 2473 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:55:26.194188 kubelet[2473]: I0317 17:55:26.194090 2473 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:55:26.194188 kubelet[2473]: I0317 17:55:26.194155 2473 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:55:26.195466 kubelet[2473]: E0317 17:55:26.195429 2473 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 17 17:55:26.248859 kubelet[2473]: I0317 17:55:26.248838 2473 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:55:26.249154 kubelet[2473]: E0317 17:55:26.249135 2473 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 
139.178.70.104:6443: connect: connection refused" node="localhost" Mar 17 17:55:26.259382 kubelet[2473]: I0317 17:55:26.259337 2473 topology_manager.go:215] "Topology Admit Handler" podUID="0ef19627ca24862cf024fdd8913ca0e9" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 17 17:55:26.260239 kubelet[2473]: I0317 17:55:26.260213 2473 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 17 17:55:26.261489 kubelet[2473]: I0317 17:55:26.261397 2473 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 17 17:55:26.265743 systemd[1]: Created slice kubepods-burstable-pod0ef19627ca24862cf024fdd8913ca0e9.slice - libcontainer container kubepods-burstable-pod0ef19627ca24862cf024fdd8913ca0e9.slice. Mar 17 17:55:26.275690 systemd[1]: Created slice kubepods-burstable-pod23a18e2dc14f395c5f1bea711a5a9344.slice - libcontainer container kubepods-burstable-pod23a18e2dc14f395c5f1bea711a5a9344.slice. Mar 17 17:55:26.287854 systemd[1]: Created slice kubepods-burstable-podd79ab404294384d4bcc36fb5b5509bbb.slice - libcontainer container kubepods-burstable-podd79ab404294384d4bcc36fb5b5509bbb.slice. 
Mar 17 17:55:26.349773 kubelet[2473]: E0317 17:55:26.349695 2473 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="400ms" Mar 17 17:55:26.353276 kubelet[2473]: I0317 17:55:26.353238 2473 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0ef19627ca24862cf024fdd8913ca0e9-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0ef19627ca24862cf024fdd8913ca0e9\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:55:26.451097 kubelet[2473]: I0317 17:55:26.451072 2473 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:55:26.451369 kubelet[2473]: E0317 17:55:26.451349 2473 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Mar 17 17:55:26.453637 kubelet[2473]: I0317 17:55:26.453620 2473 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:26.453719 kubelet[2473]: I0317 17:55:26.453641 2473 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:26.453719 kubelet[2473]: I0317 17:55:26.453656 2473 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:26.453719 kubelet[2473]: I0317 17:55:26.453668 2473 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 17 17:55:26.453719 kubelet[2473]: I0317 17:55:26.453704 2473 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0ef19627ca24862cf024fdd8913ca0e9-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0ef19627ca24862cf024fdd8913ca0e9\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:55:26.453719 kubelet[2473]: I0317 17:55:26.453718 2473 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:26.453840 kubelet[2473]: I0317 17:55:26.453730 2473 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:26.453840 kubelet[2473]: 
I0317 17:55:26.453743 2473 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0ef19627ca24862cf024fdd8913ca0e9-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0ef19627ca24862cf024fdd8913ca0e9\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:55:26.574059 containerd[1540]: time="2025-03-17T17:55:26.573956974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0ef19627ca24862cf024fdd8913ca0e9,Namespace:kube-system,Attempt:0,}" Mar 17 17:55:26.586808 containerd[1540]: time="2025-03-17T17:55:26.586766464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,}" Mar 17 17:55:26.590307 containerd[1540]: time="2025-03-17T17:55:26.590280494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,}" Mar 17 17:55:26.750265 kubelet[2473]: E0317 17:55:26.750216 2473 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="800ms" Mar 17 17:55:26.852609 kubelet[2473]: I0317 17:55:26.852587 2473 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:55:26.852861 kubelet[2473]: E0317 17:55:26.852838 2473 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Mar 17 17:55:27.109806 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3734373624.mount: Deactivated successfully. 
Mar 17 17:55:27.111573 containerd[1540]: time="2025-03-17T17:55:27.111441318Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:55:27.112170 containerd[1540]: time="2025-03-17T17:55:27.112144252Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Mar 17 17:55:27.112657 containerd[1540]: time="2025-03-17T17:55:27.112641651Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:55:27.114593 containerd[1540]: time="2025-03-17T17:55:27.114576698Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:55:27.115041 containerd[1540]: time="2025-03-17T17:55:27.114934081Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:55:27.115079 containerd[1540]: time="2025-03-17T17:55:27.115051604Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 541.035129ms" Mar 17 17:55:27.116044 containerd[1540]: time="2025-03-17T17:55:27.115483971Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:55:27.116044 containerd[1540]: time="2025-03-17T17:55:27.115799849Z" level=info msg="ImageUpdate event 
name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:55:27.116358 containerd[1540]: time="2025-03-17T17:55:27.116344463Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:55:27.118364 containerd[1540]: time="2025-03-17T17:55:27.118209640Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 527.886605ms" Mar 17 17:55:27.118986 containerd[1540]: time="2025-03-17T17:55:27.118967229Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 532.137658ms" Mar 17 17:55:27.227270 containerd[1540]: time="2025-03-17T17:55:27.226207952Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:55:27.227270 containerd[1540]: time="2025-03-17T17:55:27.226260667Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:55:27.227270 containerd[1540]: time="2025-03-17T17:55:27.226273055Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:27.227270 containerd[1540]: time="2025-03-17T17:55:27.226355490Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:27.229831 containerd[1540]: time="2025-03-17T17:55:27.225451247Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:55:27.229831 containerd[1540]: time="2025-03-17T17:55:27.228932986Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:55:27.229831 containerd[1540]: time="2025-03-17T17:55:27.228942792Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:27.229831 containerd[1540]: time="2025-03-17T17:55:27.228981586Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:27.233602 containerd[1540]: time="2025-03-17T17:55:27.233547019Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:55:27.233810 containerd[1540]: time="2025-03-17T17:55:27.233786444Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:55:27.233849 containerd[1540]: time="2025-03-17T17:55:27.233821552Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:27.234016 containerd[1540]: time="2025-03-17T17:55:27.233980558Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:27.246685 systemd[1]: Started cri-containerd-15f6a92de503ebd979ea1000532551345b43aeb4a3062ff75c4422bcc7d191d6.scope - libcontainer container 15f6a92de503ebd979ea1000532551345b43aeb4a3062ff75c4422bcc7d191d6. Mar 17 17:55:27.249928 systemd[1]: Started cri-containerd-42ef184166b87bf7ca0989807d0dff748d3532c3dd33c880b334ad92335b3cbf.scope - libcontainer container 42ef184166b87bf7ca0989807d0dff748d3532c3dd33c880b334ad92335b3cbf. Mar 17 17:55:27.252902 systemd[1]: Started cri-containerd-ca94ea953a4946e440933fcace59d915644eef8b0fc6149ed44074f74932a360.scope - libcontainer container ca94ea953a4946e440933fcace59d915644eef8b0fc6149ed44074f74932a360. Mar 17 17:55:27.292266 containerd[1540]: time="2025-03-17T17:55:27.292245493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0ef19627ca24862cf024fdd8913ca0e9,Namespace:kube-system,Attempt:0,} returns sandbox id \"15f6a92de503ebd979ea1000532551345b43aeb4a3062ff75c4422bcc7d191d6\"" Mar 17 17:55:27.297046 containerd[1540]: time="2025-03-17T17:55:27.296456059Z" level=info msg="CreateContainer within sandbox \"15f6a92de503ebd979ea1000532551345b43aeb4a3062ff75c4422bcc7d191d6\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 17:55:27.302996 containerd[1540]: time="2025-03-17T17:55:27.302971857Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,} returns sandbox id \"ca94ea953a4946e440933fcace59d915644eef8b0fc6149ed44074f74932a360\"" Mar 17 17:55:27.309588 containerd[1540]: time="2025-03-17T17:55:27.309464046Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,} returns sandbox id \"42ef184166b87bf7ca0989807d0dff748d3532c3dd33c880b334ad92335b3cbf\"" Mar 17 17:55:27.313972 containerd[1540]: 
time="2025-03-17T17:55:27.313955702Z" level=info msg="CreateContainer within sandbox \"ca94ea953a4946e440933fcace59d915644eef8b0fc6149ed44074f74932a360\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 17:55:27.314144 containerd[1540]: time="2025-03-17T17:55:27.314084237Z" level=info msg="CreateContainer within sandbox \"42ef184166b87bf7ca0989807d0dff748d3532c3dd33c880b334ad92335b3cbf\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 17:55:27.318698 containerd[1540]: time="2025-03-17T17:55:27.318649385Z" level=info msg="CreateContainer within sandbox \"15f6a92de503ebd979ea1000532551345b43aeb4a3062ff75c4422bcc7d191d6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fdc1b5dd04f5f46223b7dec58bfa5a32d492780e789c82095a1a9f17c0a3f908\"" Mar 17 17:55:27.319002 containerd[1540]: time="2025-03-17T17:55:27.318989167Z" level=info msg="StartContainer for \"fdc1b5dd04f5f46223b7dec58bfa5a32d492780e789c82095a1a9f17c0a3f908\"" Mar 17 17:55:27.324585 containerd[1540]: time="2025-03-17T17:55:27.324527371Z" level=info msg="CreateContainer within sandbox \"ca94ea953a4946e440933fcace59d915644eef8b0fc6149ed44074f74932a360\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"3c4002c5911597b827324618ad0f68e68d5826872eba89344d7d70701b29c0f8\"" Mar 17 17:55:27.325302 containerd[1540]: time="2025-03-17T17:55:27.325209143Z" level=info msg="StartContainer for \"3c4002c5911597b827324618ad0f68e68d5826872eba89344d7d70701b29c0f8\"" Mar 17 17:55:27.326514 containerd[1540]: time="2025-03-17T17:55:27.326496710Z" level=info msg="CreateContainer within sandbox \"42ef184166b87bf7ca0989807d0dff748d3532c3dd33c880b334ad92335b3cbf\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"dba40fc9fee30e1707a216347721d0aabbb83a96782c64bddb0e138fae80e4fb\"" Mar 17 17:55:27.326912 containerd[1540]: time="2025-03-17T17:55:27.326860755Z" level=info msg="StartContainer 
for \"dba40fc9fee30e1707a216347721d0aabbb83a96782c64bddb0e138fae80e4fb\"" Mar 17 17:55:27.348698 systemd[1]: Started cri-containerd-3c4002c5911597b827324618ad0f68e68d5826872eba89344d7d70701b29c0f8.scope - libcontainer container 3c4002c5911597b827324618ad0f68e68d5826872eba89344d7d70701b29c0f8. Mar 17 17:55:27.349570 systemd[1]: Started cri-containerd-fdc1b5dd04f5f46223b7dec58bfa5a32d492780e789c82095a1a9f17c0a3f908.scope - libcontainer container fdc1b5dd04f5f46223b7dec58bfa5a32d492780e789c82095a1a9f17c0a3f908. Mar 17 17:55:27.353142 systemd[1]: Started cri-containerd-dba40fc9fee30e1707a216347721d0aabbb83a96782c64bddb0e138fae80e4fb.scope - libcontainer container dba40fc9fee30e1707a216347721d0aabbb83a96782c64bddb0e138fae80e4fb. Mar 17 17:55:27.364371 kubelet[2473]: W0317 17:55:27.363376 2473 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:27.364371 kubelet[2473]: E0317 17:55:27.363752 2473 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:27.367053 kubelet[2473]: W0317 17:55:27.367023 2473 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:27.367099 kubelet[2473]: E0317 17:55:27.367059 2473 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
"https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:27.387193 containerd[1540]: time="2025-03-17T17:55:27.387086666Z" level=info msg="StartContainer for \"3c4002c5911597b827324618ad0f68e68d5826872eba89344d7d70701b29c0f8\" returns successfully" Mar 17 17:55:27.389846 kubelet[2473]: W0317 17:55:27.389633 2473 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:27.389846 kubelet[2473]: E0317 17:55:27.389671 2473 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:27.399481 containerd[1540]: time="2025-03-17T17:55:27.399453680Z" level=info msg="StartContainer for \"fdc1b5dd04f5f46223b7dec58bfa5a32d492780e789c82095a1a9f17c0a3f908\" returns successfully" Mar 17 17:55:27.413570 containerd[1540]: time="2025-03-17T17:55:27.413495462Z" level=info msg="StartContainer for \"dba40fc9fee30e1707a216347721d0aabbb83a96782c64bddb0e138fae80e4fb\" returns successfully" Mar 17 17:55:27.550665 kubelet[2473]: E0317 17:55:27.550621 2473 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="1.6s" Mar 17 17:55:27.592137 kubelet[2473]: W0317 17:55:27.592085 2473 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 
139.178.70.104:6443: connect: connection refused Mar 17 17:55:27.592137 kubelet[2473]: E0317 17:55:27.592124 2473 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Mar 17 17:55:27.654381 kubelet[2473]: I0317 17:55:27.654368 2473 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:55:27.654693 kubelet[2473]: E0317 17:55:27.654679 2473 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Mar 17 17:55:29.107341 kubelet[2473]: E0317 17:55:29.107316 2473 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Mar 17 17:55:29.152670 kubelet[2473]: E0317 17:55:29.152622 2473 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 17 17:55:29.256582 kubelet[2473]: I0317 17:55:29.256433 2473 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:55:29.268525 kubelet[2473]: I0317 17:55:29.268367 2473 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 17 17:55:29.274259 kubelet[2473]: E0317 17:55:29.274241 2473 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:55:30.128835 kubelet[2473]: I0317 17:55:30.128746 2473 apiserver.go:52] "Watching apiserver" Mar 17 17:55:30.147957 kubelet[2473]: I0317 17:55:30.147921 2473 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 17:55:30.714822 systemd[1]: Reloading requested from 
client PID 2745 ('systemctl') (unit session-9.scope)... Mar 17 17:55:30.714833 systemd[1]: Reloading... Mar 17 17:55:30.773602 zram_generator::config[2790]: No configuration found. Mar 17 17:55:30.831356 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Mar 17 17:55:30.846322 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:55:30.896205 systemd[1]: Reloading finished in 181 ms. Mar 17 17:55:30.917037 kubelet[2473]: I0317 17:55:30.916991 2473 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:55:30.917160 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:55:30.931265 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 17:55:30.931408 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:55:30.936827 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:55:31.135644 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:55:31.144848 (kubelet)[2850]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:55:31.233291 kubelet[2850]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:55:31.233498 kubelet[2850]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Mar 17 17:55:31.233524 kubelet[2850]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:55:31.233634 kubelet[2850]: I0317 17:55:31.233613 2850 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 17:55:31.236050 kubelet[2850]: I0317 17:55:31.236035 2850 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 17:55:31.236050 kubelet[2850]: I0317 17:55:31.236048 2850 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 17:55:31.236160 kubelet[2850]: I0317 17:55:31.236149 2850 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 17:55:31.236912 kubelet[2850]: I0317 17:55:31.236901 2850 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 17:55:31.237580 kubelet[2850]: I0317 17:55:31.237535 2850 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:55:31.255396 kubelet[2850]: I0317 17:55:31.255381 2850 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 17:55:31.256458 kubelet[2850]: I0317 17:55:31.255719 2850 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 17:55:31.256458 kubelet[2850]: I0317 17:55:31.255737 2850 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 17:55:31.256458 kubelet[2850]: I0317 17:55:31.255840 2850 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 
17:55:31.256458 kubelet[2850]: I0317 17:55:31.255847 2850 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 17:55:31.256458 kubelet[2850]: I0317 17:55:31.255873 2850 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:55:31.256641 kubelet[2850]: I0317 17:55:31.255931 2850 kubelet.go:400] "Attempting to sync node with API server" Mar 17 17:55:31.256641 kubelet[2850]: I0317 17:55:31.255939 2850 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:55:31.256641 kubelet[2850]: I0317 17:55:31.255951 2850 kubelet.go:312] "Adding apiserver pod source" Mar 17 17:55:31.256641 kubelet[2850]: I0317 17:55:31.255962 2850 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:55:31.277081 kubelet[2850]: I0317 17:55:31.277058 2850 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:55:31.277194 kubelet[2850]: I0317 17:55:31.277183 2850 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 17:55:31.277728 kubelet[2850]: I0317 17:55:31.277417 2850 server.go:1264] "Started kubelet" Mar 17 17:55:31.279607 kubelet[2850]: I0317 17:55:31.277910 2850 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:55:31.279607 kubelet[2850]: I0317 17:55:31.278065 2850 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 17:55:31.279607 kubelet[2850]: I0317 17:55:31.278083 2850 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:55:31.279607 kubelet[2850]: I0317 17:55:31.278691 2850 server.go:455] "Adding debug handlers to kubelet server" Mar 17 17:55:31.279607 kubelet[2850]: I0317 17:55:31.279357 2850 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:55:31.282674 kubelet[2850]: E0317 17:55:31.282662 2850 
kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:55:31.282858 kubelet[2850]: I0317 17:55:31.282852 2850 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 17:55:31.282977 kubelet[2850]: I0317 17:55:31.282972 2850 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 17:55:31.283076 kubelet[2850]: I0317 17:55:31.283071 2850 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:55:31.283494 kubelet[2850]: I0317 17:55:31.283485 2850 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:55:31.283632 kubelet[2850]: I0317 17:55:31.283622 2850 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:55:31.284447 kubelet[2850]: I0317 17:55:31.284440 2850 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:55:31.310636 kubelet[2850]: I0317 17:55:31.310616 2850 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:55:31.311458 kubelet[2850]: I0317 17:55:31.311440 2850 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 17:55:31.311542 kubelet[2850]: I0317 17:55:31.311536 2850 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:55:31.311622 kubelet[2850]: I0317 17:55:31.311608 2850 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 17:55:31.311697 kubelet[2850]: E0317 17:55:31.311678 2850 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:55:31.334059 kubelet[2850]: I0317 17:55:31.334043 2850 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 17:55:31.334059 kubelet[2850]: I0317 17:55:31.334053 2850 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:55:31.334168 kubelet[2850]: I0317 17:55:31.334074 2850 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:55:31.334187 kubelet[2850]: I0317 17:55:31.334177 2850 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 17:55:31.334203 kubelet[2850]: I0317 17:55:31.334183 2850 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 17:55:31.334203 kubelet[2850]: I0317 17:55:31.334194 2850 policy_none.go:49] "None policy: Start" Mar 17 17:55:31.334536 kubelet[2850]: I0317 17:55:31.334528 2850 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:55:31.334620 kubelet[2850]: I0317 17:55:31.334615 2850 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:55:31.334764 kubelet[2850]: I0317 17:55:31.334758 2850 state_mem.go:75] "Updated machine memory state" Mar 17 17:55:31.337297 kubelet[2850]: I0317 17:55:31.337287 2850 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:55:31.337680 kubelet[2850]: I0317 17:55:31.337419 2850 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:55:31.339049 kubelet[2850]: I0317 17:55:31.339042 2850 
plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:55:31.412457 kubelet[2850]: I0317 17:55:31.412336 2850 topology_manager.go:215] "Topology Admit Handler" podUID="0ef19627ca24862cf024fdd8913ca0e9" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 17 17:55:31.412457 kubelet[2850]: I0317 17:55:31.412412 2850 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 17 17:55:31.413443 kubelet[2850]: I0317 17:55:31.413010 2850 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 17 17:55:31.420157 kubelet[2850]: E0317 17:55:31.420102 2850 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:31.441086 kubelet[2850]: I0317 17:55:31.441070 2850 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:55:31.444458 kubelet[2850]: I0317 17:55:31.444001 2850 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Mar 17 17:55:31.444458 kubelet[2850]: I0317 17:55:31.444047 2850 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 17 17:55:31.484363 kubelet[2850]: I0317 17:55:31.484216 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:31.484363 kubelet[2850]: I0317 17:55:31.484239 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:31.484363 kubelet[2850]: I0317 17:55:31.484252 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:31.484363 kubelet[2850]: I0317 17:55:31.484262 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:31.484363 kubelet[2850]: I0317 17:55:31.484272 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0ef19627ca24862cf024fdd8913ca0e9-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0ef19627ca24862cf024fdd8913ca0e9\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:55:31.484535 kubelet[2850]: I0317 17:55:31.484283 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0ef19627ca24862cf024fdd8913ca0e9-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0ef19627ca24862cf024fdd8913ca0e9\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:55:31.484535 kubelet[2850]: I0317 17:55:31.484297 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" 
(UniqueName: \"kubernetes.io/host-path/0ef19627ca24862cf024fdd8913ca0e9-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0ef19627ca24862cf024fdd8913ca0e9\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:55:31.484535 kubelet[2850]: I0317 17:55:31.484307 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:55:31.484535 kubelet[2850]: I0317 17:55:31.484316 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 17 17:55:32.256630 kubelet[2850]: I0317 17:55:32.256604 2850 apiserver.go:52] "Watching apiserver" Mar 17 17:55:32.283540 kubelet[2850]: I0317 17:55:32.283516 2850 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 17:55:32.575305 kubelet[2850]: E0317 17:55:32.574881 2850 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 17 17:55:32.575598 kubelet[2850]: I0317 17:55:32.575438 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.575417168 podStartE2EDuration="1.575417168s" podCreationTimestamp="2025-03-17 17:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:55:32.574791237 +0000 UTC m=+1.415930504" watchObservedRunningTime="2025-03-17 
17:55:32.575417168 +0000 UTC m=+1.416556428" Mar 17 17:55:32.594428 kubelet[2850]: I0317 17:55:32.594363 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.594349923 podStartE2EDuration="1.594349923s" podCreationTimestamp="2025-03-17 17:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:55:32.593912334 +0000 UTC m=+1.435051594" watchObservedRunningTime="2025-03-17 17:55:32.594349923 +0000 UTC m=+1.435489188" Mar 17 17:55:32.636196 kubelet[2850]: I0317 17:55:32.636058 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.6360465140000002 podStartE2EDuration="3.636046514s" podCreationTimestamp="2025-03-17 17:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:55:32.622136575 +0000 UTC m=+1.463275842" watchObservedRunningTime="2025-03-17 17:55:32.636046514 +0000 UTC m=+1.477185792" Mar 17 17:55:35.910873 sudo[1855]: pam_unix(sudo:session): session closed for user root Mar 17 17:55:35.912319 sshd[1854]: Connection closed by 147.75.109.163 port 42242 Mar 17 17:55:35.912980 sshd-session[1852]: pam_unix(sshd:session): session closed for user core Mar 17 17:55:35.915278 systemd[1]: sshd@6-139.178.70.104:22-147.75.109.163:42242.service: Deactivated successfully. Mar 17 17:55:35.916486 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 17:55:35.916663 systemd[1]: session-9.scope: Consumed 2.541s CPU time, 185.2M memory peak, 0B memory swap peak. Mar 17 17:55:35.917116 systemd-logind[1520]: Session 9 logged out. Waiting for processes to exit. Mar 17 17:55:35.917906 systemd-logind[1520]: Removed session 9. 
Mar 17 17:55:46.009900 kubelet[2850]: I0317 17:55:46.009868 2850 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 17:55:46.010396 containerd[1540]: time="2025-03-17T17:55:46.010372566Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 17:55:46.011015 kubelet[2850]: I0317 17:55:46.010650 2850 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 17:55:46.718286 kubelet[2850]: I0317 17:55:46.718246 2850 topology_manager.go:215] "Topology Admit Handler" podUID="9df8a4c6-2d66-4d9e-babe-64212d7c1f63" podNamespace="kube-system" podName="kube-proxy-ssc2t" Mar 17 17:55:46.727446 systemd[1]: Created slice kubepods-besteffort-pod9df8a4c6_2d66_4d9e_babe_64212d7c1f63.slice - libcontainer container kubepods-besteffort-pod9df8a4c6_2d66_4d9e_babe_64212d7c1f63.slice. Mar 17 17:55:46.775972 kubelet[2850]: I0317 17:55:46.775938 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9df8a4c6-2d66-4d9e-babe-64212d7c1f63-lib-modules\") pod \"kube-proxy-ssc2t\" (UID: \"9df8a4c6-2d66-4d9e-babe-64212d7c1f63\") " pod="kube-system/kube-proxy-ssc2t" Mar 17 17:55:46.775972 kubelet[2850]: I0317 17:55:46.775966 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9df8a4c6-2d66-4d9e-babe-64212d7c1f63-xtables-lock\") pod \"kube-proxy-ssc2t\" (UID: \"9df8a4c6-2d66-4d9e-babe-64212d7c1f63\") " pod="kube-system/kube-proxy-ssc2t" Mar 17 17:55:46.776126 kubelet[2850]: I0317 17:55:46.775977 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchqs\" (UniqueName: \"kubernetes.io/projected/9df8a4c6-2d66-4d9e-babe-64212d7c1f63-kube-api-access-jchqs\") pod \"kube-proxy-ssc2t\" 
(UID: \"9df8a4c6-2d66-4d9e-babe-64212d7c1f63\") " pod="kube-system/kube-proxy-ssc2t" Mar 17 17:55:46.776126 kubelet[2850]: I0317 17:55:46.775995 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/9df8a4c6-2d66-4d9e-babe-64212d7c1f63-kube-proxy\") pod \"kube-proxy-ssc2t\" (UID: \"9df8a4c6-2d66-4d9e-babe-64212d7c1f63\") " pod="kube-system/kube-proxy-ssc2t" Mar 17 17:55:47.044036 containerd[1540]: time="2025-03-17T17:55:47.043907731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ssc2t,Uid:9df8a4c6-2d66-4d9e-babe-64212d7c1f63,Namespace:kube-system,Attempt:0,}" Mar 17 17:55:47.065758 kubelet[2850]: I0317 17:55:47.065523 2850 topology_manager.go:215] "Topology Admit Handler" podUID="b2a0081a-82c0-422d-8b6d-365e0b76d36a" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-ll5wt" Mar 17 17:55:47.071449 systemd[1]: Created slice kubepods-besteffort-podb2a0081a_82c0_422d_8b6d_365e0b76d36a.slice - libcontainer container kubepods-besteffort-podb2a0081a_82c0_422d_8b6d_365e0b76d36a.slice. 
Mar 17 17:55:47.078114 kubelet[2850]: I0317 17:55:47.078058 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw2jw\" (UniqueName: \"kubernetes.io/projected/b2a0081a-82c0-422d-8b6d-365e0b76d36a-kube-api-access-lw2jw\") pod \"tigera-operator-6479d6dc54-ll5wt\" (UID: \"b2a0081a-82c0-422d-8b6d-365e0b76d36a\") " pod="tigera-operator/tigera-operator-6479d6dc54-ll5wt" Mar 17 17:55:47.078114 kubelet[2850]: I0317 17:55:47.078084 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b2a0081a-82c0-422d-8b6d-365e0b76d36a-var-lib-calico\") pod \"tigera-operator-6479d6dc54-ll5wt\" (UID: \"b2a0081a-82c0-422d-8b6d-365e0b76d36a\") " pod="tigera-operator/tigera-operator-6479d6dc54-ll5wt" Mar 17 17:55:47.230171 containerd[1540]: time="2025-03-17T17:55:47.230014506Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:55:47.230171 containerd[1540]: time="2025-03-17T17:55:47.230051141Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:55:47.230171 containerd[1540]: time="2025-03-17T17:55:47.230058891Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:47.230171 containerd[1540]: time="2025-03-17T17:55:47.230114828Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:47.247656 systemd[1]: Started cri-containerd-fc9534a8caccf70a802bfc93ebd5501b8963cf37ebd0de7bf27487d2a0f58b93.scope - libcontainer container fc9534a8caccf70a802bfc93ebd5501b8963cf37ebd0de7bf27487d2a0f58b93. 
Mar 17 17:55:47.260035 containerd[1540]: time="2025-03-17T17:55:47.260008150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ssc2t,Uid:9df8a4c6-2d66-4d9e-babe-64212d7c1f63,Namespace:kube-system,Attempt:0,} returns sandbox id \"fc9534a8caccf70a802bfc93ebd5501b8963cf37ebd0de7bf27487d2a0f58b93\"" Mar 17 17:55:47.262058 containerd[1540]: time="2025-03-17T17:55:47.262038553Z" level=info msg="CreateContainer within sandbox \"fc9534a8caccf70a802bfc93ebd5501b8963cf37ebd0de7bf27487d2a0f58b93\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 17:55:47.376059 containerd[1540]: time="2025-03-17T17:55:47.376030859Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-ll5wt,Uid:b2a0081a-82c0-422d-8b6d-365e0b76d36a,Namespace:tigera-operator,Attempt:0,}" Mar 17 17:55:47.484151 containerd[1540]: time="2025-03-17T17:55:47.483688546Z" level=info msg="CreateContainer within sandbox \"fc9534a8caccf70a802bfc93ebd5501b8963cf37ebd0de7bf27487d2a0f58b93\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5a4de8ebefe6ad498af36c4f8eb61ff633542adb4f525fffd04a36c7c8eb9709\"" Mar 17 17:55:47.484417 containerd[1540]: time="2025-03-17T17:55:47.484394445Z" level=info msg="StartContainer for \"5a4de8ebefe6ad498af36c4f8eb61ff633542adb4f525fffd04a36c7c8eb9709\"" Mar 17 17:55:47.490622 containerd[1540]: time="2025-03-17T17:55:47.488769267Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:55:47.490622 containerd[1540]: time="2025-03-17T17:55:47.488850099Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:55:47.490622 containerd[1540]: time="2025-03-17T17:55:47.488865124Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:47.490622 containerd[1540]: time="2025-03-17T17:55:47.488950979Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:47.504689 systemd[1]: Started cri-containerd-af11619c2a1bf7fba6cb62778b816572756c54a7680f66e618247ef32dc34e90.scope - libcontainer container af11619c2a1bf7fba6cb62778b816572756c54a7680f66e618247ef32dc34e90. Mar 17 17:55:47.507519 systemd[1]: Started cri-containerd-5a4de8ebefe6ad498af36c4f8eb61ff633542adb4f525fffd04a36c7c8eb9709.scope - libcontainer container 5a4de8ebefe6ad498af36c4f8eb61ff633542adb4f525fffd04a36c7c8eb9709. Mar 17 17:55:47.542047 containerd[1540]: time="2025-03-17T17:55:47.542021485Z" level=info msg="StartContainer for \"5a4de8ebefe6ad498af36c4f8eb61ff633542adb4f525fffd04a36c7c8eb9709\" returns successfully" Mar 17 17:55:47.542136 containerd[1540]: time="2025-03-17T17:55:47.542072264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-ll5wt,Uid:b2a0081a-82c0-422d-8b6d-365e0b76d36a,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"af11619c2a1bf7fba6cb62778b816572756c54a7680f66e618247ef32dc34e90\"" Mar 17 17:55:47.543394 containerd[1540]: time="2025-03-17T17:55:47.543097675Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 17 17:55:48.354914 kubelet[2850]: I0317 17:55:48.354880 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-ssc2t" podStartSLOduration=2.354869884 podStartE2EDuration="2.354869884s" podCreationTimestamp="2025-03-17 17:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:55:48.354754651 +0000 UTC m=+17.195893920" watchObservedRunningTime="2025-03-17 17:55:48.354869884 +0000 UTC m=+17.196009154" Mar 17 17:55:49.927033 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3267728057.mount: Deactivated successfully.
Mar 17 17:55:51.005580 containerd[1540]: time="2025-03-17T17:55:51.005438498Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:51.006034 containerd[1540]: time="2025-03-17T17:55:51.005963227Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008"
Mar 17 17:55:51.006751 containerd[1540]: time="2025-03-17T17:55:51.006312371Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:51.034664 containerd[1540]: time="2025-03-17T17:55:51.034619046Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:55:51.035401 containerd[1540]: time="2025-03-17T17:55:51.035110536Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 3.491993925s"
Mar 17 17:55:51.035401 containerd[1540]: time="2025-03-17T17:55:51.035130185Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\""
Mar 17 17:55:51.045094 containerd[1540]: time="2025-03-17T17:55:51.045003567Z" level=info msg="CreateContainer within sandbox \"af11619c2a1bf7fba6cb62778b816572756c54a7680f66e618247ef32dc34e90\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 17 17:55:51.063108 containerd[1540]: time="2025-03-17T17:55:51.063076689Z" level=info msg="CreateContainer within sandbox \"af11619c2a1bf7fba6cb62778b816572756c54a7680f66e618247ef32dc34e90\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b6d7dc1276f70a21e4a05232f857dc6ab9da74bf51531227890cf0b1d544e52f\""
Mar 17 17:55:51.063524 containerd[1540]: time="2025-03-17T17:55:51.063461842Z" level=info msg="StartContainer for \"b6d7dc1276f70a21e4a05232f857dc6ab9da74bf51531227890cf0b1d544e52f\""
Mar 17 17:55:51.083655 systemd[1]: Started cri-containerd-b6d7dc1276f70a21e4a05232f857dc6ab9da74bf51531227890cf0b1d544e52f.scope - libcontainer container b6d7dc1276f70a21e4a05232f857dc6ab9da74bf51531227890cf0b1d544e52f.
Mar 17 17:55:51.103319 containerd[1540]: time="2025-03-17T17:55:51.103267752Z" level=info msg="StartContainer for \"b6d7dc1276f70a21e4a05232f857dc6ab9da74bf51531227890cf0b1d544e52f\" returns successfully"
Mar 17 17:55:53.908541 kubelet[2850]: I0317 17:55:53.908192 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-ll5wt" podStartSLOduration=3.41327243 podStartE2EDuration="6.9081789s" podCreationTimestamp="2025-03-17 17:55:47 +0000 UTC" firstStartedPulling="2025-03-17 17:55:47.542905879 +0000 UTC m=+16.384045137" lastFinishedPulling="2025-03-17 17:55:51.037812348 +0000 UTC m=+19.878951607" observedRunningTime="2025-03-17 17:55:51.364374865 +0000 UTC m=+20.205514132" watchObservedRunningTime="2025-03-17 17:55:53.9081789 +0000 UTC m=+22.749318163"
Mar 17 17:55:53.909584 kubelet[2850]: I0317 17:55:53.909383 2850 topology_manager.go:215] "Topology Admit Handler" podUID="6b6ca053-adf6-44c0-bdfd-693c6b378420" podNamespace="calico-system" podName="calico-typha-54db56c7d7-2vgs8"
Mar 17 17:55:53.922584 systemd[1]: Created slice kubepods-besteffort-pod6b6ca053_adf6_44c0_bdfd_693c6b378420.slice - libcontainer container kubepods-besteffort-pod6b6ca053_adf6_44c0_bdfd_693c6b378420.slice.
Mar 17 17:55:53.925128 kubelet[2850]: I0317 17:55:53.924218 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b6ca053-adf6-44c0-bdfd-693c6b378420-tigera-ca-bundle\") pod \"calico-typha-54db56c7d7-2vgs8\" (UID: \"6b6ca053-adf6-44c0-bdfd-693c6b378420\") " pod="calico-system/calico-typha-54db56c7d7-2vgs8"
Mar 17 17:55:53.925128 kubelet[2850]: I0317 17:55:53.924271 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rxlv\" (UniqueName: \"kubernetes.io/projected/6b6ca053-adf6-44c0-bdfd-693c6b378420-kube-api-access-7rxlv\") pod \"calico-typha-54db56c7d7-2vgs8\" (UID: \"6b6ca053-adf6-44c0-bdfd-693c6b378420\") " pod="calico-system/calico-typha-54db56c7d7-2vgs8"
Mar 17 17:55:53.925128 kubelet[2850]: I0317 17:55:53.924285 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6b6ca053-adf6-44c0-bdfd-693c6b378420-typha-certs\") pod \"calico-typha-54db56c7d7-2vgs8\" (UID: \"6b6ca053-adf6-44c0-bdfd-693c6b378420\") " pod="calico-system/calico-typha-54db56c7d7-2vgs8"
Mar 17 17:55:53.958295 kubelet[2850]: I0317 17:55:53.958262 2850 topology_manager.go:215] "Topology Admit Handler" podUID="9c6a8252-f745-4277-ab3b-8e3e7085b489" podNamespace="calico-system" podName="calico-node-7zjsl"
Mar 17 17:55:53.965095 systemd[1]: Created slice kubepods-besteffort-pod9c6a8252_f745_4277_ab3b_8e3e7085b489.slice - libcontainer container kubepods-besteffort-pod9c6a8252_f745_4277_ab3b_8e3e7085b489.slice.
Mar 17 17:55:54.025247 kubelet[2850]: I0317 17:55:54.024812 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-lib-modules\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.025247 kubelet[2850]: I0317 17:55:54.024839 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6a8252-f745-4277-ab3b-8e3e7085b489-tigera-ca-bundle\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.025247 kubelet[2850]: I0317 17:55:54.024855 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-log-dir\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.025247 kubelet[2850]: I0317 17:55:54.024872 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsld9\" (UniqueName: \"kubernetes.io/projected/9c6a8252-f745-4277-ab3b-8e3e7085b489-kube-api-access-hsld9\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.025247 kubelet[2850]: I0317 17:55:54.024886 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-policysync\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.026766 kubelet[2850]: I0317 17:55:54.024896 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-var-run-calico\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.026766 kubelet[2850]: I0317 17:55:54.024906 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-net-dir\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.026766 kubelet[2850]: I0317 17:55:54.024917 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-flexvol-driver-host\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.026766 kubelet[2850]: I0317 17:55:54.024928 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9c6a8252-f745-4277-ab3b-8e3e7085b489-node-certs\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.026766 kubelet[2850]: I0317 17:55:54.024936 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-var-lib-calico\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.026851 kubelet[2850]: I0317 17:55:54.024945 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-bin-dir\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.026851 kubelet[2850]: I0317 17:55:54.024961 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-xtables-lock\") pod \"calico-node-7zjsl\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " pod="calico-system/calico-node-7zjsl"
Mar 17 17:55:54.066986 kubelet[2850]: I0317 17:55:54.066954 2850 topology_manager.go:215] "Topology Admit Handler" podUID="0d288c4c-94be-4e72-9025-53893bb68385" podNamespace="calico-system" podName="csi-node-driver-2cntr"
Mar 17 17:55:54.067269 kubelet[2850]: E0317 17:55:54.067151 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385"
Mar 17 17:55:54.125622 kubelet[2850]: I0317 17:55:54.125599 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0d288c4c-94be-4e72-9025-53893bb68385-varrun\") pod \"csi-node-driver-2cntr\" (UID: \"0d288c4c-94be-4e72-9025-53893bb68385\") " pod="calico-system/csi-node-driver-2cntr"
Mar 17 17:55:54.125710 kubelet[2850]: I0317 17:55:54.125632 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d288c4c-94be-4e72-9025-53893bb68385-registration-dir\") pod \"csi-node-driver-2cntr\" (UID: \"0d288c4c-94be-4e72-9025-53893bb68385\") " pod="calico-system/csi-node-driver-2cntr"
Mar 17 17:55:54.125710 kubelet[2850]: I0317 17:55:54.125654 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d288c4c-94be-4e72-9025-53893bb68385-kubelet-dir\") pod \"csi-node-driver-2cntr\" (UID: \"0d288c4c-94be-4e72-9025-53893bb68385\") " pod="calico-system/csi-node-driver-2cntr"
Mar 17 17:55:54.125710 kubelet[2850]: I0317 17:55:54.125664 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbwd7\" (UniqueName: \"kubernetes.io/projected/0d288c4c-94be-4e72-9025-53893bb68385-kube-api-access-mbwd7\") pod \"csi-node-driver-2cntr\" (UID: \"0d288c4c-94be-4e72-9025-53893bb68385\") " pod="calico-system/csi-node-driver-2cntr"
Mar 17 17:55:54.125710 kubelet[2850]: I0317 17:55:54.125709 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d288c4c-94be-4e72-9025-53893bb68385-socket-dir\") pod \"csi-node-driver-2cntr\" (UID: \"0d288c4c-94be-4e72-9025-53893bb68385\") " pod="calico-system/csi-node-driver-2cntr"
Mar 17 17:55:54.126266 kubelet[2850]: E0317 17:55:54.126252 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.126266 kubelet[2850]: W0317 17:55:54.126262 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.126575 kubelet[2850]: E0317 17:55:54.126495 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.126575 kubelet[2850]: W0317 17:55:54.126503 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.127614 kubelet[2850]: E0317 17:55:54.126899 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.127614 kubelet[2850]: E0317 17:55:54.127082 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.127614 kubelet[2850]: W0317 17:55:54.127088 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.127614 kubelet[2850]: E0317 17:55:54.127094 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.127614 kubelet[2850]: E0317 17:55:54.127186 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.127614 kubelet[2850]: W0317 17:55:54.127191 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.127614 kubelet[2850]: E0317 17:55:54.127196 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.127614 kubelet[2850]: E0317 17:55:54.127584 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.127614 kubelet[2850]: W0317 17:55:54.127590 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.127614 kubelet[2850]: E0317 17:55:54.127595 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.129565 kubelet[2850]: E0317 17:55:54.127856 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.129565 kubelet[2850]: W0317 17:55:54.127863 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.129565 kubelet[2850]: E0317 17:55:54.127869 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.129565 kubelet[2850]: E0317 17:55:54.128309 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.130450 kubelet[2850]: E0317 17:55:54.129719 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.130450 kubelet[2850]: W0317 17:55:54.129730 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.130450 kubelet[2850]: E0317 17:55:54.129743 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.132026 kubelet[2850]: E0317 17:55:54.131909 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.132026 kubelet[2850]: W0317 17:55:54.131916 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.132026 kubelet[2850]: E0317 17:55:54.131924 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.132272 kubelet[2850]: E0317 17:55:54.132239 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.132272 kubelet[2850]: W0317 17:55:54.132247 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.132353 kubelet[2850]: E0317 17:55:54.132300 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.132471 kubelet[2850]: E0317 17:55:54.132466 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.132530 kubelet[2850]: W0317 17:55:54.132500 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.132611 kubelet[2850]: E0317 17:55:54.132596 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.132780 kubelet[2850]: E0317 17:55:54.132721 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.132780 kubelet[2850]: W0317 17:55:54.132727 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.132849 kubelet[2850]: E0317 17:55:54.132841 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.132927 kubelet[2850]: E0317 17:55:54.132899 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.132927 kubelet[2850]: W0317 17:55:54.132904 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.133018 kubelet[2850]: E0317 17:55:54.132976 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.133078 kubelet[2850]: E0317 17:55:54.133073 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.133110 kubelet[2850]: W0317 17:55:54.133105 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.133248 kubelet[2850]: E0317 17:55:54.133211 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.133248 kubelet[2850]: E0317 17:55:54.133238 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.133248 kubelet[2850]: W0317 17:55:54.133242 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.133456 kubelet[2850]: E0317 17:55:54.133382 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.133456 kubelet[2850]: E0317 17:55:54.133413 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.133456 kubelet[2850]: W0317 17:55:54.133417 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.133456 kubelet[2850]: E0317 17:55:54.133424 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.133663 kubelet[2850]: E0317 17:55:54.133608 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.133663 kubelet[2850]: W0317 17:55:54.133614 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.133663 kubelet[2850]: E0317 17:55:54.133625 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.133830 kubelet[2850]: E0317 17:55:54.133815 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.133830 kubelet[2850]: W0317 17:55:54.133820 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.133920 kubelet[2850]: E0317 17:55:54.133875 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.133984 kubelet[2850]: E0317 17:55:54.133979 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.134061 kubelet[2850]: W0317 17:55:54.134011 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.134061 kubelet[2850]: E0317 17:55:54.134022 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.134207 kubelet[2850]: E0317 17:55:54.134193 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.134207 kubelet[2850]: W0317 17:55:54.134198 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.134277 kubelet[2850]: E0317 17:55:54.134246 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.134424 kubelet[2850]: E0317 17:55:54.134379 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.134424 kubelet[2850]: W0317 17:55:54.134384 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.134424 kubelet[2850]: E0317 17:55:54.134391 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.134596 kubelet[2850]: E0317 17:55:54.134517 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.134596 kubelet[2850]: W0317 17:55:54.134523 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.134596 kubelet[2850]: E0317 17:55:54.134528 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.134744 kubelet[2850]: E0317 17:55:54.134739 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.134801 kubelet[2850]: W0317 17:55:54.134773 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.135286 kubelet[2850]: E0317 17:55:54.134843 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.135391 kubelet[2850]: E0317 17:55:54.135385 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.135428 kubelet[2850]: W0317 17:55:54.135423 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.135516 kubelet[2850]: E0317 17:55:54.135510 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.135918 kubelet[2850]: E0317 17:55:54.135729 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.135964 kubelet[2850]: W0317 17:55:54.135957 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.136023 kubelet[2850]: E0317 17:55:54.135996 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.136646 kubelet[2850]: E0317 17:55:54.136186 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.136646 kubelet[2850]: W0317 17:55:54.136192 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.136646 kubelet[2850]: E0317 17:55:54.136198 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.138484 kubelet[2850]: E0317 17:55:54.138475 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.138533 kubelet[2850]: W0317 17:55:54.138526 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.138633 kubelet[2850]: E0317 17:55:54.138625 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.142343 kubelet[2850]: E0317 17:55:54.142331 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.142743 kubelet[2850]: W0317 17:55:54.142484 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.142743 kubelet[2850]: E0317 17:55:54.142498 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.227352 kubelet[2850]: E0317 17:55:54.227144 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.227352 kubelet[2850]: W0317 17:55:54.227158 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.227352 kubelet[2850]: E0317 17:55:54.227171 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.227642 kubelet[2850]: E0317 17:55:54.227503 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.227642 kubelet[2850]: W0317 17:55:54.227509 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.227642 kubelet[2850]: E0317 17:55:54.227520 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.229756 kubelet[2850]: E0317 17:55:54.229667 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.229756 kubelet[2850]: W0317 17:55:54.229689 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.229756 kubelet[2850]: E0317 17:55:54.229743 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.229937 kubelet[2850]: E0317 17:55:54.229924 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.229937 kubelet[2850]: W0317 17:55:54.229932 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.229990 kubelet[2850]: E0317 17:55:54.229980 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.230119 kubelet[2850]: E0317 17:55:54.230108 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.230119 kubelet[2850]: W0317 17:55:54.230116 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.230219 kubelet[2850]: E0317 17:55:54.230208 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.230308 kubelet[2850]: E0317 17:55:54.230297 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.230308 kubelet[2850]: W0317 17:55:54.230304 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.230356 kubelet[2850]: E0317 17:55:54.230338 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.230519 kubelet[2850]: E0317 17:55:54.230508 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.230543 kubelet[2850]: W0317 17:55:54.230516 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.230543 kubelet[2850]: E0317 17:55:54.230531 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.230659 kubelet[2850]: E0317 17:55:54.230649 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.230659 kubelet[2850]: W0317 17:55:54.230657 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.230698 kubelet[2850]: E0317 17:55:54.230664 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:55:54.230843 kubelet[2850]: E0317 17:55:54.230831 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:55:54.230843 kubelet[2850]: W0317 17:55:54.230839 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:55:54.230883 kubelet[2850]: E0317 17:55:54.230846 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 17 17:55:54.231170 kubelet[2850]: E0317 17:55:54.231158 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.231170 kubelet[2850]: W0317 17:55:54.231168 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.231224 kubelet[2850]: E0317 17:55:54.231174 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.231315 kubelet[2850]: E0317 17:55:54.231304 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.231315 kubelet[2850]: W0317 17:55:54.231312 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.231354 kubelet[2850]: E0317 17:55:54.231324 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:54.231514 kubelet[2850]: E0317 17:55:54.231502 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.231514 kubelet[2850]: W0317 17:55:54.231510 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.231604 kubelet[2850]: E0317 17:55:54.231577 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.231629 kubelet[2850]: E0317 17:55:54.231616 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.231629 kubelet[2850]: W0317 17:55:54.231621 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.231703 kubelet[2850]: E0317 17:55:54.231693 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:54.231730 kubelet[2850]: E0317 17:55:54.231718 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.231730 kubelet[2850]: W0317 17:55:54.231722 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.231800 kubelet[2850]: E0317 17:55:54.231789 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.231913 kubelet[2850]: E0317 17:55:54.231900 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.231913 kubelet[2850]: W0317 17:55:54.231907 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.232033 kubelet[2850]: E0317 17:55:54.231981 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:54.232061 kubelet[2850]: E0317 17:55:54.232045 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.232061 kubelet[2850]: W0317 17:55:54.232050 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.232061 kubelet[2850]: E0317 17:55:54.232056 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.232221 kubelet[2850]: E0317 17:55:54.232207 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.232221 kubelet[2850]: W0317 17:55:54.232214 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.232275 kubelet[2850]: E0317 17:55:54.232265 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:54.232384 kubelet[2850]: E0317 17:55:54.232374 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.232405 kubelet[2850]: W0317 17:55:54.232392 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.232405 kubelet[2850]: E0317 17:55:54.232400 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.232535 kubelet[2850]: E0317 17:55:54.232497 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.232535 kubelet[2850]: W0317 17:55:54.232502 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.232535 kubelet[2850]: E0317 17:55:54.232510 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:54.232635 kubelet[2850]: E0317 17:55:54.232626 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.232635 kubelet[2850]: W0317 17:55:54.232632 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.232677 kubelet[2850]: E0317 17:55:54.232643 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.232785 kubelet[2850]: E0317 17:55:54.232735 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.232785 kubelet[2850]: W0317 17:55:54.232739 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.232785 kubelet[2850]: E0317 17:55:54.232745 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:54.232922 kubelet[2850]: E0317 17:55:54.232911 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.232922 kubelet[2850]: W0317 17:55:54.232918 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.232966 kubelet[2850]: E0317 17:55:54.232923 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.233070 kubelet[2850]: E0317 17:55:54.233012 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.233070 kubelet[2850]: W0317 17:55:54.233018 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.233070 kubelet[2850]: E0317 17:55:54.233023 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:54.233126 kubelet[2850]: E0317 17:55:54.233118 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.233126 kubelet[2850]: W0317 17:55:54.233122 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.233157 kubelet[2850]: E0317 17:55:54.233127 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.234692 kubelet[2850]: E0317 17:55:54.234679 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.234692 kubelet[2850]: W0317 17:55:54.234688 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.234748 kubelet[2850]: E0317 17:55:54.234695 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:54.235234 containerd[1540]: time="2025-03-17T17:55:54.235212617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54db56c7d7-2vgs8,Uid:6b6ca053-adf6-44c0-bdfd-693c6b378420,Namespace:calico-system,Attempt:0,}" Mar 17 17:55:54.240481 kubelet[2850]: E0317 17:55:54.240440 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:54.240481 kubelet[2850]: W0317 17:55:54.240451 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:54.240481 kubelet[2850]: E0317 17:55:54.240461 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:54.259398 containerd[1540]: time="2025-03-17T17:55:54.257501857Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:55:54.259398 containerd[1540]: time="2025-03-17T17:55:54.257534125Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:55:54.259398 containerd[1540]: time="2025-03-17T17:55:54.257550378Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:54.259398 containerd[1540]: time="2025-03-17T17:55:54.257602224Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:54.269171 containerd[1540]: time="2025-03-17T17:55:54.268828721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7zjsl,Uid:9c6a8252-f745-4277-ab3b-8e3e7085b489,Namespace:calico-system,Attempt:0,}" Mar 17 17:55:54.270709 systemd[1]: Started cri-containerd-d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b.scope - libcontainer container d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b. Mar 17 17:55:54.307953 containerd[1540]: time="2025-03-17T17:55:54.307928523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54db56c7d7-2vgs8,Uid:6b6ca053-adf6-44c0-bdfd-693c6b378420,Namespace:calico-system,Attempt:0,} returns sandbox id \"d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b\"" Mar 17 17:55:54.314047 containerd[1540]: time="2025-03-17T17:55:54.312487956Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:55:54.314047 containerd[1540]: time="2025-03-17T17:55:54.312538204Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:55:54.314047 containerd[1540]: time="2025-03-17T17:55:54.312555219Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:54.314047 containerd[1540]: time="2025-03-17T17:55:54.312630926Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:55:54.327617 containerd[1540]: time="2025-03-17T17:55:54.327546431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 17 17:55:54.338654 systemd[1]: Started cri-containerd-62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200.scope - libcontainer container 62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200. Mar 17 17:55:54.353253 containerd[1540]: time="2025-03-17T17:55:54.353201219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7zjsl,Uid:9c6a8252-f745-4277-ab3b-8e3e7085b489,Namespace:calico-system,Attempt:0,} returns sandbox id \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\"" Mar 17 17:55:55.312929 kubelet[2850]: E0317 17:55:55.312722 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:55:56.943500 containerd[1540]: time="2025-03-17T17:55:56.943414280Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:55:56.943902 containerd[1540]: time="2025-03-17T17:55:56.943872360Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 17 17:55:56.944462 containerd[1540]: time="2025-03-17T17:55:56.944179126Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:55:56.945181 containerd[1540]: time="2025-03-17T17:55:56.945160653Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:55:56.945653 containerd[1540]: time="2025-03-17T17:55:56.945547703Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 2.617961449s" Mar 17 17:55:56.945653 containerd[1540]: time="2025-03-17T17:55:56.945574114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 17 17:55:56.953143 containerd[1540]: time="2025-03-17T17:55:56.953128092Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 17 17:55:56.976790 containerd[1540]: time="2025-03-17T17:55:56.976761341Z" level=info msg="CreateContainer within sandbox \"d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 17:55:56.984407 containerd[1540]: time="2025-03-17T17:55:56.984380691Z" level=info msg="CreateContainer within sandbox \"d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\"" Mar 17 17:55:56.985336 containerd[1540]: time="2025-03-17T17:55:56.985321286Z" level=info msg="StartContainer for \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\"" Mar 17 17:55:56.986327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2964969498.mount: Deactivated successfully. 
Mar 17 17:55:57.025792 systemd[1]: Started cri-containerd-0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b.scope - libcontainer container 0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b. Mar 17 17:55:57.058523 containerd[1540]: time="2025-03-17T17:55:57.058496987Z" level=info msg="StartContainer for \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\" returns successfully" Mar 17 17:55:57.324008 kubelet[2850]: E0317 17:55:57.323708 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:55:57.438815 kubelet[2850]: E0317 17:55:57.438792 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.438815 kubelet[2850]: W0317 17:55:57.438810 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.438963 kubelet[2850]: E0317 17:55:57.438824 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:57.438963 kubelet[2850]: E0317 17:55:57.438923 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.438963 kubelet[2850]: W0317 17:55:57.438928 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.438963 kubelet[2850]: E0317 17:55:57.438933 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:57.439070 kubelet[2850]: E0317 17:55:57.439007 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.439070 kubelet[2850]: W0317 17:55:57.439012 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.439070 kubelet[2850]: E0317 17:55:57.439016 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:57.439589 kubelet[2850]: E0317 17:55:57.439094 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.439589 kubelet[2850]: W0317 17:55:57.439099 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.439589 kubelet[2850]: E0317 17:55:57.439104 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:57.439589 kubelet[2850]: E0317 17:55:57.439194 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.439589 kubelet[2850]: W0317 17:55:57.439198 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.439589 kubelet[2850]: E0317 17:55:57.439204 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:57.439589 kubelet[2850]: E0317 17:55:57.439282 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.439589 kubelet[2850]: W0317 17:55:57.439287 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.439589 kubelet[2850]: E0317 17:55:57.439291 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:57.439589 kubelet[2850]: E0317 17:55:57.439366 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.439825 kubelet[2850]: W0317 17:55:57.439371 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.439825 kubelet[2850]: E0317 17:55:57.439375 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:57.440067 kubelet[2850]: E0317 17:55:57.439977 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.440067 kubelet[2850]: W0317 17:55:57.439988 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.440067 kubelet[2850]: E0317 17:55:57.439999 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:57.440316 kubelet[2850]: E0317 17:55:57.440236 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.440316 kubelet[2850]: W0317 17:55:57.440253 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.440316 kubelet[2850]: E0317 17:55:57.440259 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:57.440491 kubelet[2850]: E0317 17:55:57.440452 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.440491 kubelet[2850]: W0317 17:55:57.440458 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.440491 kubelet[2850]: E0317 17:55:57.440463 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:57.440791 kubelet[2850]: E0317 17:55:57.440717 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.440791 kubelet[2850]: W0317 17:55:57.440725 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.440791 kubelet[2850]: E0317 17:55:57.440730 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:57.440940 kubelet[2850]: E0317 17:55:57.440904 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.440940 kubelet[2850]: W0317 17:55:57.440911 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.440940 kubelet[2850]: E0317 17:55:57.440916 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:57.441105 kubelet[2850]: E0317 17:55:57.441098 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.441183 kubelet[2850]: W0317 17:55:57.441147 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.441183 kubelet[2850]: E0317 17:55:57.441158 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:57.441393 kubelet[2850]: E0317 17:55:57.441342 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.441393 kubelet[2850]: W0317 17:55:57.441347 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.441393 kubelet[2850]: E0317 17:55:57.441352 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:55:57.441552 kubelet[2850]: E0317 17:55:57.441514 2850 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:55:57.441552 kubelet[2850]: W0317 17:55:57.441519 2850 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:55:57.441552 kubelet[2850]: E0317 17:55:57.441524 2850 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:55:58.352265 containerd[1540]: time="2025-03-17T17:55:58.352235951Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:55:58.353029 containerd[1540]: time="2025-03-17T17:55:58.352999609Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 17 17:55:58.353297 containerd[1540]: time="2025-03-17T17:55:58.353285864Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:55:58.356443 containerd[1540]: time="2025-03-17T17:55:58.355941041Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:55:58.357100 containerd[1540]: time="2025-03-17T17:55:58.357083933Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 1.403860368s" Mar 17 17:55:58.357164 containerd[1540]: time="2025-03-17T17:55:58.357155756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 17 17:55:58.365051 containerd[1540]: time="2025-03-17T17:55:58.365023897Z" level=info msg="CreateContainer within sandbox \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 17:55:58.371636 containerd[1540]: time="2025-03-17T17:55:58.371610746Z" level=info msg="CreateContainer within sandbox \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\"" Mar 17 17:55:58.372684 containerd[1540]: time="2025-03-17T17:55:58.372669318Z" level=info msg="StartContainer for \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\"" Mar 17 17:55:58.381517 kubelet[2850]: I0317 17:55:58.381357 2850 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:55:58.401861 systemd[1]: Started cri-containerd-08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7.scope - libcontainer container 08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7. Mar 17 17:55:58.430252 containerd[1540]: time="2025-03-17T17:55:58.430092012Z" level=info msg="StartContainer for \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\" returns successfully" Mar 17 17:55:58.443696 systemd[1]: cri-containerd-08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7.scope: Deactivated successfully. 
Mar 17 17:55:58.872325 containerd[1540]: time="2025-03-17T17:55:58.856999618Z" level=info msg="shim disconnected" id=08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7 namespace=k8s.io Mar 17 17:55:58.872325 containerd[1540]: time="2025-03-17T17:55:58.872210741Z" level=warning msg="cleaning up after shim disconnected" id=08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7 namespace=k8s.io Mar 17 17:55:58.872325 containerd[1540]: time="2025-03-17T17:55:58.872221310Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:55:58.966807 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7-rootfs.mount: Deactivated successfully. Mar 17 17:55:59.312971 kubelet[2850]: E0317 17:55:59.312706 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:55:59.387307 containerd[1540]: time="2025-03-17T17:55:59.387262824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 17 17:55:59.397352 kubelet[2850]: I0317 17:55:59.396075 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-54db56c7d7-2vgs8" podStartSLOduration=3.770173537 podStartE2EDuration="6.396061272s" podCreationTimestamp="2025-03-17 17:55:53 +0000 UTC" firstStartedPulling="2025-03-17 17:55:54.327156993 +0000 UTC m=+23.168296252" lastFinishedPulling="2025-03-17 17:55:56.953044728 +0000 UTC m=+25.794183987" observedRunningTime="2025-03-17 17:55:57.391243092 +0000 UTC m=+26.232382361" watchObservedRunningTime="2025-03-17 17:55:59.396061272 +0000 UTC m=+28.237200542" Mar 17 17:56:01.313284 kubelet[2850]: E0317 17:56:01.312608 2850 pod_workers.go:1298] "Error syncing pod, 
skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:56:01.951457 kubelet[2850]: I0317 17:56:01.951266 2850 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:56:03.036986 containerd[1540]: time="2025-03-17T17:56:03.036951338Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:03.037862 containerd[1540]: time="2025-03-17T17:56:03.037391796Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 17 17:56:03.037862 containerd[1540]: time="2025-03-17T17:56:03.037833707Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:03.039423 containerd[1540]: time="2025-03-17T17:56:03.039392943Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:03.040334 containerd[1540]: time="2025-03-17T17:56:03.039982823Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 3.652680975s" Mar 17 17:56:03.040334 containerd[1540]: time="2025-03-17T17:56:03.040005011Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference 
\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 17 17:56:03.041790 containerd[1540]: time="2025-03-17T17:56:03.041725551Z" level=info msg="CreateContainer within sandbox \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 17:56:03.051542 containerd[1540]: time="2025-03-17T17:56:03.051485018Z" level=info msg="CreateContainer within sandbox \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\"" Mar 17 17:56:03.052821 containerd[1540]: time="2025-03-17T17:56:03.051928479Z" level=info msg="StartContainer for \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\"" Mar 17 17:56:03.098780 systemd[1]: Started cri-containerd-672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12.scope - libcontainer container 672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12. Mar 17 17:56:03.153658 containerd[1540]: time="2025-03-17T17:56:03.153610150Z" level=info msg="StartContainer for \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\" returns successfully" Mar 17 17:56:03.312832 kubelet[2850]: E0317 17:56:03.312749 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:56:04.621105 systemd[1]: cri-containerd-672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12.scope: Deactivated successfully. 
Mar 17 17:56:04.637657 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12-rootfs.mount: Deactivated successfully. Mar 17 17:56:04.639116 containerd[1540]: time="2025-03-17T17:56:04.639008247Z" level=info msg="shim disconnected" id=672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12 namespace=k8s.io Mar 17 17:56:04.639116 containerd[1540]: time="2025-03-17T17:56:04.639045787Z" level=warning msg="cleaning up after shim disconnected" id=672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12 namespace=k8s.io Mar 17 17:56:04.639116 containerd[1540]: time="2025-03-17T17:56:04.639051073Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:56:04.679274 kubelet[2850]: I0317 17:56:04.679254 2850 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 17 17:56:04.701499 kubelet[2850]: I0317 17:56:04.701472 2850 topology_manager.go:215] "Topology Admit Handler" podUID="46b937aa-d1db-4705-9bec-d4bd7aeaeceb" podNamespace="kube-system" podName="coredns-7db6d8ff4d-v59bf" Mar 17 17:56:04.703579 kubelet[2850]: I0317 17:56:04.703234 2850 topology_manager.go:215] "Topology Admit Handler" podUID="f3d38051-dd31-4085-95b4-5054901044b2" podNamespace="kube-system" podName="coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:04.704965 kubelet[2850]: I0317 17:56:04.704900 2850 topology_manager.go:215] "Topology Admit Handler" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" podNamespace="calico-system" podName="calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:04.706039 kubelet[2850]: I0317 17:56:04.705892 2850 topology_manager.go:215] "Topology Admit Handler" podUID="411709d2-a807-4f34-9412-d952d186c81f" podNamespace="calico-apiserver" podName="calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:04.710125 kubelet[2850]: I0317 17:56:04.710105 2850 topology_manager.go:215] "Topology Admit Handler" podUID="e62b9bfa-d049-4110-b093-e476a26ef5be" 
podNamespace="calico-apiserver" podName="calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:04.721612 systemd[1]: Created slice kubepods-burstable-pod46b937aa_d1db_4705_9bec_d4bd7aeaeceb.slice - libcontainer container kubepods-burstable-pod46b937aa_d1db_4705_9bec_d4bd7aeaeceb.slice. Mar 17 17:56:04.729402 systemd[1]: Created slice kubepods-burstable-podf3d38051_dd31_4085_95b4_5054901044b2.slice - libcontainer container kubepods-burstable-podf3d38051_dd31_4085_95b4_5054901044b2.slice. Mar 17 17:56:04.733941 systemd[1]: Created slice kubepods-besteffort-pod4761939f_21ce_4484_88c7_08bcb4f65c5c.slice - libcontainer container kubepods-besteffort-pod4761939f_21ce_4484_88c7_08bcb4f65c5c.slice. Mar 17 17:56:04.740745 systemd[1]: Created slice kubepods-besteffort-pod411709d2_a807_4f34_9412_d952d186c81f.slice - libcontainer container kubepods-besteffort-pod411709d2_a807_4f34_9412_d952d186c81f.slice. Mar 17 17:56:04.745286 systemd[1]: Created slice kubepods-besteffort-pode62b9bfa_d049_4110_b093_e476a26ef5be.slice - libcontainer container kubepods-besteffort-pode62b9bfa_d049_4110_b093_e476a26ef5be.slice. 
Mar 17 17:56:04.796680 kubelet[2850]: I0317 17:56:04.796651 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3d38051-dd31-4085-95b4-5054901044b2-config-volume\") pod \"coredns-7db6d8ff4d-4ntmw\" (UID: \"f3d38051-dd31-4085-95b4-5054901044b2\") " pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:04.796680 kubelet[2850]: I0317 17:56:04.796683 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rszp\" (UniqueName: \"kubernetes.io/projected/46b937aa-d1db-4705-9bec-d4bd7aeaeceb-kube-api-access-9rszp\") pod \"coredns-7db6d8ff4d-v59bf\" (UID: \"46b937aa-d1db-4705-9bec-d4bd7aeaeceb\") " pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:04.796807 kubelet[2850]: I0317 17:56:04.796699 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfc9\" (UniqueName: \"kubernetes.io/projected/f3d38051-dd31-4085-95b4-5054901044b2-kube-api-access-fmfc9\") pod \"coredns-7db6d8ff4d-4ntmw\" (UID: \"f3d38051-dd31-4085-95b4-5054901044b2\") " pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:04.796807 kubelet[2850]: I0317 17:56:04.796718 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7j64\" (UniqueName: \"kubernetes.io/projected/4761939f-21ce-4484-88c7-08bcb4f65c5c-kube-api-access-t7j64\") pod \"calico-kube-controllers-7c4bd45cb8-lpjjh\" (UID: \"4761939f-21ce-4484-88c7-08bcb4f65c5c\") " pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:04.796807 kubelet[2850]: I0317 17:56:04.796730 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/411709d2-a807-4f34-9412-d952d186c81f-calico-apiserver-certs\") pod \"calico-apiserver-8466cfddd6-dpzgd\" 
(UID: \"411709d2-a807-4f34-9412-d952d186c81f\") " pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:04.796807 kubelet[2850]: I0317 17:56:04.796740 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvhpr\" (UniqueName: \"kubernetes.io/projected/411709d2-a807-4f34-9412-d952d186c81f-kube-api-access-fvhpr\") pod \"calico-apiserver-8466cfddd6-dpzgd\" (UID: \"411709d2-a807-4f34-9412-d952d186c81f\") " pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:04.796807 kubelet[2850]: I0317 17:56:04.796751 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46b937aa-d1db-4705-9bec-d4bd7aeaeceb-config-volume\") pod \"coredns-7db6d8ff4d-v59bf\" (UID: \"46b937aa-d1db-4705-9bec-d4bd7aeaeceb\") " pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:04.798077 kubelet[2850]: I0317 17:56:04.796761 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e62b9bfa-d049-4110-b093-e476a26ef5be-calico-apiserver-certs\") pod \"calico-apiserver-8466cfddd6-28hkf\" (UID: \"e62b9bfa-d049-4110-b093-e476a26ef5be\") " pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:04.798077 kubelet[2850]: I0317 17:56:04.796772 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhf4t\" (UniqueName: \"kubernetes.io/projected/e62b9bfa-d049-4110-b093-e476a26ef5be-kube-api-access-bhf4t\") pod \"calico-apiserver-8466cfddd6-28hkf\" (UID: \"e62b9bfa-d049-4110-b093-e476a26ef5be\") " pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:04.798077 kubelet[2850]: I0317 17:56:04.796782 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4761939f-21ce-4484-88c7-08bcb4f65c5c-tigera-ca-bundle\") pod \"calico-kube-controllers-7c4bd45cb8-lpjjh\" (UID: \"4761939f-21ce-4484-88c7-08bcb4f65c5c\") " pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:05.026041 containerd[1540]: time="2025-03-17T17:56:05.026015913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:0,}" Mar 17 17:56:05.033579 containerd[1540]: time="2025-03-17T17:56:05.032831166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:0,}" Mar 17 17:56:05.040069 containerd[1540]: time="2025-03-17T17:56:05.040041176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:0,}" Mar 17 17:56:05.044132 containerd[1540]: time="2025-03-17T17:56:05.043993717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:56:05.061015 containerd[1540]: time="2025-03-17T17:56:05.060987646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:56:05.280168 containerd[1540]: time="2025-03-17T17:56:05.279968121Z" level=error msg="Failed to destroy network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.288391 containerd[1540]: time="2025-03-17T17:56:05.288155052Z" level=error msg="Failed to 
destroy network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.288877 containerd[1540]: time="2025-03-17T17:56:05.288664474Z" level=error msg="encountered an error cleaning up failed sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.288877 containerd[1540]: time="2025-03-17T17:56:05.288710602Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.288877 containerd[1540]: time="2025-03-17T17:56:05.288749766Z" level=error msg="Failed to destroy network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.289008949Z" level=error msg="encountered an error cleaning up failed sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.289038483Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.288674472Z" level=error msg="encountered an error cleaning up failed sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.297823793Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.288713171Z" level=error msg="Failed to destroy network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.298032453Z" level=error msg="encountered an error cleaning up failed sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.298084503Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.300525682Z" level=error msg="Failed to destroy network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.300719146Z" level=error msg="encountered an error cleaning up failed sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.301052 containerd[1540]: time="2025-03-17T17:56:05.300757661Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.317508 kubelet[2850]: E0317 17:56:05.298283 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.317508 kubelet[2850]: E0317 17:56:05.316584 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:05.317508 kubelet[2850]: E0317 17:56:05.316601 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:05.317646 kubelet[2850]: E0317 17:56:05.316627 2850 pod_workers.go:1298] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" podUID="411709d2-a807-4f34-9412-d952d186c81f" Mar 17 17:56:05.317646 kubelet[2850]: E0317 17:56:05.316772 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.317646 kubelet[2850]: E0317 17:56:05.316785 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:05.321171 kubelet[2850]: E0317 17:56:05.316795 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:05.321171 kubelet[2850]: E0317 17:56:05.316811 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" Mar 17 17:56:05.321171 kubelet[2850]: E0317 17:56:05.298505 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.321253 kubelet[2850]: E0317 17:56:05.316836 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:05.321253 kubelet[2850]: E0317 17:56:05.316846 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:05.321253 kubelet[2850]: E0317 17:56:05.316858 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-v59bf" podUID="46b937aa-d1db-4705-9bec-d4bd7aeaeceb" Mar 17 17:56:05.322141 kubelet[2850]: E0317 17:56:05.316876 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.322141 kubelet[2850]: E0317 17:56:05.316886 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:05.322141 kubelet[2850]: E0317 17:56:05.316894 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:05.322203 kubelet[2850]: E0317 17:56:05.316908 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-4ntmw" podUID="f3d38051-dd31-4085-95b4-5054901044b2" Mar 17 17:56:05.322203 kubelet[2850]: E0317 17:56:05.316924 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Mar 17 17:56:05.322203 kubelet[2850]: E0317 17:56:05.316933 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:05.322274 kubelet[2850]: E0317 17:56:05.316941 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:05.322274 kubelet[2850]: E0317 17:56:05.316954 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" podUID="e62b9bfa-d049-4110-b093-e476a26ef5be" Mar 17 17:56:05.327188 systemd[1]: Created slice kubepods-besteffort-pod0d288c4c_94be_4e72_9025_53893bb68385.slice 
- libcontainer container kubepods-besteffort-pod0d288c4c_94be_4e72_9025_53893bb68385.slice. Mar 17 17:56:05.328469 containerd[1540]: time="2025-03-17T17:56:05.328449237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:0,}" Mar 17 17:56:05.365459 containerd[1540]: time="2025-03-17T17:56:05.365364645Z" level=error msg="Failed to destroy network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.365771 containerd[1540]: time="2025-03-17T17:56:05.365680490Z" level=error msg="encountered an error cleaning up failed sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.365771 containerd[1540]: time="2025-03-17T17:56:05.365726609Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.365953 kubelet[2850]: E0317 17:56:05.365929 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.365999 kubelet[2850]: E0317 17:56:05.365967 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:05.365999 kubelet[2850]: E0317 17:56:05.365983 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:05.366038 kubelet[2850]: E0317 17:56:05.366014 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:56:05.397822 kubelet[2850]: I0317 
17:56:05.397455 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1" Mar 17 17:56:05.404800 kubelet[2850]: I0317 17:56:05.404205 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b" Mar 17 17:56:05.407342 kubelet[2850]: I0317 17:56:05.406784 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca" Mar 17 17:56:05.409144 kubelet[2850]: I0317 17:56:05.409131 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8" Mar 17 17:56:05.420081 kubelet[2850]: I0317 17:56:05.420059 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc" Mar 17 17:56:05.420879 kubelet[2850]: I0317 17:56:05.420862 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d" Mar 17 17:56:05.429132 containerd[1540]: time="2025-03-17T17:56:05.428293895Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 17:56:05.429132 containerd[1540]: time="2025-03-17T17:56:05.428471581Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:05.431064 containerd[1540]: time="2025-03-17T17:56:05.431042650Z" level=info msg="Ensure that sandbox 3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d in task-service has been cleanup successfully" Mar 17 17:56:05.431894 containerd[1540]: time="2025-03-17T17:56:05.431870033Z" level=info msg="Ensure that sandbox 
792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b in task-service has been cleanup successfully" Mar 17 17:56:05.432098 containerd[1540]: time="2025-03-17T17:56:05.431977224Z" level=info msg="TearDown network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" successfully" Mar 17 17:56:05.432098 containerd[1540]: time="2025-03-17T17:56:05.432094111Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" returns successfully" Mar 17 17:56:05.432170 containerd[1540]: time="2025-03-17T17:56:05.431246850Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\"" Mar 17 17:56:05.432493 containerd[1540]: time="2025-03-17T17:56:05.432079909Z" level=info msg="TearDown network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" successfully" Mar 17 17:56:05.432493 containerd[1540]: time="2025-03-17T17:56:05.432237796Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" returns successfully" Mar 17 17:56:05.432493 containerd[1540]: time="2025-03-17T17:56:05.432253520Z" level=info msg="Ensure that sandbox 60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8 in task-service has been cleanup successfully" Mar 17 17:56:05.432493 containerd[1540]: time="2025-03-17T17:56:05.431335519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 17 17:56:05.433048 containerd[1540]: time="2025-03-17T17:56:05.433032093Z" level=info msg="TearDown network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" successfully" Mar 17 17:56:05.433048 containerd[1540]: time="2025-03-17T17:56:05.433043707Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" returns successfully" Mar 17 17:56:05.433152 containerd[1540]: time="2025-03-17T17:56:05.431357094Z" level=info 
msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\"" Mar 17 17:56:05.433397 containerd[1540]: time="2025-03-17T17:56:05.433280016Z" level=info msg="Ensure that sandbox 9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1 in task-service has been cleanup successfully" Mar 17 17:56:05.434253 containerd[1540]: time="2025-03-17T17:56:05.433704521Z" level=info msg="TearDown network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" successfully" Mar 17 17:56:05.434253 containerd[1540]: time="2025-03-17T17:56:05.433715294Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" returns successfully" Mar 17 17:56:05.434253 containerd[1540]: time="2025-03-17T17:56:05.433292052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:1,}" Mar 17 17:56:05.434605 containerd[1540]: time="2025-03-17T17:56:05.434588362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:56:05.435344 containerd[1540]: time="2025-03-17T17:56:05.431257161Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:05.435344 containerd[1540]: time="2025-03-17T17:56:05.435246236Z" level=info msg="Ensure that sandbox 0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc in task-service has been cleanup successfully" Mar 17 17:56:05.435478 containerd[1540]: time="2025-03-17T17:56:05.435467417Z" level=info msg="TearDown network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" successfully" Mar 17 17:56:05.435529 containerd[1540]: time="2025-03-17T17:56:05.435520157Z" level=info msg="StopPodSandbox for 
\"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" returns successfully" Mar 17 17:56:05.435610 containerd[1540]: time="2025-03-17T17:56:05.433357922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:56:05.435780 containerd[1540]: time="2025-03-17T17:56:05.431235488Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\"" Mar 17 17:56:05.435965 containerd[1540]: time="2025-03-17T17:56:05.435956430Z" level=info msg="Ensure that sandbox b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca in task-service has been cleanup successfully" Mar 17 17:56:05.438968 containerd[1540]: time="2025-03-17T17:56:05.438655075Z" level=info msg="TearDown network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully" Mar 17 17:56:05.438968 containerd[1540]: time="2025-03-17T17:56:05.438673546Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully" Mar 17 17:56:05.443065 containerd[1540]: time="2025-03-17T17:56:05.442287001Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:1,}" Mar 17 17:56:05.443232 containerd[1540]: time="2025-03-17T17:56:05.443171083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:1,}" Mar 17 17:56:05.444028 containerd[1540]: time="2025-03-17T17:56:05.443357874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:1,}" Mar 17 17:56:05.549050 containerd[1540]: time="2025-03-17T17:56:05.548916951Z" level=error 
msg="Failed to destroy network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.550488 containerd[1540]: time="2025-03-17T17:56:05.549263979Z" level=error msg="encountered an error cleaning up failed sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.550488 containerd[1540]: time="2025-03-17T17:56:05.549305505Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.551604 kubelet[2850]: E0317 17:56:05.550860 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.551604 kubelet[2850]: E0317 17:56:05.550899 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:05.551604 kubelet[2850]: E0317 17:56:05.550932 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:05.551720 kubelet[2850]: E0317 17:56:05.551139 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-4ntmw" podUID="f3d38051-dd31-4085-95b4-5054901044b2" Mar 17 17:56:05.564925 containerd[1540]: time="2025-03-17T17:56:05.564835526Z" level=error msg="Failed to destroy network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.565215 
containerd[1540]: time="2025-03-17T17:56:05.565136311Z" level=error msg="encountered an error cleaning up failed sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.565215 containerd[1540]: time="2025-03-17T17:56:05.565188555Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.566403 kubelet[2850]: E0317 17:56:05.565395 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.566403 kubelet[2850]: E0317 17:56:05.565432 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:05.566403 kubelet[2850]: E0317 17:56:05.565447 2850 
kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:05.566806 kubelet[2850]: E0317 17:56:05.565472 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-v59bf" podUID="46b937aa-d1db-4705-9bec-d4bd7aeaeceb" Mar 17 17:56:05.571511 containerd[1540]: time="2025-03-17T17:56:05.571385339Z" level=error msg="Failed to destroy network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.571832 containerd[1540]: time="2025-03-17T17:56:05.571818915Z" level=error msg="encountered an error cleaning up failed sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.571923 containerd[1540]: time="2025-03-17T17:56:05.571909958Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.572209 kubelet[2850]: E0317 17:56:05.572095 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.572570 kubelet[2850]: E0317 17:56:05.572259 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:05.572570 kubelet[2850]: E0317 17:56:05.572291 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:05.572570 kubelet[2850]: E0317 17:56:05.572320 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" Mar 17 17:56:05.573013 containerd[1540]: time="2025-03-17T17:56:05.572924219Z" level=error msg="Failed to destroy network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.573486 containerd[1540]: time="2025-03-17T17:56:05.573459779Z" level=error msg="encountered an error cleaning up failed sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.573537 containerd[1540]: time="2025-03-17T17:56:05.573492370Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.573657 kubelet[2850]: E0317 17:56:05.573605 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.573657 kubelet[2850]: E0317 17:56:05.573737 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:05.573657 kubelet[2850]: E0317 17:56:05.573751 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:05.573903 kubelet[2850]: E0317 17:56:05.573880 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:56:05.581362 containerd[1540]: time="2025-03-17T17:56:05.581338051Z" level=error msg="Failed to destroy network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.581656 containerd[1540]: time="2025-03-17T17:56:05.581362202Z" level=error msg="Failed to destroy network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.581892 containerd[1540]: time="2025-03-17T17:56:05.581861183Z" level=error msg="encountered an error cleaning up failed sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.581980 containerd[1540]: time="2025-03-17T17:56:05.581967803Z" level=error msg="RunPodSandbox 
for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.582303 containerd[1540]: time="2025-03-17T17:56:05.581939122Z" level=error msg="encountered an error cleaning up failed sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.582303 containerd[1540]: time="2025-03-17T17:56:05.582283576Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.582368 kubelet[2850]: E0317 17:56:05.582180 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.582368 kubelet[2850]: E0317 17:56:05.582210 2850 kuberuntime_sandbox.go:72] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:05.582368 kubelet[2850]: E0317 17:56:05.582222 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:05.582429 kubelet[2850]: E0317 17:56:05.582251 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" podUID="411709d2-a807-4f34-9412-d952d186c81f" Mar 17 17:56:05.582555 kubelet[2850]: E0317 17:56:05.582537 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:05.582600 kubelet[2850]: E0317 17:56:05.582581 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:05.582600 kubelet[2850]: E0317 17:56:05.582594 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:05.582652 kubelet[2850]: E0317 17:56:05.582613 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" podUID="e62b9bfa-d049-4110-b093-e476a26ef5be" Mar 17 17:56:05.638260 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d-shm.mount: Deactivated successfully. Mar 17 17:56:05.638484 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca-shm.mount: Deactivated successfully. Mar 17 17:56:06.425172 kubelet[2850]: I0317 17:56:06.425145 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa" Mar 17 17:56:06.426369 containerd[1540]: time="2025-03-17T17:56:06.426264967Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\"" Mar 17 17:56:06.427171 containerd[1540]: time="2025-03-17T17:56:06.427150534Z" level=info msg="Ensure that sandbox 0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa in task-service has been cleanup successfully" Mar 17 17:56:06.427430 containerd[1540]: time="2025-03-17T17:56:06.427416975Z" level=info msg="TearDown network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" successfully" Mar 17 17:56:06.427430 containerd[1540]: time="2025-03-17T17:56:06.427427281Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" returns successfully" Mar 17 17:56:06.428798 containerd[1540]: time="2025-03-17T17:56:06.428681888Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\"" Mar 17 17:56:06.428798 containerd[1540]: time="2025-03-17T17:56:06.428727837Z" level=info msg="TearDown network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully" Mar 17 17:56:06.428798 containerd[1540]: time="2025-03-17T17:56:06.428734262Z" level=info 
msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully" Mar 17 17:56:06.429033 kubelet[2850]: I0317 17:56:06.429022 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab" Mar 17 17:56:06.429871 systemd[1]: run-netns-cni\x2d89e9f4dc\x2d37d3\x2de931\x2d14e4\x2df1ae9ed6f72e.mount: Deactivated successfully. Mar 17 17:56:06.431069 containerd[1540]: time="2025-03-17T17:56:06.430276035Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\"" Mar 17 17:56:06.431069 containerd[1540]: time="2025-03-17T17:56:06.430411085Z" level=info msg="Ensure that sandbox 6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab in task-service has been cleanup successfully" Mar 17 17:56:06.431069 containerd[1540]: time="2025-03-17T17:56:06.430693762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:2,}" Mar 17 17:56:06.431260 containerd[1540]: time="2025-03-17T17:56:06.431239983Z" level=info msg="TearDown network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" successfully" Mar 17 17:56:06.431841 containerd[1540]: time="2025-03-17T17:56:06.431820584Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" returns successfully" Mar 17 17:56:06.432101 systemd[1]: run-netns-cni\x2d5f73763f\x2d3721\x2d3972\x2d6c93\x2d9a48c34fe75f.mount: Deactivated successfully. 
Mar 17 17:56:06.435440 containerd[1540]: time="2025-03-17T17:56:06.435413624Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\"" Mar 17 17:56:06.435503 containerd[1540]: time="2025-03-17T17:56:06.435489884Z" level=info msg="TearDown network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" successfully" Mar 17 17:56:06.435503 containerd[1540]: time="2025-03-17T17:56:06.435499226Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" returns successfully" Mar 17 17:56:06.438425 containerd[1540]: time="2025-03-17T17:56:06.438398813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:2,}" Mar 17 17:56:06.438904 kubelet[2850]: I0317 17:56:06.438881 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402" Mar 17 17:56:06.440047 containerd[1540]: time="2025-03-17T17:56:06.439744088Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" Mar 17 17:56:06.440047 containerd[1540]: time="2025-03-17T17:56:06.439938401Z" level=info msg="Ensure that sandbox 9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402 in task-service has been cleanup successfully" Mar 17 17:56:06.443611 containerd[1540]: time="2025-03-17T17:56:06.440160673Z" level=info msg="TearDown network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" successfully" Mar 17 17:56:06.443611 containerd[1540]: time="2025-03-17T17:56:06.440184392Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" returns successfully" Mar 17 17:56:06.444096 systemd[1]: run-netns-cni\x2d6746abaf\x2d22e3\x2d40b3\x2dcd0b\x2daa8fc9f7ece7.mount: Deactivated 
successfully. Mar 17 17:56:06.444827 kubelet[2850]: I0317 17:56:06.444795 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4" Mar 17 17:56:06.446309 containerd[1540]: time="2025-03-17T17:56:06.445636174Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\"" Mar 17 17:56:06.446309 containerd[1540]: time="2025-03-17T17:56:06.445796923Z" level=info msg="Ensure that sandbox 076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4 in task-service has been cleanup successfully" Mar 17 17:56:06.447082 systemd[1]: run-netns-cni\x2dbc296096\x2d81e6\x2d09f0\x2d4158\x2dfde2a1f3cc23.mount: Deactivated successfully. Mar 17 17:56:06.447966 containerd[1540]: time="2025-03-17T17:56:06.447948028Z" level=info msg="TearDown network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" successfully" Mar 17 17:56:06.447966 containerd[1540]: time="2025-03-17T17:56:06.447963434Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" returns successfully" Mar 17 17:56:06.449360 containerd[1540]: time="2025-03-17T17:56:06.449174489Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:06.449360 containerd[1540]: time="2025-03-17T17:56:06.449220338Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\"" Mar 17 17:56:06.449415 containerd[1540]: time="2025-03-17T17:56:06.449237646Z" level=info msg="TearDown network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" successfully" Mar 17 17:56:06.449440 containerd[1540]: time="2025-03-17T17:56:06.449414895Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" returns successfully" Mar 17 17:56:06.449474 
containerd[1540]: time="2025-03-17T17:56:06.449270256Z" level=info msg="TearDown network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" successfully" Mar 17 17:56:06.449474 containerd[1540]: time="2025-03-17T17:56:06.449445194Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" returns successfully" Mar 17 17:56:06.450353 containerd[1540]: time="2025-03-17T17:56:06.450335486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:56:06.450584 kubelet[2850]: I0317 17:56:06.450290 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd" Mar 17 17:56:06.450803 containerd[1540]: time="2025-03-17T17:56:06.450647297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:56:06.451309 containerd[1540]: time="2025-03-17T17:56:06.451245947Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" Mar 17 17:56:06.451622 containerd[1540]: time="2025-03-17T17:56:06.451452161Z" level=info msg="Ensure that sandbox 0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd in task-service has been cleanup successfully" Mar 17 17:56:06.451766 containerd[1540]: time="2025-03-17T17:56:06.451743826Z" level=info msg="TearDown network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" successfully" Mar 17 17:56:06.451916 containerd[1540]: time="2025-03-17T17:56:06.451753944Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" returns successfully" Mar 17 17:56:06.452264 containerd[1540]: 
time="2025-03-17T17:56:06.452220874Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:06.452399 containerd[1540]: time="2025-03-17T17:56:06.452332250Z" level=info msg="TearDown network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" successfully" Mar 17 17:56:06.452399 containerd[1540]: time="2025-03-17T17:56:06.452339738Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" returns successfully" Mar 17 17:56:06.453228 containerd[1540]: time="2025-03-17T17:56:06.453100503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:2,}" Mar 17 17:56:06.453366 kubelet[2850]: I0317 17:56:06.453346 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7" Mar 17 17:56:06.453719 containerd[1540]: time="2025-03-17T17:56:06.453638843Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" Mar 17 17:56:06.453903 containerd[1540]: time="2025-03-17T17:56:06.453834258Z" level=info msg="Ensure that sandbox df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7 in task-service has been cleanup successfully" Mar 17 17:56:06.454128 containerd[1540]: time="2025-03-17T17:56:06.454080269Z" level=info msg="TearDown network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" successfully" Mar 17 17:56:06.454128 containerd[1540]: time="2025-03-17T17:56:06.454090747Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" returns successfully" Mar 17 17:56:06.454330 containerd[1540]: time="2025-03-17T17:56:06.454245888Z" level=info msg="StopPodSandbox for 
\"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 17:56:06.454330 containerd[1540]: time="2025-03-17T17:56:06.454294886Z" level=info msg="TearDown network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" successfully" Mar 17 17:56:06.454330 containerd[1540]: time="2025-03-17T17:56:06.454302780Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" returns successfully" Mar 17 17:56:06.454572 containerd[1540]: time="2025-03-17T17:56:06.454516787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:2,}" Mar 17 17:56:06.600265 containerd[1540]: time="2025-03-17T17:56:06.600167669Z" level=error msg="Failed to destroy network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.600426 containerd[1540]: time="2025-03-17T17:56:06.600379555Z" level=error msg="encountered an error cleaning up failed sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.600426 containerd[1540]: time="2025-03-17T17:56:06.600415285Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.600501 containerd[1540]: time="2025-03-17T17:56:06.600468644Z" level=error msg="Failed to destroy network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.600723 kubelet[2850]: E0317 17:56:06.600649 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.600723 kubelet[2850]: E0317 17:56:06.600696 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:06.600723 kubelet[2850]: E0317 17:56:06.600711 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:06.601642 kubelet[2850]: E0317 17:56:06.600742 2850 
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-v59bf" podUID="46b937aa-d1db-4705-9bec-d4bd7aeaeceb" Mar 17 17:56:06.601642 kubelet[2850]: E0317 17:56:06.601042 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.601642 kubelet[2850]: E0317 17:56:06.601056 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:06.601744 containerd[1540]: time="2025-03-17T17:56:06.600958949Z" level=error msg="encountered an error cleaning up failed sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.601744 containerd[1540]: time="2025-03-17T17:56:06.600981291Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.601787 kubelet[2850]: E0317 17:56:06.601067 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:06.601787 kubelet[2850]: E0317 17:56:06.601083 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:56:06.604262 
containerd[1540]: time="2025-03-17T17:56:06.604234660Z" level=error msg="Failed to destroy network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.604980 containerd[1540]: time="2025-03-17T17:56:06.604873781Z" level=error msg="encountered an error cleaning up failed sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.605068 containerd[1540]: time="2025-03-17T17:56:06.605056844Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.605416 kubelet[2850]: E0317 17:56:06.605395 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.605547 kubelet[2850]: E0317 17:56:06.605425 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:06.605547 kubelet[2850]: E0317 17:56:06.605437 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:06.605547 kubelet[2850]: E0317 17:56:06.605462 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" podUID="e62b9bfa-d049-4110-b093-e476a26ef5be" Mar 17 17:56:06.607112 containerd[1540]: time="2025-03-17T17:56:06.607094483Z" level=error msg="Failed to destroy network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.607439 containerd[1540]: time="2025-03-17T17:56:06.607427093Z" level=error msg="encountered an error cleaning up failed sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.607597 containerd[1540]: time="2025-03-17T17:56:06.607583913Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.607972 kubelet[2850]: E0317 17:56:06.607953 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.608007 kubelet[2850]: E0317 17:56:06.607980 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:06.608007 kubelet[2850]: E0317 17:56:06.607993 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:06.608053 kubelet[2850]: E0317 17:56:06.608015 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" podUID="411709d2-a807-4f34-9412-d952d186c81f" Mar 17 17:56:06.611808 containerd[1540]: time="2025-03-17T17:56:06.611765851Z" level=error msg="Failed to destroy network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.612017 containerd[1540]: time="2025-03-17T17:56:06.612000376Z" level=error msg="encountered an error cleaning up failed sandbox 
\"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.612064 containerd[1540]: time="2025-03-17T17:56:06.612048062Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.612844 kubelet[2850]: E0317 17:56:06.612247 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.612844 kubelet[2850]: E0317 17:56:06.612291 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:06.612844 kubelet[2850]: E0317 17:56:06.612306 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:06.612914 kubelet[2850]: E0317 17:56:06.612336 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" Mar 17 17:56:06.614655 containerd[1540]: time="2025-03-17T17:56:06.614628823Z" level=error msg="Failed to destroy network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.614841 containerd[1540]: time="2025-03-17T17:56:06.614825482Z" level=error msg="encountered an error cleaning up failed sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Mar 17 17:56:06.614872 containerd[1540]: time="2025-03-17T17:56:06.614860652Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.614990 kubelet[2850]: E0317 17:56:06.614968 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:06.615040 kubelet[2850]: E0317 17:56:06.615006 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:06.615040 kubelet[2850]: E0317 17:56:06.615019 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:06.615084 kubelet[2850]: E0317 17:56:06.615045 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-4ntmw" podUID="f3d38051-dd31-4085-95b4-5054901044b2" Mar 17 17:56:06.639046 systemd[1]: run-netns-cni\x2d51b39800\x2d989c\x2df86a\x2d831e\x2dafd9e70705ec.mount: Deactivated successfully. Mar 17 17:56:06.639102 systemd[1]: run-netns-cni\x2d9bc6a1b8\x2d1f9d\x2de454\x2d7323\x2d3eccf6584e0c.mount: Deactivated successfully. Mar 17 17:56:07.456344 kubelet[2850]: I0317 17:56:07.456286 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459" Mar 17 17:56:07.457265 containerd[1540]: time="2025-03-17T17:56:07.457076425Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\"" Mar 17 17:56:07.457265 containerd[1540]: time="2025-03-17T17:56:07.457228207Z" level=info msg="Ensure that sandbox 4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459 in task-service has been cleanup successfully" Mar 17 17:56:07.459109 systemd[1]: run-netns-cni\x2de030ff0f\x2d018b\x2d91a6\x2d15bb\x2de13bd5c64627.mount: Deactivated successfully. 
Mar 17 17:56:07.460546 containerd[1540]: time="2025-03-17T17:56:07.459960485Z" level=info msg="TearDown network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" successfully" Mar 17 17:56:07.460546 containerd[1540]: time="2025-03-17T17:56:07.459982784Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" returns successfully" Mar 17 17:56:07.460546 containerd[1540]: time="2025-03-17T17:56:07.460342779Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" Mar 17 17:56:07.460546 containerd[1540]: time="2025-03-17T17:56:07.460398944Z" level=info msg="TearDown network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" successfully" Mar 17 17:56:07.460546 containerd[1540]: time="2025-03-17T17:56:07.460407024Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" returns successfully" Mar 17 17:56:07.460774 containerd[1540]: time="2025-03-17T17:56:07.460744021Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:07.460811 containerd[1540]: time="2025-03-17T17:56:07.460801104Z" level=info msg="TearDown network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" successfully" Mar 17 17:56:07.461312 containerd[1540]: time="2025-03-17T17:56:07.460809725Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" returns successfully" Mar 17 17:56:07.461354 kubelet[2850]: I0317 17:56:07.461034 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1" Mar 17 17:56:07.461601 containerd[1540]: time="2025-03-17T17:56:07.461583416Z" level=info msg="StopPodSandbox for 
\"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\"" Mar 17 17:56:07.463122 containerd[1540]: time="2025-03-17T17:56:07.462386357Z" level=info msg="Ensure that sandbox a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1 in task-service has been cleanup successfully" Mar 17 17:56:07.464781 containerd[1540]: time="2025-03-17T17:56:07.463259558Z" level=info msg="TearDown network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" successfully" Mar 17 17:56:07.464781 containerd[1540]: time="2025-03-17T17:56:07.463272548Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" returns successfully" Mar 17 17:56:07.464781 containerd[1540]: time="2025-03-17T17:56:07.461726258Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:56:07.464618 systemd[1]: run-netns-cni\x2de25f4e54\x2d6c5b\x2d1f1d\x2da008\x2dceeb170e8a75.mount: Deactivated successfully. 
Mar 17 17:56:07.466185 containerd[1540]: time="2025-03-17T17:56:07.466062393Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\"" Mar 17 17:56:07.466185 containerd[1540]: time="2025-03-17T17:56:07.466122152Z" level=info msg="TearDown network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" successfully" Mar 17 17:56:07.466185 containerd[1540]: time="2025-03-17T17:56:07.466130540Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" returns successfully" Mar 17 17:56:07.471279 containerd[1540]: time="2025-03-17T17:56:07.470826401Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\"" Mar 17 17:56:07.473712 kubelet[2850]: I0317 17:56:07.472992 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd" Mar 17 17:56:07.475510 containerd[1540]: time="2025-03-17T17:56:07.475481055Z" level=info msg="TearDown network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" successfully" Mar 17 17:56:07.475589 containerd[1540]: time="2025-03-17T17:56:07.475520549Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" returns successfully" Mar 17 17:56:07.475589 containerd[1540]: time="2025-03-17T17:56:07.475015132Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\"" Mar 17 17:56:07.475765 containerd[1540]: time="2025-03-17T17:56:07.475669485Z" level=info msg="Ensure that sandbox 446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd in task-service has been cleanup successfully" Mar 17 17:56:07.476941 containerd[1540]: time="2025-03-17T17:56:07.476769764Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:56:07.476941 containerd[1540]: time="2025-03-17T17:56:07.476917714Z" level=info msg="TearDown network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" successfully" Mar 17 17:56:07.476941 containerd[1540]: time="2025-03-17T17:56:07.476927571Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" returns successfully" Mar 17 17:56:07.477492 containerd[1540]: time="2025-03-17T17:56:07.477316408Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" Mar 17 17:56:07.478832 containerd[1540]: time="2025-03-17T17:56:07.478676526Z" level=info msg="TearDown network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" successfully" Mar 17 17:56:07.478832 containerd[1540]: time="2025-03-17T17:56:07.478691557Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" returns successfully" Mar 17 17:56:07.481723 containerd[1540]: time="2025-03-17T17:56:07.481650008Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:07.481723 containerd[1540]: time="2025-03-17T17:56:07.481700098Z" level=info msg="TearDown network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" successfully" Mar 17 17:56:07.481723 containerd[1540]: time="2025-03-17T17:56:07.481706497Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" returns successfully" Mar 17 17:56:07.483241 containerd[1540]: time="2025-03-17T17:56:07.483229118Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:3,}" Mar 17 17:56:07.484566 kubelet[2850]: I0317 17:56:07.484367 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442" Mar 17 17:56:07.485314 containerd[1540]: time="2025-03-17T17:56:07.485300399Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\"" Mar 17 17:56:07.485428 containerd[1540]: time="2025-03-17T17:56:07.485415242Z" level=info msg="Ensure that sandbox 7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442 in task-service has been cleanup successfully" Mar 17 17:56:07.486342 kubelet[2850]: I0317 17:56:07.486332 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6" Mar 17 17:56:07.487211 containerd[1540]: time="2025-03-17T17:56:07.487198042Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\"" Mar 17 17:56:07.488139 containerd[1540]: time="2025-03-17T17:56:07.487240457Z" level=info msg="TearDown network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" successfully" Mar 17 17:56:07.488139 containerd[1540]: time="2025-03-17T17:56:07.488052989Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" returns successfully" Mar 17 17:56:07.488675 containerd[1540]: time="2025-03-17T17:56:07.488664851Z" level=info msg="Ensure that sandbox b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6 in task-service has been cleanup successfully" Mar 17 17:56:07.488855 containerd[1540]: time="2025-03-17T17:56:07.488844161Z" level=info msg="TearDown network for sandbox 
\"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" successfully" Mar 17 17:56:07.488900 containerd[1540]: time="2025-03-17T17:56:07.488893028Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" returns successfully" Mar 17 17:56:07.489044 containerd[1540]: time="2025-03-17T17:56:07.489034480Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" Mar 17 17:56:07.489802 kubelet[2850]: I0317 17:56:07.489788 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef" Mar 17 17:56:07.491439 containerd[1540]: time="2025-03-17T17:56:07.491392472Z" level=info msg="TearDown network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" successfully" Mar 17 17:56:07.491439 containerd[1540]: time="2025-03-17T17:56:07.491406488Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" returns successfully" Mar 17 17:56:07.492028 containerd[1540]: time="2025-03-17T17:56:07.492016553Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 17:56:07.492586 containerd[1540]: time="2025-03-17T17:56:07.492263543Z" level=info msg="TearDown network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" successfully" Mar 17 17:56:07.492586 containerd[1540]: time="2025-03-17T17:56:07.492273167Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" returns successfully" Mar 17 17:56:07.492586 containerd[1540]: time="2025-03-17T17:56:07.492332208Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\"" Mar 17 17:56:07.492586 containerd[1540]: time="2025-03-17T17:56:07.492374624Z" level=info msg="TearDown network for 
sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" successfully" Mar 17 17:56:07.492586 containerd[1540]: time="2025-03-17T17:56:07.492386466Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" returns successfully" Mar 17 17:56:07.492586 containerd[1540]: time="2025-03-17T17:56:07.492411860Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\"" Mar 17 17:56:07.492586 containerd[1540]: time="2025-03-17T17:56:07.492504169Z" level=info msg="Ensure that sandbox 78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef in task-service has been cleanup successfully" Mar 17 17:56:07.493508 containerd[1540]: time="2025-03-17T17:56:07.493496896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:3,}" Mar 17 17:56:07.493748 containerd[1540]: time="2025-03-17T17:56:07.493738808Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\"" Mar 17 17:56:07.493890 containerd[1540]: time="2025-03-17T17:56:07.493879745Z" level=info msg="TearDown network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" successfully" Mar 17 17:56:07.497117 containerd[1540]: time="2025-03-17T17:56:07.497105005Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" returns successfully" Mar 17 17:56:07.497345 containerd[1540]: time="2025-03-17T17:56:07.493814019Z" level=info msg="TearDown network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" successfully" Mar 17 17:56:07.497393 containerd[1540]: time="2025-03-17T17:56:07.497386303Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" returns successfully" Mar 17 17:56:07.497979 containerd[1540]: 
time="2025-03-17T17:56:07.497967311Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:3,}" Mar 17 17:56:07.498188 containerd[1540]: time="2025-03-17T17:56:07.498177065Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\"" Mar 17 17:56:07.498286 containerd[1540]: time="2025-03-17T17:56:07.498277067Z" level=info msg="TearDown network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" successfully" Mar 17 17:56:07.498318 containerd[1540]: time="2025-03-17T17:56:07.498312675Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" returns successfully" Mar 17 17:56:07.499573 containerd[1540]: time="2025-03-17T17:56:07.499548394Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\"" Mar 17 17:56:07.499632 containerd[1540]: time="2025-03-17T17:56:07.499619762Z" level=info msg="TearDown network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully" Mar 17 17:56:07.499632 containerd[1540]: time="2025-03-17T17:56:07.499629686Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully" Mar 17 17:56:07.500009 containerd[1540]: time="2025-03-17T17:56:07.499994484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:3,}" Mar 17 17:56:07.528988 containerd[1540]: time="2025-03-17T17:56:07.528813013Z" level=error msg="Failed to destroy network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Mar 17 17:56:07.529116 containerd[1540]: time="2025-03-17T17:56:07.529083164Z" level=error msg="encountered an error cleaning up failed sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.529161 containerd[1540]: time="2025-03-17T17:56:07.529145816Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.529541 kubelet[2850]: E0317 17:56:07.529278 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.529541 kubelet[2850]: E0317 17:56:07.529310 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 
17:56:07.529541 kubelet[2850]: E0317 17:56:07.529327 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:07.529617 kubelet[2850]: E0317 17:56:07.529351 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" podUID="e62b9bfa-d049-4110-b093-e476a26ef5be" Mar 17 17:56:07.586538 containerd[1540]: time="2025-03-17T17:56:07.586502773Z" level=error msg="Failed to destroy network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.587112 containerd[1540]: time="2025-03-17T17:56:07.587093727Z" level=error msg="encountered an error cleaning up failed sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.587173 containerd[1540]: time="2025-03-17T17:56:07.587152413Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.587785 containerd[1540]: time="2025-03-17T17:56:07.587768291Z" level=error msg="Failed to destroy network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.588445 containerd[1540]: time="2025-03-17T17:56:07.588423584Z" level=error msg="encountered an error cleaning up failed sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.588480 containerd[1540]: time="2025-03-17T17:56:07.588463549Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.588637 kubelet[2850]: E0317 17:56:07.588617 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.588834 kubelet[2850]: E0317 17:56:07.588818 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:07.589099 kubelet[2850]: E0317 17:56:07.588903 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:07.589099 kubelet[2850]: E0317 17:56:07.588625 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.589099 kubelet[2850]: E0317 17:56:07.588936 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" Mar 17 17:56:07.589184 kubelet[2850]: E0317 17:56:07.588965 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:07.589184 kubelet[2850]: E0317 17:56:07.588979 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:07.589184 kubelet[2850]: E0317 17:56:07.588996 2850 
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" podUID="411709d2-a807-4f34-9412-d952d186c81f" Mar 17 17:56:07.640672 containerd[1540]: time="2025-03-17T17:56:07.640639649Z" level=error msg="Failed to destroy network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.642414 containerd[1540]: time="2025-03-17T17:56:07.641289111Z" level=error msg="encountered an error cleaning up failed sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.642414 containerd[1540]: time="2025-03-17T17:56:07.641329409Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.642578 kubelet[2850]: E0317 17:56:07.642547 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.642694 kubelet[2850]: E0317 17:56:07.642638 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:07.643577 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03-shm.mount: Deactivated successfully. Mar 17 17:56:07.643684 systemd[1]: run-netns-cni\x2d71d8d83a\x2d1ead\x2da189\x2d78ea\x2d12036001b093.mount: Deactivated successfully. Mar 17 17:56:07.643718 systemd[1]: run-netns-cni\x2d937aadb1\x2d844d\x2da1f5\x2dae05\x2d45d1fa62fd56.mount: Deactivated successfully. Mar 17 17:56:07.643747 systemd[1]: run-netns-cni\x2d9a2363e7\x2d1b7a\x2dadf3\x2d2074\x2dca7401c44b5b.mount: Deactivated successfully. Mar 17 17:56:07.643789 systemd[1]: run-netns-cni\x2ddd5a4dd4\x2d9a29\x2ddaab\x2d1f37\x2d78f12d3c89ac.mount: Deactivated successfully. 
Mar 17 17:56:07.644056 kubelet[2850]: E0317 17:56:07.643851 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:07.644056 kubelet[2850]: E0317 17:56:07.643885 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-4ntmw" podUID="f3d38051-dd31-4085-95b4-5054901044b2" Mar 17 17:56:07.646397 containerd[1540]: time="2025-03-17T17:56:07.646373571Z" level=error msg="Failed to destroy network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.646600 containerd[1540]: time="2025-03-17T17:56:07.646584318Z" level=error msg="encountered an error cleaning up failed sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.646629 containerd[1540]: time="2025-03-17T17:56:07.646619012Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.647292 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57-shm.mount: Deactivated successfully. Mar 17 17:56:07.648865 kubelet[2850]: E0317 17:56:07.647814 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.648865 kubelet[2850]: E0317 17:56:07.647849 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:07.648865 kubelet[2850]: E0317 17:56:07.647861 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:07.648940 kubelet[2850]: E0317 17:56:07.647883 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-v59bf" podUID="46b937aa-d1db-4705-9bec-d4bd7aeaeceb" Mar 17 17:56:07.649870 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4-shm.mount: Deactivated successfully. 
Mar 17 17:56:07.651046 containerd[1540]: time="2025-03-17T17:56:07.651028482Z" level=error msg="Failed to destroy network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.651685 containerd[1540]: time="2025-03-17T17:56:07.651667866Z" level=error msg="encountered an error cleaning up failed sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.651725 containerd[1540]: time="2025-03-17T17:56:07.651713150Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.652670 kubelet[2850]: E0317 17:56:07.652652 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:07.652704 kubelet[2850]: E0317 17:56:07.652692 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:07.652729 kubelet[2850]: E0317 17:56:07.652712 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:07.652768 kubelet[2850]: E0317 17:56:07.652743 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:56:07.652843 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017-shm.mount: Deactivated successfully. 
Mar 17 17:56:08.577916 kubelet[2850]: I0317 17:56:08.577892 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634" Mar 17 17:56:08.579294 containerd[1540]: time="2025-03-17T17:56:08.579269786Z" level=info msg="StopPodSandbox for \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\"" Mar 17 17:56:08.580132 containerd[1540]: time="2025-03-17T17:56:08.579422310Z" level=info msg="Ensure that sandbox d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634 in task-service has been cleanup successfully" Mar 17 17:56:08.580809 systemd[1]: run-netns-cni\x2dec5c4ee9\x2dbe18\x2d686b\x2dcb11\x2d30d961383e15.mount: Deactivated successfully. Mar 17 17:56:08.581579 containerd[1540]: time="2025-03-17T17:56:08.581412883Z" level=info msg="TearDown network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" successfully" Mar 17 17:56:08.581579 containerd[1540]: time="2025-03-17T17:56:08.581426978Z" level=info msg="StopPodSandbox for \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" returns successfully" Mar 17 17:56:08.582273 containerd[1540]: time="2025-03-17T17:56:08.582180094Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\"" Mar 17 17:56:08.582273 containerd[1540]: time="2025-03-17T17:56:08.582258302Z" level=info msg="TearDown network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" successfully" Mar 17 17:56:08.582273 containerd[1540]: time="2025-03-17T17:56:08.582265416Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" returns successfully" Mar 17 17:56:08.582623 containerd[1540]: time="2025-03-17T17:56:08.582607069Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" Mar 17 17:56:08.582705 containerd[1540]: 
time="2025-03-17T17:56:08.582659884Z" level=info msg="TearDown network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" successfully" Mar 17 17:56:08.582705 containerd[1540]: time="2025-03-17T17:56:08.582666926Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" returns successfully" Mar 17 17:56:08.582950 containerd[1540]: time="2025-03-17T17:56:08.582929065Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:08.583002 containerd[1540]: time="2025-03-17T17:56:08.582983813Z" level=info msg="TearDown network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" successfully" Mar 17 17:56:08.583002 containerd[1540]: time="2025-03-17T17:56:08.582990268Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" returns successfully" Mar 17 17:56:08.583215 kubelet[2850]: I0317 17:56:08.583196 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57" Mar 17 17:56:08.583542 containerd[1540]: time="2025-03-17T17:56:08.583524046Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:4,}" Mar 17 17:56:08.584008 containerd[1540]: time="2025-03-17T17:56:08.583994530Z" level=info msg="StopPodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\"" Mar 17 17:56:08.584127 containerd[1540]: time="2025-03-17T17:56:08.584098843Z" level=info msg="Ensure that sandbox d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57 in task-service has been cleanup successfully" Mar 17 17:56:08.584226 containerd[1540]: time="2025-03-17T17:56:08.584211040Z" level=info msg="TearDown network for sandbox 
\"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" successfully" Mar 17 17:56:08.584226 containerd[1540]: time="2025-03-17T17:56:08.584222851Z" level=info msg="StopPodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" returns successfully" Mar 17 17:56:08.584475 containerd[1540]: time="2025-03-17T17:56:08.584461130Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\"" Mar 17 17:56:08.584523 containerd[1540]: time="2025-03-17T17:56:08.584512293Z" level=info msg="TearDown network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" successfully" Mar 17 17:56:08.584523 containerd[1540]: time="2025-03-17T17:56:08.584520291Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" returns successfully" Mar 17 17:56:08.584712 containerd[1540]: time="2025-03-17T17:56:08.584698343Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" Mar 17 17:56:08.584749 containerd[1540]: time="2025-03-17T17:56:08.584738798Z" level=info msg="TearDown network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" successfully" Mar 17 17:56:08.584749 containerd[1540]: time="2025-03-17T17:56:08.584746643Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" returns successfully" Mar 17 17:56:08.584892 containerd[1540]: time="2025-03-17T17:56:08.584878898Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 17:56:08.584940 containerd[1540]: time="2025-03-17T17:56:08.584928684Z" level=info msg="TearDown network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" successfully" Mar 17 17:56:08.584940 containerd[1540]: time="2025-03-17T17:56:08.584936460Z" level=info msg="StopPodSandbox for 
\"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" returns successfully" Mar 17 17:56:08.585399 containerd[1540]: time="2025-03-17T17:56:08.585357077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:4,}" Mar 17 17:56:08.586001 kubelet[2850]: I0317 17:56:08.585988 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4" Mar 17 17:56:08.586343 containerd[1540]: time="2025-03-17T17:56:08.586312960Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\"" Mar 17 17:56:08.586427 containerd[1540]: time="2025-03-17T17:56:08.586410530Z" level=info msg="Ensure that sandbox f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4 in task-service has been cleanup successfully" Mar 17 17:56:08.586545 containerd[1540]: time="2025-03-17T17:56:08.586533469Z" level=info msg="TearDown network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" successfully" Mar 17 17:56:08.586589 containerd[1540]: time="2025-03-17T17:56:08.586544818Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" returns successfully" Mar 17 17:56:08.586754 containerd[1540]: time="2025-03-17T17:56:08.586739513Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\"" Mar 17 17:56:08.586806 containerd[1540]: time="2025-03-17T17:56:08.586796534Z" level=info msg="TearDown network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" successfully" Mar 17 17:56:08.586830 containerd[1540]: time="2025-03-17T17:56:08.586813542Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" returns successfully" Mar 17 17:56:08.587089 
containerd[1540]: time="2025-03-17T17:56:08.587049704Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\"" Mar 17 17:56:08.587117 containerd[1540]: time="2025-03-17T17:56:08.587103492Z" level=info msg="TearDown network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" successfully" Mar 17 17:56:08.587117 containerd[1540]: time="2025-03-17T17:56:08.587109541Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" returns successfully" Mar 17 17:56:08.587481 containerd[1540]: time="2025-03-17T17:56:08.587466372Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\"" Mar 17 17:56:08.587519 containerd[1540]: time="2025-03-17T17:56:08.587506262Z" level=info msg="TearDown network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully" Mar 17 17:56:08.587519 containerd[1540]: time="2025-03-17T17:56:08.587515356Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully" Mar 17 17:56:08.587791 containerd[1540]: time="2025-03-17T17:56:08.587764804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:4,}" Mar 17 17:56:08.588024 kubelet[2850]: I0317 17:56:08.588015 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017" Mar 17 17:56:08.591332 containerd[1540]: time="2025-03-17T17:56:08.591319434Z" level=info msg="StopPodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\"" Mar 17 17:56:08.591420 containerd[1540]: time="2025-03-17T17:56:08.591405775Z" level=info msg="Ensure that sandbox b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017 in 
task-service has been cleanup successfully" Mar 17 17:56:08.591506 containerd[1540]: time="2025-03-17T17:56:08.591491040Z" level=info msg="TearDown network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" successfully" Mar 17 17:56:08.591536 containerd[1540]: time="2025-03-17T17:56:08.591504025Z" level=info msg="StopPodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" returns successfully" Mar 17 17:56:08.591782 containerd[1540]: time="2025-03-17T17:56:08.591769753Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\"" Mar 17 17:56:08.591824 containerd[1540]: time="2025-03-17T17:56:08.591812285Z" level=info msg="TearDown network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" successfully" Mar 17 17:56:08.591824 containerd[1540]: time="2025-03-17T17:56:08.591821380Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" returns successfully" Mar 17 17:56:08.592036 containerd[1540]: time="2025-03-17T17:56:08.592023734Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\"" Mar 17 17:56:08.592063 containerd[1540]: time="2025-03-17T17:56:08.592059101Z" level=info msg="TearDown network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" successfully" Mar 17 17:56:08.592086 containerd[1540]: time="2025-03-17T17:56:08.592064469Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" returns successfully" Mar 17 17:56:08.592258 containerd[1540]: time="2025-03-17T17:56:08.592243740Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\"" Mar 17 17:56:08.592326 containerd[1540]: time="2025-03-17T17:56:08.592279556Z" level=info msg="TearDown network for sandbox 
\"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" successfully" Mar 17 17:56:08.592326 containerd[1540]: time="2025-03-17T17:56:08.592284644Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" returns successfully" Mar 17 17:56:08.592765 kubelet[2850]: I0317 17:56:08.592610 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03" Mar 17 17:56:08.592803 containerd[1540]: time="2025-03-17T17:56:08.592627586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:4,}" Mar 17 17:56:08.592823 containerd[1540]: time="2025-03-17T17:56:08.592815067Z" level=info msg="StopPodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\"" Mar 17 17:56:08.592924 containerd[1540]: time="2025-03-17T17:56:08.592909347Z" level=info msg="Ensure that sandbox 268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03 in task-service has been cleanup successfully" Mar 17 17:56:08.593010 containerd[1540]: time="2025-03-17T17:56:08.592997368Z" level=info msg="TearDown network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" successfully" Mar 17 17:56:08.593038 containerd[1540]: time="2025-03-17T17:56:08.593013638Z" level=info msg="StopPodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" returns successfully" Mar 17 17:56:08.593248 containerd[1540]: time="2025-03-17T17:56:08.593234601Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\"" Mar 17 17:56:08.593295 containerd[1540]: time="2025-03-17T17:56:08.593284264Z" level=info msg="TearDown network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" successfully" Mar 17 17:56:08.593295 containerd[1540]: 
time="2025-03-17T17:56:08.593293073Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" returns successfully" Mar 17 17:56:08.593526 containerd[1540]: time="2025-03-17T17:56:08.593512509Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" Mar 17 17:56:08.593578 containerd[1540]: time="2025-03-17T17:56:08.593567392Z" level=info msg="TearDown network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" successfully" Mar 17 17:56:08.593578 containerd[1540]: time="2025-03-17T17:56:08.593575885Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" returns successfully" Mar 17 17:56:08.594179 containerd[1540]: time="2025-03-17T17:56:08.593878110Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:08.594179 containerd[1540]: time="2025-03-17T17:56:08.593912261Z" level=info msg="TearDown network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" successfully" Mar 17 17:56:08.594179 containerd[1540]: time="2025-03-17T17:56:08.593917514Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" returns successfully" Mar 17 17:56:08.596098 containerd[1540]: time="2025-03-17T17:56:08.595995703Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:56:08.596379 kubelet[2850]: I0317 17:56:08.596364 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13" Mar 17 17:56:08.597122 containerd[1540]: time="2025-03-17T17:56:08.597061268Z" level=info msg="StopPodSandbox for 
\"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\"" Mar 17 17:56:08.597207 containerd[1540]: time="2025-03-17T17:56:08.597154578Z" level=info msg="Ensure that sandbox b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13 in task-service has been cleanup successfully" Mar 17 17:56:08.597365 containerd[1540]: time="2025-03-17T17:56:08.597256838Z" level=info msg="TearDown network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" successfully" Mar 17 17:56:08.597365 containerd[1540]: time="2025-03-17T17:56:08.597265893Z" level=info msg="StopPodSandbox for \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" returns successfully" Mar 17 17:56:08.597564 containerd[1540]: time="2025-03-17T17:56:08.597513996Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\"" Mar 17 17:56:08.598500 containerd[1540]: time="2025-03-17T17:56:08.597551678Z" level=info msg="TearDown network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" successfully" Mar 17 17:56:08.598500 containerd[1540]: time="2025-03-17T17:56:08.598457949Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" returns successfully" Mar 17 17:56:08.610879 containerd[1540]: time="2025-03-17T17:56:08.610789198Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\"" Mar 17 17:56:08.610879 containerd[1540]: time="2025-03-17T17:56:08.610841174Z" level=info msg="TearDown network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" successfully" Mar 17 17:56:08.610879 containerd[1540]: time="2025-03-17T17:56:08.610848365Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" returns successfully" Mar 17 17:56:08.637230 systemd[1]: 
run-netns-cni\x2da4d88ba4\x2dfad2\x2d9da4\x2dc4e0\x2dda76b0fa97f7.mount: Deactivated successfully. Mar 17 17:56:08.637291 systemd[1]: run-netns-cni\x2dbc2ac7aa\x2d9c01\x2dd6ae\x2dae7f\x2d3a5d41c92815.mount: Deactivated successfully. Mar 17 17:56:08.637323 systemd[1]: run-netns-cni\x2dab02f0d2\x2d35ff\x2d08f7\x2de9ec\x2da6da7c5ff023.mount: Deactivated successfully. Mar 17 17:56:08.637354 systemd[1]: run-netns-cni\x2d88f90ad4\x2ddbce\x2d7dc0\x2d9ae7\x2d1a94d18c82e6.mount: Deactivated successfully. Mar 17 17:56:08.637384 systemd[1]: run-netns-cni\x2dc48de6d3\x2d8573\x2d207b\x2db0f9\x2d11cae7ed605a.mount: Deactivated successfully. Mar 17 17:56:08.662324 containerd[1540]: time="2025-03-17T17:56:08.662271686Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\"" Mar 17 17:56:08.662452 containerd[1540]: time="2025-03-17T17:56:08.662342614Z" level=info msg="TearDown network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" successfully" Mar 17 17:56:08.662452 containerd[1540]: time="2025-03-17T17:56:08.662349690Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" returns successfully" Mar 17 17:56:08.665525 containerd[1540]: time="2025-03-17T17:56:08.665380788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:56:08.939146 containerd[1540]: time="2025-03-17T17:56:08.939103552Z" level=error msg="Failed to destroy network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:08.939506 containerd[1540]: time="2025-03-17T17:56:08.939481479Z" level=error msg="encountered an error 
cleaning up failed sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:08.939634 containerd[1540]: time="2025-03-17T17:56:08.939588457Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:08.940627 kubelet[2850]: E0317 17:56:08.940601 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:08.940687 kubelet[2850]: E0317 17:56:08.940640 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:08.940687 kubelet[2850]: E0317 17:56:08.940654 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:08.940736 kubelet[2850]: E0317 17:56:08.940681 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-v59bf" podUID="46b937aa-d1db-4705-9bec-d4bd7aeaeceb" Mar 17 17:56:08.941913 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911-shm.mount: Deactivated successfully. 
Mar 17 17:56:08.949554 containerd[1540]: time="2025-03-17T17:56:08.949524379Z" level=error msg="Failed to destroy network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:08.949764 containerd[1540]: time="2025-03-17T17:56:08.949735881Z" level=error msg="encountered an error cleaning up failed sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:08.949818 containerd[1540]: time="2025-03-17T17:56:08.949774956Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:08.951362 kubelet[2850]: E0317 17:56:08.949893 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:08.951362 kubelet[2850]: E0317 17:56:08.949928 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:08.951362 kubelet[2850]: E0317 17:56:08.949941 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:08.950912 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4-shm.mount: Deactivated successfully. 
Mar 17 17:56:08.951550 kubelet[2850]: E0317 17:56:08.949965 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" Mar 17 17:56:09.135345 containerd[1540]: time="2025-03-17T17:56:09.135200044Z" level=error msg="Failed to destroy network for sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.135531 containerd[1540]: time="2025-03-17T17:56:09.135513448Z" level=error msg="encountered an error cleaning up failed sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.135602 containerd[1540]: time="2025-03-17T17:56:09.135550112Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox 
\"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.136216 kubelet[2850]: E0317 17:56:09.135686 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.136216 kubelet[2850]: E0317 17:56:09.135723 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:09.136216 kubelet[2850]: E0317 17:56:09.135737 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:09.136327 kubelet[2850]: E0317 17:56:09.135787 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-4ntmw" podUID="f3d38051-dd31-4085-95b4-5054901044b2" Mar 17 17:56:09.137524 containerd[1540]: time="2025-03-17T17:56:09.137265620Z" level=error msg="Failed to destroy network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.137524 containerd[1540]: time="2025-03-17T17:56:09.137494056Z" level=error msg="encountered an error cleaning up failed sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.137693 containerd[1540]: time="2025-03-17T17:56:09.137678236Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.138204 kubelet[2850]: E0317 17:56:09.138026 2850 remote_runtime.go:193] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.138204 kubelet[2850]: E0317 17:56:09.138064 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:09.138204 kubelet[2850]: E0317 17:56:09.138078 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:09.138294 kubelet[2850]: E0317 17:56:09.138102 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" podUID="411709d2-a807-4f34-9412-d952d186c81f" Mar 17 17:56:09.142247 containerd[1540]: time="2025-03-17T17:56:09.142227277Z" level=error msg="Failed to destroy network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.142685 containerd[1540]: time="2025-03-17T17:56:09.142310802Z" level=error msg="Failed to destroy network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.143384 containerd[1540]: time="2025-03-17T17:56:09.143365509Z" level=error msg="encountered an error cleaning up failed sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.143419 containerd[1540]: time="2025-03-17T17:56:09.143402752Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.143583 
containerd[1540]: time="2025-03-17T17:56:09.142836663Z" level=error msg="encountered an error cleaning up failed sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.143583 containerd[1540]: time="2025-03-17T17:56:09.143462414Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.143653 kubelet[2850]: E0317 17:56:09.143554 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.143653 kubelet[2850]: E0317 17:56:09.143633 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:09.143653 kubelet[2850]: E0317 17:56:09.143648 2850 
kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:09.144050 kubelet[2850]: E0317 17:56:09.143670 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:56:09.144050 kubelet[2850]: E0317 17:56:09.143554 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:09.144050 kubelet[2850]: E0317 17:56:09.143952 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:09.144142 kubelet[2850]: E0317 17:56:09.143965 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:09.144142 kubelet[2850]: E0317 17:56:09.144002 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" podUID="e62b9bfa-d049-4110-b093-e476a26ef5be" Mar 17 17:56:09.598781 kubelet[2850]: I0317 17:56:09.598743 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b" Mar 17 17:56:09.599420 containerd[1540]: time="2025-03-17T17:56:09.599306373Z" level=info msg="StopPodSandbox for \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\"" Mar 17 17:56:09.600243 containerd[1540]: 
time="2025-03-17T17:56:09.600075220Z" level=info msg="Ensure that sandbox e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b in task-service has been cleanup successfully" Mar 17 17:56:09.600847 kubelet[2850]: I0317 17:56:09.600640 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4" Mar 17 17:56:09.601212 containerd[1540]: time="2025-03-17T17:56:09.600967969Z" level=info msg="StopPodSandbox for \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\"" Mar 17 17:56:09.601398 containerd[1540]: time="2025-03-17T17:56:09.601388165Z" level=info msg="Ensure that sandbox 87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4 in task-service has been cleanup successfully" Mar 17 17:56:09.601832 containerd[1540]: time="2025-03-17T17:56:09.601822072Z" level=info msg="TearDown network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\" successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.602017276Z" level=info msg="StopPodSandbox for \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\" returns successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.601476422Z" level=info msg="TearDown network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\" successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.602197153Z" level=info msg="StopPodSandbox for \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\" returns successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.602424747Z" level=info msg="StopPodSandbox for \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\"" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.602860928Z" level=info msg="StopPodSandbox for 
\"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\"" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.602927609Z" level=info msg="TearDown network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.602936919Z" level=info msg="StopPodSandbox for \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" returns successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.603213348Z" level=info msg="TearDown network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.603242056Z" level=info msg="StopPodSandbox for \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" returns successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.603509922Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\"" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.603546866Z" level=info msg="TearDown network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.603553496Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" returns successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.604230537Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.604382688Z" level=info msg="TearDown network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" successfully" Mar 17 17:56:09.604725 containerd[1540]: time="2025-03-17T17:56:09.604393909Z" level=info msg="StopPodSandbox for 
\"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" returns successfully" Mar 17 17:56:09.604978 kubelet[2850]: I0317 17:56:09.604055 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf" Mar 17 17:56:09.638384 systemd[1]: run-netns-cni\x2d0aa08f43\x2dc103\x2df8fc\x2d0384\x2db3fa0acc9051.mount: Deactivated successfully. Mar 17 17:56:09.673880 containerd[1540]: time="2025-03-17T17:56:09.673664098Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\"" Mar 17 17:56:09.673880 containerd[1540]: time="2025-03-17T17:56:09.673738857Z" level=info msg="TearDown network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" successfully" Mar 17 17:56:09.673880 containerd[1540]: time="2025-03-17T17:56:09.673746292Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" returns successfully" Mar 17 17:56:09.673880 containerd[1540]: time="2025-03-17T17:56:09.673878703Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:09.674902 containerd[1540]: time="2025-03-17T17:56:09.673994138Z" level=info msg="TearDown network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" successfully" Mar 17 17:56:09.674902 containerd[1540]: time="2025-03-17T17:56:09.674005169Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" returns successfully" Mar 17 17:56:09.676629 containerd[1540]: time="2025-03-17T17:56:09.676254373Z" level=info msg="StopPodSandbox for \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\"" Mar 17 17:56:09.676629 containerd[1540]: time="2025-03-17T17:56:09.676383939Z" level=info msg="Ensure that sandbox 06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf in 
task-service has been cleanup successfully" Mar 17 17:56:09.677129 containerd[1540]: time="2025-03-17T17:56:09.676945458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:5,}" Mar 17 17:56:09.677129 containerd[1540]: time="2025-03-17T17:56:09.676991751Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\"" Mar 17 17:56:09.677129 containerd[1540]: time="2025-03-17T17:56:09.677035005Z" level=info msg="TearDown network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" successfully" Mar 17 17:56:09.677129 containerd[1540]: time="2025-03-17T17:56:09.677042085Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" returns successfully" Mar 17 17:56:09.677227 containerd[1540]: time="2025-03-17T17:56:09.677169990Z" level=info msg="TearDown network for sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\" successfully" Mar 17 17:56:09.677227 containerd[1540]: time="2025-03-17T17:56:09.677176742Z" level=info msg="StopPodSandbox for \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\" returns successfully" Mar 17 17:56:09.679928 systemd[1]: run-netns-cni\x2d7c3de98f\x2d0936\x2dea17\x2d1c95\x2d100d62fdeb5a.mount: Deactivated successfully. 
Mar 17 17:56:09.680188 containerd[1540]: time="2025-03-17T17:56:09.680165688Z" level=info msg="StopPodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\"" Mar 17 17:56:09.680235 containerd[1540]: time="2025-03-17T17:56:09.680229013Z" level=info msg="TearDown network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" successfully" Mar 17 17:56:09.680235 containerd[1540]: time="2025-03-17T17:56:09.680236353Z" level=info msg="StopPodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" returns successfully" Mar 17 17:56:09.680420 containerd[1540]: time="2025-03-17T17:56:09.680404866Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\"" Mar 17 17:56:09.680462 containerd[1540]: time="2025-03-17T17:56:09.680443114Z" level=info msg="TearDown network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" successfully" Mar 17 17:56:09.680462 containerd[1540]: time="2025-03-17T17:56:09.680450586Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" returns successfully" Mar 17 17:56:09.682700 containerd[1540]: time="2025-03-17T17:56:09.682391573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:5,}" Mar 17 17:56:09.683016 containerd[1540]: time="2025-03-17T17:56:09.682954919Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\"" Mar 17 17:56:09.683016 containerd[1540]: time="2025-03-17T17:56:09.683004400Z" level=info msg="TearDown network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" successfully" Mar 17 17:56:09.683016 containerd[1540]: time="2025-03-17T17:56:09.683012126Z" level=info msg="StopPodSandbox for 
\"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" returns successfully" Mar 17 17:56:09.683332 containerd[1540]: time="2025-03-17T17:56:09.683313982Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" Mar 17 17:56:09.683392 containerd[1540]: time="2025-03-17T17:56:09.683377496Z" level=info msg="TearDown network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" successfully" Mar 17 17:56:09.683392 containerd[1540]: time="2025-03-17T17:56:09.683389513Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" returns successfully" Mar 17 17:56:09.683602 kubelet[2850]: I0317 17:56:09.683546 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911" Mar 17 17:56:09.684874 containerd[1540]: time="2025-03-17T17:56:09.684792007Z" level=info msg="StopPodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\"" Mar 17 17:56:09.684958 containerd[1540]: time="2025-03-17T17:56:09.684941143Z" level=info msg="Ensure that sandbox 27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911 in task-service has been cleanup successfully" Mar 17 17:56:09.687069 systemd[1]: run-netns-cni\x2d165b122a\x2d9e9d\x2d1530\x2dac2f\x2d12d519c6b243.mount: Deactivated successfully. 
Mar 17 17:56:09.687167 containerd[1540]: time="2025-03-17T17:56:09.687107365Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 17:56:09.687213 containerd[1540]: time="2025-03-17T17:56:09.687169355Z" level=info msg="TearDown network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" successfully" Mar 17 17:56:09.687213 containerd[1540]: time="2025-03-17T17:56:09.687179824Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" returns successfully" Mar 17 17:56:09.687735 containerd[1540]: time="2025-03-17T17:56:09.687715994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:5,}" Mar 17 17:56:09.688709 containerd[1540]: time="2025-03-17T17:56:09.688383033Z" level=info msg="TearDown network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" successfully" Mar 17 17:56:09.688709 containerd[1540]: time="2025-03-17T17:56:09.688399459Z" level=info msg="StopPodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" returns successfully" Mar 17 17:56:09.690011 containerd[1540]: time="2025-03-17T17:56:09.689989526Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\"" Mar 17 17:56:09.690095 containerd[1540]: time="2025-03-17T17:56:09.690074230Z" level=info msg="TearDown network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" successfully" Mar 17 17:56:09.690095 containerd[1540]: time="2025-03-17T17:56:09.690089648Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" returns successfully" Mar 17 17:56:09.690818 containerd[1540]: time="2025-03-17T17:56:09.690794514Z" level=info msg="StopPodSandbox for 
\"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\"" Mar 17 17:56:09.690895 containerd[1540]: time="2025-03-17T17:56:09.690864168Z" level=info msg="TearDown network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" successfully" Mar 17 17:56:09.690895 containerd[1540]: time="2025-03-17T17:56:09.690882852Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" returns successfully" Mar 17 17:56:09.693742 containerd[1540]: time="2025-03-17T17:56:09.693716809Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\"" Mar 17 17:56:09.693846 containerd[1540]: time="2025-03-17T17:56:09.693792176Z" level=info msg="TearDown network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" successfully" Mar 17 17:56:09.693846 containerd[1540]: time="2025-03-17T17:56:09.693826063Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" returns successfully" Mar 17 17:56:09.694815 containerd[1540]: time="2025-03-17T17:56:09.694728312Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\"" Mar 17 17:56:09.694815 containerd[1540]: time="2025-03-17T17:56:09.694795317Z" level=info msg="TearDown network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully" Mar 17 17:56:09.694815 containerd[1540]: time="2025-03-17T17:56:09.694805532Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully" Mar 17 17:56:09.695249 containerd[1540]: time="2025-03-17T17:56:09.695229970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:5,}" Mar 17 17:56:09.698457 kubelet[2850]: I0317 17:56:09.695656 2850 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32" Mar 17 17:56:09.698288 systemd[1]: run-netns-cni\x2d46a846f7\x2dce9a\x2d464f\x2d7d8e\x2d1e0e31203e60.mount: Deactivated successfully. Mar 17 17:56:09.698597 containerd[1540]: time="2025-03-17T17:56:09.696192334Z" level=info msg="StopPodSandbox for \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\"" Mar 17 17:56:09.698597 containerd[1540]: time="2025-03-17T17:56:09.696377113Z" level=info msg="Ensure that sandbox d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32 in task-service has been cleanup successfully" Mar 17 17:56:09.701282 containerd[1540]: time="2025-03-17T17:56:09.701264641Z" level=info msg="TearDown network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\" successfully" Mar 17 17:56:09.701333 containerd[1540]: time="2025-03-17T17:56:09.701325813Z" level=info msg="StopPodSandbox for \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\" returns successfully" Mar 17 17:56:09.701806 containerd[1540]: time="2025-03-17T17:56:09.701795868Z" level=info msg="StopPodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\"" Mar 17 17:56:09.701899 containerd[1540]: time="2025-03-17T17:56:09.701886872Z" level=info msg="TearDown network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" successfully" Mar 17 17:56:09.701933 containerd[1540]: time="2025-03-17T17:56:09.701927214Z" level=info msg="StopPodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" returns successfully" Mar 17 17:56:09.705772 containerd[1540]: time="2025-03-17T17:56:09.705755875Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\"" Mar 17 17:56:09.705901 containerd[1540]: time="2025-03-17T17:56:09.705887659Z" level=info msg="TearDown network 
for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" successfully" Mar 17 17:56:09.705945 containerd[1540]: time="2025-03-17T17:56:09.705938462Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" returns successfully" Mar 17 17:56:09.706494 kubelet[2850]: I0317 17:56:09.706476 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae" Mar 17 17:56:09.707408 containerd[1540]: time="2025-03-17T17:56:09.707391044Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\"" Mar 17 17:56:09.707505 containerd[1540]: time="2025-03-17T17:56:09.707469181Z" level=info msg="TearDown network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" successfully" Mar 17 17:56:09.707531 containerd[1540]: time="2025-03-17T17:56:09.707515411Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" returns successfully" Mar 17 17:56:09.707595 containerd[1540]: time="2025-03-17T17:56:09.707581199Z" level=info msg="StopPodSandbox for \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\"" Mar 17 17:56:09.707825 containerd[1540]: time="2025-03-17T17:56:09.707810237Z" level=info msg="Ensure that sandbox 1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae in task-service has been cleanup successfully" Mar 17 17:56:09.709570 containerd[1540]: time="2025-03-17T17:56:09.707969243Z" level=info msg="TearDown network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\" successfully" Mar 17 17:56:09.709570 containerd[1540]: time="2025-03-17T17:56:09.707982215Z" level=info msg="StopPodSandbox for \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\" returns successfully" Mar 17 17:56:09.709570 containerd[1540]: 
time="2025-03-17T17:56:09.709519242Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\"" Mar 17 17:56:09.709437 systemd[1]: run-netns-cni\x2d1a24cadc\x2d41c0\x2dadd0\x2dc9ee\x2d9543d0b3541c.mount: Deactivated successfully. Mar 17 17:56:09.711326 containerd[1540]: time="2025-03-17T17:56:09.710334962Z" level=info msg="TearDown network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" successfully" Mar 17 17:56:09.711326 containerd[1540]: time="2025-03-17T17:56:09.710348707Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" returns successfully" Mar 17 17:56:09.711326 containerd[1540]: time="2025-03-17T17:56:09.710527299Z" level=info msg="StopPodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\"" Mar 17 17:56:09.711326 containerd[1540]: time="2025-03-17T17:56:09.711087293Z" level=info msg="TearDown network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" successfully" Mar 17 17:56:09.712341 containerd[1540]: time="2025-03-17T17:56:09.711098843Z" level=info msg="StopPodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" returns successfully" Mar 17 17:56:09.712906 containerd[1540]: time="2025-03-17T17:56:09.712743419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:5,}" Mar 17 17:56:09.713037 containerd[1540]: time="2025-03-17T17:56:09.713019057Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\"" Mar 17 17:56:09.713107 containerd[1540]: time="2025-03-17T17:56:09.713091259Z" level=info msg="TearDown network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" successfully" Mar 17 17:56:09.713107 containerd[1540]: time="2025-03-17T17:56:09.713103388Z" 
level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" returns successfully" Mar 17 17:56:09.713312 containerd[1540]: time="2025-03-17T17:56:09.713262737Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" Mar 17 17:56:09.713434 containerd[1540]: time="2025-03-17T17:56:09.713371958Z" level=info msg="TearDown network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" successfully" Mar 17 17:56:09.713434 containerd[1540]: time="2025-03-17T17:56:09.713395309Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" returns successfully" Mar 17 17:56:09.713708 containerd[1540]: time="2025-03-17T17:56:09.713551894Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:09.713708 containerd[1540]: time="2025-03-17T17:56:09.713676216Z" level=info msg="TearDown network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" successfully" Mar 17 17:56:09.713708 containerd[1540]: time="2025-03-17T17:56:09.713685794Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" returns successfully" Mar 17 17:56:09.714078 containerd[1540]: time="2025-03-17T17:56:09.713946104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:5,}" Mar 17 17:56:11.844344 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2612008026.mount: Deactivated successfully. 
Mar 17 17:56:12.456673 containerd[1540]: time="2025-03-17T17:56:12.454439500Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 17 17:56:12.456673 containerd[1540]: time="2025-03-17T17:56:12.454724202Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:12.475969 containerd[1540]: time="2025-03-17T17:56:12.454617898Z" level=error msg="Failed to destroy network for sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.476585 containerd[1540]: time="2025-03-17T17:56:12.476175054Z" level=error msg="encountered an error cleaning up failed sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.476585 containerd[1540]: time="2025-03-17T17:56:12.476218553Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.477530 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7-shm.mount: Deactivated successfully. 
Mar 17 17:56:12.484300 containerd[1540]: time="2025-03-17T17:56:12.481400331Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:12.486425 containerd[1540]: time="2025-03-17T17:56:12.486402030Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:12.493238 kubelet[2850]: E0317 17:56:12.493166 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.493238 kubelet[2850]: E0317 17:56:12.493209 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:12.493238 kubelet[2850]: E0317 17:56:12.493227 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" Mar 17 17:56:12.494156 kubelet[2850]: E0317 17:56:12.493262 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c4bd45cb8-lpjjh_calico-system(4761939f-21ce-4484-88c7-08bcb4f65c5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" Mar 17 17:56:12.496990 containerd[1540]: time="2025-03-17T17:56:12.496339984Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 7.054579295s" Mar 17 17:56:12.496990 containerd[1540]: time="2025-03-17T17:56:12.496377064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 17 17:56:12.532152 containerd[1540]: time="2025-03-17T17:56:12.532120468Z" level=info msg="CreateContainer within sandbox \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 17:56:12.544997 containerd[1540]: time="2025-03-17T17:56:12.544965903Z" level=error msg="Failed to destroy network for sandbox 
\"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.546716 containerd[1540]: time="2025-03-17T17:56:12.545690289Z" level=error msg="encountered an error cleaning up failed sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.546716 containerd[1540]: time="2025-03-17T17:56:12.545729120Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.546813 kubelet[2850]: E0317 17:56:12.546773 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.546813 kubelet[2850]: E0317 17:56:12.546806 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:12.546862 kubelet[2850]: E0317 17:56:12.546823 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-4ntmw" Mar 17 17:56:12.546862 kubelet[2850]: E0317 17:56:12.546850 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-4ntmw_kube-system(f3d38051-dd31-4085-95b4-5054901044b2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-4ntmw" podUID="f3d38051-dd31-4085-95b4-5054901044b2" Mar 17 17:56:12.548436 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2-shm.mount: Deactivated successfully. 
Mar 17 17:56:12.556720 containerd[1540]: time="2025-03-17T17:56:12.556629647Z" level=error msg="Failed to destroy network for sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.557574 containerd[1540]: time="2025-03-17T17:56:12.557398405Z" level=error msg="encountered an error cleaning up failed sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.557574 containerd[1540]: time="2025-03-17T17:56:12.557440682Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.558982 kubelet[2850]: E0317 17:56:12.557590 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.558982 kubelet[2850]: E0317 17:56:12.557629 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:12.558982 kubelet[2850]: E0317 17:56:12.557642 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" Mar 17 17:56:12.559061 kubelet[2850]: E0317 17:56:12.557670 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-28hkf_calico-apiserver(e62b9bfa-d049-4110-b093-e476a26ef5be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" podUID="e62b9bfa-d049-4110-b093-e476a26ef5be" Mar 17 17:56:12.564550 containerd[1540]: time="2025-03-17T17:56:12.564291702Z" level=error msg="Failed to destroy network for sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.564550 containerd[1540]: time="2025-03-17T17:56:12.564471386Z" level=error msg="encountered an error cleaning up failed sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.564550 containerd[1540]: time="2025-03-17T17:56:12.564504351Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.564677 kubelet[2850]: E0317 17:56:12.564627 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.564677 kubelet[2850]: E0317 17:56:12.564668 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:12.564722 kubelet[2850]: E0317 17:56:12.564683 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2cntr" Mar 17 17:56:12.564722 kubelet[2850]: E0317 17:56:12.564707 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2cntr_calico-system(0d288c4c-94be-4e72-9025-53893bb68385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2cntr" podUID="0d288c4c-94be-4e72-9025-53893bb68385" Mar 17 17:56:12.566402 containerd[1540]: time="2025-03-17T17:56:12.566355211Z" level=error msg="Failed to destroy network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.566989 containerd[1540]: time="2025-03-17T17:56:12.566955773Z" level=error msg="encountered an error cleaning up failed sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.567075 containerd[1540]: time="2025-03-17T17:56:12.566990377Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.567524 kubelet[2850]: E0317 17:56:12.567142 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.567524 kubelet[2850]: E0317 17:56:12.567167 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:12.567524 kubelet[2850]: E0317 17:56:12.567178 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:12.567738 kubelet[2850]: E0317 17:56:12.567198 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-v59bf" podUID="46b937aa-d1db-4705-9bec-d4bd7aeaeceb" Mar 17 17:56:12.571228 containerd[1540]: time="2025-03-17T17:56:12.571001497Z" level=error msg="Failed to destroy network for sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.571228 containerd[1540]: time="2025-03-17T17:56:12.571152697Z" level=error msg="encountered an error cleaning up failed sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.571228 containerd[1540]: time="2025-03-17T17:56:12.571183755Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.571455 kubelet[2850]: E0317 17:56:12.571379 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.571455 kubelet[2850]: E0317 17:56:12.571402 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:12.571455 kubelet[2850]: E0317 17:56:12.571415 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" Mar 17 17:56:12.571521 kubelet[2850]: E0317 17:56:12.571435 2850 pod_workers.go:1298] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8466cfddd6-dpzgd_calico-apiserver(411709d2-a807-4f34-9412-d952d186c81f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" podUID="411709d2-a807-4f34-9412-d952d186c81f" Mar 17 17:56:12.668661 containerd[1540]: time="2025-03-17T17:56:12.668614795Z" level=info msg="CreateContainer within sandbox \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\"" Mar 17 17:56:12.669645 containerd[1540]: time="2025-03-17T17:56:12.669157015Z" level=info msg="StartContainer for \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\"" Mar 17 17:56:12.750728 systemd[1]: Started cri-containerd-8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43.scope - libcontainer container 8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43. 
Mar 17 17:56:12.785846 containerd[1540]: time="2025-03-17T17:56:12.785819549Z" level=info msg="StartContainer for \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\" returns successfully" Mar 17 17:56:12.789308 kubelet[2850]: I0317 17:56:12.788775 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0" Mar 17 17:56:12.789461 containerd[1540]: time="2025-03-17T17:56:12.789440911Z" level=info msg="StopPodSandbox for \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\"" Mar 17 17:56:12.790793 containerd[1540]: time="2025-03-17T17:56:12.790632506Z" level=info msg="Ensure that sandbox 02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0 in task-service has been cleanup successfully" Mar 17 17:56:12.790793 containerd[1540]: time="2025-03-17T17:56:12.790760718Z" level=info msg="TearDown network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\" successfully" Mar 17 17:56:12.790793 containerd[1540]: time="2025-03-17T17:56:12.790772179Z" level=info msg="StopPodSandbox for \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\" returns successfully" Mar 17 17:56:12.791752 containerd[1540]: time="2025-03-17T17:56:12.791678686Z" level=info msg="StopPodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\"" Mar 17 17:56:12.791752 containerd[1540]: time="2025-03-17T17:56:12.791721913Z" level=info msg="TearDown network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" successfully" Mar 17 17:56:12.791752 containerd[1540]: time="2025-03-17T17:56:12.791728188Z" level=info msg="StopPodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" returns successfully" Mar 17 17:56:12.792017 containerd[1540]: time="2025-03-17T17:56:12.791899106Z" level=info msg="StopPodSandbox for 
\"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\"" Mar 17 17:56:12.792017 containerd[1540]: time="2025-03-17T17:56:12.791938692Z" level=info msg="TearDown network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" successfully" Mar 17 17:56:12.792017 containerd[1540]: time="2025-03-17T17:56:12.791944454Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" returns successfully" Mar 17 17:56:12.796586 containerd[1540]: time="2025-03-17T17:56:12.792138798Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\"" Mar 17 17:56:12.796805 containerd[1540]: time="2025-03-17T17:56:12.796741891Z" level=info msg="TearDown network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" successfully" Mar 17 17:56:12.796805 containerd[1540]: time="2025-03-17T17:56:12.796754371Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" returns successfully" Mar 17 17:56:12.820297 containerd[1540]: time="2025-03-17T17:56:12.818768635Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\"" Mar 17 17:56:12.820297 containerd[1540]: time="2025-03-17T17:56:12.818843987Z" level=info msg="TearDown network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" successfully" Mar 17 17:56:12.820297 containerd[1540]: time="2025-03-17T17:56:12.818853218Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" returns successfully" Mar 17 17:56:12.837352 containerd[1540]: time="2025-03-17T17:56:12.837095047Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\"" Mar 17 17:56:12.837352 containerd[1540]: time="2025-03-17T17:56:12.837174676Z" level=info msg="TearDown network for sandbox 
\"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully" Mar 17 17:56:12.837352 containerd[1540]: time="2025-03-17T17:56:12.837182340Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully" Mar 17 17:56:12.890893 containerd[1540]: time="2025-03-17T17:56:12.890844501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:6,}" Mar 17 17:56:12.901249 kubelet[2850]: I0317 17:56:12.898581 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-7zjsl" podStartSLOduration=1.737110988 podStartE2EDuration="19.881862295s" podCreationTimestamp="2025-03-17 17:55:53 +0000 UTC" firstStartedPulling="2025-03-17 17:55:54.353915956 +0000 UTC m=+23.195055212" lastFinishedPulling="2025-03-17 17:56:12.49866726 +0000 UTC m=+41.339806519" observedRunningTime="2025-03-17 17:56:12.835104673 +0000 UTC m=+41.676243941" watchObservedRunningTime="2025-03-17 17:56:12.881862295 +0000 UTC m=+41.723001559" Mar 17 17:56:12.907990 kubelet[2850]: I0317 17:56:12.907971 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423" Mar 17 17:56:12.913666 containerd[1540]: time="2025-03-17T17:56:12.913642460Z" level=info msg="StopPodSandbox for \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\"" Mar 17 17:56:12.914610 containerd[1540]: time="2025-03-17T17:56:12.914596802Z" level=info msg="Ensure that sandbox 0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423 in task-service has been cleanup successfully" Mar 17 17:56:12.916727 containerd[1540]: time="2025-03-17T17:56:12.916597869Z" level=info msg="TearDown network for sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\" successfully" Mar 17 17:56:12.916727 
containerd[1540]: time="2025-03-17T17:56:12.916611471Z" level=info msg="StopPodSandbox for \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\" returns successfully" Mar 17 17:56:12.917273 containerd[1540]: time="2025-03-17T17:56:12.917258848Z" level=info msg="StopPodSandbox for \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\"" Mar 17 17:56:12.917329 containerd[1540]: time="2025-03-17T17:56:12.917318887Z" level=info msg="TearDown network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\" successfully" Mar 17 17:56:12.917329 containerd[1540]: time="2025-03-17T17:56:12.917327653Z" level=info msg="StopPodSandbox for \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\" returns successfully" Mar 17 17:56:12.917475 containerd[1540]: time="2025-03-17T17:56:12.917462466Z" level=info msg="StopPodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\"" Mar 17 17:56:12.917883 containerd[1540]: time="2025-03-17T17:56:12.917861934Z" level=info msg="TearDown network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" successfully" Mar 17 17:56:12.917883 containerd[1540]: time="2025-03-17T17:56:12.917878410Z" level=info msg="StopPodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" returns successfully" Mar 17 17:56:12.918175 containerd[1540]: time="2025-03-17T17:56:12.918101282Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\"" Mar 17 17:56:12.918296 containerd[1540]: time="2025-03-17T17:56:12.918283228Z" level=info msg="TearDown network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" successfully" Mar 17 17:56:12.918296 containerd[1540]: time="2025-03-17T17:56:12.918293236Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" returns successfully" Mar 17 17:56:12.918511 
containerd[1540]: time="2025-03-17T17:56:12.918494724Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\"" Mar 17 17:56:12.918552 containerd[1540]: time="2025-03-17T17:56:12.918540970Z" level=info msg="TearDown network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" successfully" Mar 17 17:56:12.918602 containerd[1540]: time="2025-03-17T17:56:12.918550645Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" returns successfully" Mar 17 17:56:12.919099 containerd[1540]: time="2025-03-17T17:56:12.919075241Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\"" Mar 17 17:56:12.919422 kubelet[2850]: I0317 17:56:12.919404 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3" Mar 17 17:56:12.919780 containerd[1540]: time="2025-03-17T17:56:12.919765328Z" level=info msg="TearDown network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" successfully" Mar 17 17:56:12.919780 containerd[1540]: time="2025-03-17T17:56:12.919776429Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" returns successfully" Mar 17 17:56:12.920877 containerd[1540]: time="2025-03-17T17:56:12.920330194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:6,}" Mar 17 17:56:12.920877 containerd[1540]: time="2025-03-17T17:56:12.920799309Z" level=info msg="StopPodSandbox for \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\"" Mar 17 17:56:12.920937 containerd[1540]: time="2025-03-17T17:56:12.920919232Z" level=info msg="Ensure that sandbox 3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3 in 
task-service has been cleanup successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.921423799Z" level=info msg="TearDown network for sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\" successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.921436229Z" level=info msg="StopPodSandbox for \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\" returns successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922186679Z" level=info msg="StopPodSandbox for \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\"" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922233635Z" level=info msg="TearDown network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\" successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922240174Z" level=info msg="StopPodSandbox for \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\" returns successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922420160Z" level=info msg="StopPodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\"" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922472145Z" level=info msg="TearDown network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922478684Z" level=info msg="StopPodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" returns successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922653431Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\"" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922688763Z" level=info msg="TearDown network for sandbox 
\"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.922694594Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" returns successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.923031032Z" level=info msg="StopPodSandbox for \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\"" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.923118015Z" level=info msg="Ensure that sandbox c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719 in task-service has been cleanup successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.923277435Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.923311779Z" level=info msg="TearDown network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" successfully" Mar 17 17:56:12.923667 containerd[1540]: time="2025-03-17T17:56:12.923317626Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" returns successfully" Mar 17 17:56:12.923972 kubelet[2850]: I0317 17:56:12.922773 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719" Mar 17 17:56:12.923997 containerd[1540]: time="2025-03-17T17:56:12.923782084Z" level=info msg="TearDown network for sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\" successfully" Mar 17 17:56:12.923997 containerd[1540]: time="2025-03-17T17:56:12.923791693Z" level=info msg="StopPodSandbox for \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\" returns successfully" Mar 17 17:56:12.925824 containerd[1540]: 
time="2025-03-17T17:56:12.925633530Z" level=info msg="StopPodSandbox for \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\"" Mar 17 17:56:12.925824 containerd[1540]: time="2025-03-17T17:56:12.925678513Z" level=info msg="TearDown network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\" successfully" Mar 17 17:56:12.925824 containerd[1540]: time="2025-03-17T17:56:12.925685865Z" level=info msg="StopPodSandbox for \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\" returns successfully" Mar 17 17:56:12.925824 containerd[1540]: time="2025-03-17T17:56:12.925768278Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:12.925824 containerd[1540]: time="2025-03-17T17:56:12.925802958Z" level=info msg="TearDown network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" successfully" Mar 17 17:56:12.925824 containerd[1540]: time="2025-03-17T17:56:12.925808627Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" returns successfully" Mar 17 17:56:12.933831 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 17:56:12.940789 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.925928507Z" level=info msg="StopPodSandbox for \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.925961939Z" level=info msg="TearDown network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.925967560Z" level=info msg="StopPodSandbox for \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.927118273Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.927166893Z" level=info msg="TearDown network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.927173851Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.927251351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:6,}" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.933022331Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.933073464Z" level=info msg="TearDown network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.933080391Z" level=info msg="StopPodSandbox for 
\"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.933247608Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.933282695Z" level=info msg="TearDown network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.933288322Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.933592990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:6,}" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.934683435Z" level=info msg="StopPodSandbox for \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.935154081Z" level=info msg="Ensure that sandbox ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7 in task-service has been cleanup successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.935618457Z" level=info msg="TearDown network for sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.935627197Z" level=info msg="StopPodSandbox for \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.936145708Z" level=info msg="StopPodSandbox for \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\"" Mar 17 17:56:12.940830 containerd[1540]: 
time="2025-03-17T17:56:12.936350167Z" level=info msg="TearDown network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.936362520Z" level=info msg="StopPodSandbox for \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.936855600Z" level=info msg="StopPodSandbox for \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.936892768Z" level=info msg="TearDown network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.936898828Z" level=info msg="StopPodSandbox for \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.937135530Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.937172310Z" level=info msg="TearDown network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.937178166Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.937404926Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.937456044Z" level=info msg="TearDown network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" successfully" Mar 17 17:56:12.940830 containerd[1540]: 
time="2025-03-17T17:56:12.937462488Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.937613919Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.937670820Z" level=info msg="TearDown network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.937678529Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" returns successfully" Mar 17 17:56:12.940830 containerd[1540]: time="2025-03-17T17:56:12.939683474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:6,}" Mar 17 17:56:12.942130 kubelet[2850]: I0317 17:56:12.934407 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7" Mar 17 17:56:12.965690 containerd[1540]: time="2025-03-17T17:56:12.965363448Z" level=error msg="Failed to destroy network for sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.965690 containerd[1540]: time="2025-03-17T17:56:12.965622360Z" level=error msg="encountered an error cleaning up failed sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.965690 containerd[1540]: time="2025-03-17T17:56:12.965656797Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.966117 kubelet[2850]: E0317 17:56:12.965926 2850 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:56:12.966117 kubelet[2850]: E0317 17:56:12.965960 2850 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:12.966117 kubelet[2850]: E0317 17:56:12.965974 2850 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-v59bf" Mar 17 17:56:12.966196 kubelet[2850]: E0317 17:56:12.965999 2850 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-v59bf_kube-system(46b937aa-d1db-4705-9bec-d4bd7aeaeceb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-v59bf" podUID="46b937aa-d1db-4705-9bec-d4bd7aeaeceb" Mar 17 17:56:12.971850 kubelet[2850]: I0317 17:56:12.971550 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2" Mar 17 17:56:12.972037 containerd[1540]: time="2025-03-17T17:56:12.972017493Z" level=info msg="StopPodSandbox for \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\"" Mar 17 17:56:12.972160 containerd[1540]: time="2025-03-17T17:56:12.972144089Z" level=info msg="Ensure that sandbox eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2 in task-service has been cleanup successfully" Mar 17 17:56:12.972259 containerd[1540]: time="2025-03-17T17:56:12.972246797Z" level=info msg="TearDown network for sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\" successfully" Mar 17 17:56:12.972259 containerd[1540]: time="2025-03-17T17:56:12.972256657Z" level=info msg="StopPodSandbox for \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\" returns successfully" Mar 17 17:56:12.972398 containerd[1540]: time="2025-03-17T17:56:12.972384160Z" level=info 
msg="StopPodSandbox for \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\"" Mar 17 17:56:12.972455 containerd[1540]: time="2025-03-17T17:56:12.972425217Z" level=info msg="TearDown network for sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\" successfully" Mar 17 17:56:12.972455 containerd[1540]: time="2025-03-17T17:56:12.972452427Z" level=info msg="StopPodSandbox for \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\" returns successfully" Mar 17 17:56:12.972604 containerd[1540]: time="2025-03-17T17:56:12.972588179Z" level=info msg="StopPodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\"" Mar 17 17:56:12.972647 containerd[1540]: time="2025-03-17T17:56:12.972640170Z" level=info msg="TearDown network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" successfully" Mar 17 17:56:12.972672 containerd[1540]: time="2025-03-17T17:56:12.972647903Z" level=info msg="StopPodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" returns successfully" Mar 17 17:56:12.972785 containerd[1540]: time="2025-03-17T17:56:12.972771370Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\"" Mar 17 17:56:12.972832 containerd[1540]: time="2025-03-17T17:56:12.972810971Z" level=info msg="TearDown network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" successfully" Mar 17 17:56:12.972832 containerd[1540]: time="2025-03-17T17:56:12.972830621Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" returns successfully" Mar 17 17:56:12.973023 containerd[1540]: time="2025-03-17T17:56:12.973008965Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" Mar 17 17:56:12.973052 containerd[1540]: time="2025-03-17T17:56:12.973046261Z" level=info msg="TearDown network for 
sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" successfully" Mar 17 17:56:12.973075 containerd[1540]: time="2025-03-17T17:56:12.973051672Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" returns successfully" Mar 17 17:56:12.973177 containerd[1540]: time="2025-03-17T17:56:12.973163438Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 17:56:12.973215 containerd[1540]: time="2025-03-17T17:56:12.973202832Z" level=info msg="TearDown network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" successfully" Mar 17 17:56:12.973215 containerd[1540]: time="2025-03-17T17:56:12.973213363Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" returns successfully" Mar 17 17:56:12.973477 containerd[1540]: time="2025-03-17T17:56:12.973463943Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:6,}" Mar 17 17:56:13.458359 systemd[1]: run-netns-cni\x2da86aae2b\x2d7954\x2df993\x2d56b6\x2db6e770efd723.mount: Deactivated successfully. Mar 17 17:56:13.458672 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3-shm.mount: Deactivated successfully. Mar 17 17:56:13.458744 systemd[1]: run-netns-cni\x2d430a021d\x2d2798\x2d6b33\x2d4a6a\x2ddb8561eecb51.mount: Deactivated successfully. Mar 17 17:56:13.458795 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423-shm.mount: Deactivated successfully. Mar 17 17:56:13.458850 systemd[1]: run-netns-cni\x2d12f9122e\x2d1840\x2d8159\x2d60f5\x2dbbf5890f4b3d.mount: Deactivated successfully. 
Mar 17 17:56:13.458900 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0-shm.mount: Deactivated successfully. Mar 17 17:56:13.458952 systemd[1]: run-netns-cni\x2dbc9a394f\x2d5c12\x2da38f\x2dacef\x2decc0dca21a29.mount: Deactivated successfully. Mar 17 17:56:13.459001 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719-shm.mount: Deactivated successfully. Mar 17 17:56:13.459048 systemd[1]: run-netns-cni\x2ddfa2d8bf\x2d2e07\x2d5fc4\x2da06f\x2d1a3f70979eaf.mount: Deactivated successfully. Mar 17 17:56:13.459096 systemd[1]: run-netns-cni\x2d70c89a9e\x2d154c\x2d70c1\x2da67b\x2dd17901fb67fd.mount: Deactivated successfully. Mar 17 17:56:13.975024 kubelet[2850]: I0317 17:56:13.975009 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878" Mar 17 17:56:13.976012 containerd[1540]: time="2025-03-17T17:56:13.975705613Z" level=info msg="StopPodSandbox for \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\"" Mar 17 17:56:13.976012 containerd[1540]: time="2025-03-17T17:56:13.975838706Z" level=info msg="Ensure that sandbox 0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878 in task-service has been cleanup successfully" Mar 17 17:56:13.977946 systemd[1]: run-netns-cni\x2d12f167a6\x2d02cb\x2d4fe4\x2d56f9\x2d270f94e1c217.mount: Deactivated successfully. 
Mar 17 17:56:13.979916 containerd[1540]: time="2025-03-17T17:56:13.979901186Z" level=info msg="TearDown network for sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\" successfully" Mar 17 17:56:13.980022 containerd[1540]: time="2025-03-17T17:56:13.979973404Z" level=info msg="StopPodSandbox for \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\" returns successfully" Mar 17 17:56:13.982332 containerd[1540]: time="2025-03-17T17:56:13.980265612Z" level=info msg="StopPodSandbox for \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\"" Mar 17 17:56:13.982462 containerd[1540]: time="2025-03-17T17:56:13.982428675Z" level=info msg="TearDown network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\" successfully" Mar 17 17:56:13.982462 containerd[1540]: time="2025-03-17T17:56:13.982440231Z" level=info msg="StopPodSandbox for \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\" returns successfully" Mar 17 17:56:13.982762 containerd[1540]: time="2025-03-17T17:56:13.982706170Z" level=info msg="StopPodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\"" Mar 17 17:56:13.982977 containerd[1540]: time="2025-03-17T17:56:13.982932950Z" level=info msg="TearDown network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" successfully" Mar 17 17:56:13.982977 containerd[1540]: time="2025-03-17T17:56:13.982942236Z" level=info msg="StopPodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" returns successfully" Mar 17 17:56:13.983132 containerd[1540]: time="2025-03-17T17:56:13.983123152Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\"" Mar 17 17:56:13.983242 containerd[1540]: time="2025-03-17T17:56:13.983205031Z" level=info msg="TearDown network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" successfully" Mar 
17 17:56:13.983242 containerd[1540]: time="2025-03-17T17:56:13.983212701Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" returns successfully" Mar 17 17:56:13.983441 containerd[1540]: time="2025-03-17T17:56:13.983408064Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\"" Mar 17 17:56:13.983547 containerd[1540]: time="2025-03-17T17:56:13.983518398Z" level=info msg="TearDown network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" successfully" Mar 17 17:56:13.983547 containerd[1540]: time="2025-03-17T17:56:13.983527134Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" returns successfully" Mar 17 17:56:14.004473 containerd[1540]: time="2025-03-17T17:56:14.003911568Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\"" Mar 17 17:56:14.004473 containerd[1540]: time="2025-03-17T17:56:14.003979381Z" level=info msg="TearDown network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" successfully" Mar 17 17:56:14.004473 containerd[1540]: time="2025-03-17T17:56:14.003987444Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" returns successfully" Mar 17 17:56:14.004473 containerd[1540]: time="2025-03-17T17:56:14.004399950Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\"" Mar 17 17:56:14.006525 containerd[1540]: time="2025-03-17T17:56:14.006417953Z" level=info msg="TearDown network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully" Mar 17 17:56:14.006525 containerd[1540]: time="2025-03-17T17:56:14.006431483Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully" Mar 17 
17:56:14.015219 containerd[1540]: time="2025-03-17T17:56:14.015181060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:7,}" Mar 17 17:56:14.097185 systemd-networkd[1435]: cali7f6e63fe175: Link UP Mar 17 17:56:14.097306 systemd-networkd[1435]: cali7f6e63fe175: Gained carrier Mar 17 17:56:14.097410 systemd-networkd[1435]: cali35341b5d9d9: Link UP Mar 17 17:56:14.097511 systemd-networkd[1435]: cali35341b5d9d9: Gained carrier Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:13.105 [INFO][4820] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:13.205 [INFO][4820] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0 calico-kube-controllers-7c4bd45cb8- calico-system 4761939f-21ce-4484-88c7-08bcb4f65c5c 731 0 2025-03-17 17:55:54 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7c4bd45cb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7c4bd45cb8-lpjjh eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali7f6e63fe175 [] []}} ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Namespace="calico-system" Pod="calico-kube-controllers-7c4bd45cb8-lpjjh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:13.205 [INFO][4820] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Namespace="calico-system" 
Pod="calico-kube-controllers-7c4bd45cb8-lpjjh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:13.934 [INFO][4863] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" HandleID="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Workload="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.005 [INFO][4863] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" HandleID="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Workload="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000259e10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7c4bd45cb8-lpjjh", "timestamp":"2025-03-17 17:56:13.934413116 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.005 [INFO][4863] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.005 [INFO][4863] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.005 [INFO][4863] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.009 [INFO][4863] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.022 [INFO][4863] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.032 [INFO][4863] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.036 [INFO][4863] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.038 [INFO][4863] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.038 [INFO][4863] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.041 [INFO][4863] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268 Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.046 [INFO][4863] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.052 [INFO][4863] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.052 [INFO][4863] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" host="localhost" Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.052 [INFO][4863] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:56:14.125483 containerd[1540]: 2025-03-17 17:56:14.052 [INFO][4863] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" HandleID="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Workload="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:14.126098 containerd[1540]: 2025-03-17 17:56:14.059 [INFO][4820] cni-plugin/k8s.go 386: Populated endpoint ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Namespace="calico-system" Pod="calico-kube-controllers-7c4bd45cb8-lpjjh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0", GenerateName:"calico-kube-controllers-7c4bd45cb8-", Namespace:"calico-system", SelfLink:"", UID:"4761939f-21ce-4484-88c7-08bcb4f65c5c", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c4bd45cb8", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7c4bd45cb8-lpjjh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7f6e63fe175", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.126098 containerd[1540]: 2025-03-17 17:56:14.059 [INFO][4820] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Namespace="calico-system" Pod="calico-kube-controllers-7c4bd45cb8-lpjjh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:14.126098 containerd[1540]: 2025-03-17 17:56:14.059 [INFO][4820] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7f6e63fe175 ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Namespace="calico-system" Pod="calico-kube-controllers-7c4bd45cb8-lpjjh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:14.126098 containerd[1540]: 2025-03-17 17:56:14.098 [INFO][4820] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Namespace="calico-system" Pod="calico-kube-controllers-7c4bd45cb8-lpjjh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:14.126098 containerd[1540]: 2025-03-17 17:56:14.099 [INFO][4820] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Namespace="calico-system" Pod="calico-kube-controllers-7c4bd45cb8-lpjjh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0", GenerateName:"calico-kube-controllers-7c4bd45cb8-", Namespace:"calico-system", SelfLink:"", UID:"4761939f-21ce-4484-88c7-08bcb4f65c5c", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c4bd45cb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268", Pod:"calico-kube-controllers-7c4bd45cb8-lpjjh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7f6e63fe175", MAC:"c6:54:73:82:3e:de", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.126098 containerd[1540]: 2025-03-17 17:56:14.124 [INFO][4820] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Namespace="calico-system" Pod="calico-kube-controllers-7c4bd45cb8-lpjjh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:13.098 [INFO][4800] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:13.206 [INFO][4800] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0 calico-apiserver-8466cfddd6- calico-apiserver e62b9bfa-d049-4110-b093-e476a26ef5be 732 0 2025-03-17 17:55:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8466cfddd6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8466cfddd6-28hkf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali35341b5d9d9 [] []}} ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-28hkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:13.206 [INFO][4800] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-28hkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:13.934 [INFO][4864] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" 
HandleID="k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Workload="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.006 [INFO][4864] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" HandleID="k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Workload="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000356840), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8466cfddd6-28hkf", "timestamp":"2025-03-17 17:56:13.934498091 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.006 [INFO][4864] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.053 [INFO][4864] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.053 [INFO][4864] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.055 [INFO][4864] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.059 [INFO][4864] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.062 [INFO][4864] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.064 [INFO][4864] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.067 [INFO][4864] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.067 [INFO][4864] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.068 [INFO][4864] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86 Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.073 [INFO][4864] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.082 [INFO][4864] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.082 [INFO][4864] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" host="localhost" Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.082 [INFO][4864] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:56:14.134866 containerd[1540]: 2025-03-17 17:56:14.082 [INFO][4864] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" HandleID="k8s-pod-network.f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Workload="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" Mar 17 17:56:14.135939 containerd[1540]: 2025-03-17 17:56:14.089 [INFO][4800] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-28hkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0", GenerateName:"calico-apiserver-8466cfddd6-", Namespace:"calico-apiserver", SelfLink:"", UID:"e62b9bfa-d049-4110-b093-e476a26ef5be", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8466cfddd6", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8466cfddd6-28hkf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali35341b5d9d9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.135939 containerd[1540]: 2025-03-17 17:56:14.089 [INFO][4800] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-28hkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" Mar 17 17:56:14.135939 containerd[1540]: 2025-03-17 17:56:14.089 [INFO][4800] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali35341b5d9d9 ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-28hkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" Mar 17 17:56:14.135939 containerd[1540]: 2025-03-17 17:56:14.099 [INFO][4800] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-28hkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" Mar 17 17:56:14.135939 containerd[1540]: 2025-03-17 17:56:14.099 [INFO][4800] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-28hkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0", GenerateName:"calico-apiserver-8466cfddd6-", Namespace:"calico-apiserver", SelfLink:"", UID:"e62b9bfa-d049-4110-b093-e476a26ef5be", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8466cfddd6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86", Pod:"calico-apiserver-8466cfddd6-28hkf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali35341b5d9d9", MAC:"9e:2c:ed:44:f4:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.135939 containerd[1540]: 2025-03-17 17:56:14.130 [INFO][4800] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-28hkf" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--28hkf-eth0" Mar 17 17:56:14.170615 containerd[1540]: time="2025-03-17T17:56:14.169728946Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:56:14.170615 containerd[1540]: time="2025-03-17T17:56:14.170467381Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:56:14.176790 containerd[1540]: time="2025-03-17T17:56:14.174466730Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.176790 containerd[1540]: time="2025-03-17T17:56:14.174578377Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.188241 systemd-networkd[1435]: calid58474e6d23: Link UP Mar 17 17:56:14.188653 systemd-networkd[1435]: calid58474e6d23: Gained carrier Mar 17 17:56:14.193104 containerd[1540]: time="2025-03-17T17:56:14.193043869Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:56:14.194530 containerd[1540]: time="2025-03-17T17:56:14.193203710Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:56:14.194739 containerd[1540]: time="2025-03-17T17:56:14.194535477Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.194782 containerd[1540]: time="2025-03-17T17:56:14.194740167Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.212695 systemd[1]: Started cri-containerd-f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86.scope - libcontainer container f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86. Mar 17 17:56:14.225697 systemd[1]: Started cri-containerd-550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268.scope - libcontainer container 550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268. Mar 17 17:56:14.253813 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:13.114 [INFO][4831] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:13.207 [INFO][4831] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0 coredns-7db6d8ff4d- kube-system f3d38051-dd31-4085-95b4-5054901044b2 729 0 2025-03-17 17:55:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-4ntmw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid58474e6d23 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Namespace="kube-system" Pod="coredns-7db6d8ff4d-4ntmw" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--4ntmw-" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:13.207 [INFO][4831] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Namespace="kube-system" Pod="coredns-7db6d8ff4d-4ntmw" 
WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:13.938 [INFO][4860] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" HandleID="k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Workload="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.006 [INFO][4860] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" HandleID="k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Workload="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003740b0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-4ntmw", "timestamp":"2025-03-17 17:56:13.938213056 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.006 [INFO][4860] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.083 [INFO][4860] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.083 [INFO][4860] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.087 [INFO][4860] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.100 [INFO][4860] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.103 [INFO][4860] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.114 [INFO][4860] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.131 [INFO][4860] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.131 [INFO][4860] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.133 [INFO][4860] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.145 [INFO][4860] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.165 [INFO][4860] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.168 [INFO][4860] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" host="localhost" Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.168 [INFO][4860] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:56:14.286177 containerd[1540]: 2025-03-17 17:56:14.169 [INFO][4860] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" HandleID="k8s-pod-network.061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Workload="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" Mar 17 17:56:14.286987 containerd[1540]: 2025-03-17 17:56:14.180 [INFO][4831] cni-plugin/k8s.go 386: Populated endpoint ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Namespace="kube-system" Pod="coredns-7db6d8ff4d-4ntmw" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f3d38051-dd31-4085-95b4-5054901044b2", ResourceVersion:"729", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-4ntmw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid58474e6d23", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.286987 containerd[1540]: 2025-03-17 17:56:14.180 [INFO][4831] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Namespace="kube-system" Pod="coredns-7db6d8ff4d-4ntmw" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" Mar 17 17:56:14.286987 containerd[1540]: 2025-03-17 17:56:14.180 [INFO][4831] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid58474e6d23 ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Namespace="kube-system" Pod="coredns-7db6d8ff4d-4ntmw" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" Mar 17 17:56:14.286987 containerd[1540]: 2025-03-17 17:56:14.191 [INFO][4831] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Namespace="kube-system" Pod="coredns-7db6d8ff4d-4ntmw" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" Mar 17 
17:56:14.286987 containerd[1540]: 2025-03-17 17:56:14.194 [INFO][4831] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Namespace="kube-system" Pod="coredns-7db6d8ff4d-4ntmw" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f3d38051-dd31-4085-95b4-5054901044b2", ResourceVersion:"729", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd", Pod:"coredns-7db6d8ff4d-4ntmw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid58474e6d23", MAC:"5a:e4:a2:ef:bc:87", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.286987 containerd[1540]: 2025-03-17 17:56:14.259 [INFO][4831] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd" Namespace="kube-system" Pod="coredns-7db6d8ff4d-4ntmw" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--4ntmw-eth0" Mar 17 17:56:14.293635 containerd[1540]: time="2025-03-17T17:56:14.293605424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-28hkf,Uid:e62b9bfa-d049-4110-b093-e476a26ef5be,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86\"" Mar 17 17:56:14.301586 containerd[1540]: time="2025-03-17T17:56:14.300913345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:56:14.304837 containerd[1540]: time="2025-03-17T17:56:14.300935874Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:56:14.304837 containerd[1540]: time="2025-03-17T17:56:14.300980312Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:56:14.304837 containerd[1540]: time="2025-03-17T17:56:14.300991061Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.304837 containerd[1540]: time="2025-03-17T17:56:14.301047334Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.315306 systemd-networkd[1435]: cali0cf7bfdcde6: Link UP Mar 17 17:56:14.315503 systemd-networkd[1435]: cali0cf7bfdcde6: Gained carrier Mar 17 17:56:14.327779 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:13.105 [INFO][4810] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:13.206 [INFO][4810] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0 calico-apiserver-8466cfddd6- calico-apiserver 411709d2-a807-4f34-9412-d952d186c81f 730 0 2025-03-17 17:55:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8466cfddd6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8466cfddd6-dpzgd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0cf7bfdcde6 [] []}} ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-dpzgd" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:13.206 [INFO][4810] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-dpzgd" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:13.934 [INFO][4868] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 
IPv6=0 ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" HandleID="k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Workload="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.006 [INFO][4868] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" HandleID="k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Workload="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004007c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8466cfddd6-dpzgd", "timestamp":"2025-03-17 17:56:13.934555359 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.006 [INFO][4868] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.168 [INFO][4868] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.168 [INFO][4868] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.181 [INFO][4868] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.217 [INFO][4868] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.255 [INFO][4868] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.258 [INFO][4868] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.262 [INFO][4868] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.262 [INFO][4868] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.263 [INFO][4868] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.271 [INFO][4868] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.300 [INFO][4868] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.300 [INFO][4868] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" host="localhost" Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.300 [INFO][4868] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:56:14.347950 containerd[1540]: 2025-03-17 17:56:14.301 [INFO][4868] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" HandleID="k8s-pod-network.9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Workload="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" Mar 17 17:56:14.348435 containerd[1540]: 2025-03-17 17:56:14.304 [INFO][4810] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-dpzgd" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0", GenerateName:"calico-apiserver-8466cfddd6-", Namespace:"calico-apiserver", SelfLink:"", UID:"411709d2-a807-4f34-9412-d952d186c81f", ResourceVersion:"730", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8466cfddd6", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8466cfddd6-dpzgd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0cf7bfdcde6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.348435 containerd[1540]: 2025-03-17 17:56:14.309 [INFO][4810] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-dpzgd" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" Mar 17 17:56:14.348435 containerd[1540]: 2025-03-17 17:56:14.309 [INFO][4810] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0cf7bfdcde6 ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-dpzgd" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" Mar 17 17:56:14.348435 containerd[1540]: 2025-03-17 17:56:14.314 [INFO][4810] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-dpzgd" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" Mar 17 17:56:14.348435 containerd[1540]: 2025-03-17 17:56:14.315 [INFO][4810] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-dpzgd" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0", GenerateName:"calico-apiserver-8466cfddd6-", Namespace:"calico-apiserver", SelfLink:"", UID:"411709d2-a807-4f34-9412-d952d186c81f", ResourceVersion:"730", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8466cfddd6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce", Pod:"calico-apiserver-8466cfddd6-dpzgd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0cf7bfdcde6", MAC:"ca:76:7e:63:72:e6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.348435 containerd[1540]: 2025-03-17 17:56:14.342 [INFO][4810] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce" Namespace="calico-apiserver" Pod="calico-apiserver-8466cfddd6-dpzgd" WorkloadEndpoint="localhost-k8s-calico--apiserver--8466cfddd6--dpzgd-eth0" Mar 17 17:56:14.349004 systemd[1]: Started cri-containerd-061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd.scope - libcontainer container 061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd. Mar 17 17:56:14.377108 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:56:14.378722 containerd[1540]: time="2025-03-17T17:56:14.378150298Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:56:14.378722 containerd[1540]: time="2025-03-17T17:56:14.378198871Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:56:14.378722 containerd[1540]: time="2025-03-17T17:56:14.378214546Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.378722 containerd[1540]: time="2025-03-17T17:56:14.378280356Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.388860 systemd-networkd[1435]: cali9002a0e79a5: Link UP Mar 17 17:56:14.389304 systemd-networkd[1435]: cali9002a0e79a5: Gained carrier Mar 17 17:56:14.414896 systemd[1]: Started cri-containerd-9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce.scope - libcontainer container 9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce. 
Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:13.093 [INFO][4791] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:13.207 [INFO][4791] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--2cntr-eth0 csi-node-driver- calico-system 0d288c4c-94be-4e72-9025-53893bb68385 601 0 2025-03-17 17:55:54 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-2cntr eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali9002a0e79a5 [] []}} ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Namespace="calico-system" Pod="csi-node-driver-2cntr" WorkloadEndpoint="localhost-k8s-csi--node--driver--2cntr-" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:13.207 [INFO][4791] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Namespace="calico-system" Pod="csi-node-driver-2cntr" WorkloadEndpoint="localhost-k8s-csi--node--driver--2cntr-eth0" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:13.937 [INFO][4866] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" HandleID="k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Workload="localhost-k8s-csi--node--driver--2cntr-eth0" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.007 [INFO][4866] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" HandleID="k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Workload="localhost-k8s-csi--node--driver--2cntr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050510), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-2cntr", "timestamp":"2025-03-17 17:56:13.9378567 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.007 [INFO][4866] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.301 [INFO][4866] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.301 [INFO][4866] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.322 [INFO][4866] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.338 [INFO][4866] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.347 [INFO][4866] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.350 [INFO][4866] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.352 [INFO][4866] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.352 [INFO][4866] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.355 [INFO][4866] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16 Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.361 [INFO][4866] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.373 [INFO][4866] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.373 [INFO][4866] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" host="localhost" Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.373 [INFO][4866] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:56:14.421698 containerd[1540]: 2025-03-17 17:56:14.373 [INFO][4866] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" HandleID="k8s-pod-network.cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Workload="localhost-k8s-csi--node--driver--2cntr-eth0" Mar 17 17:56:14.426329 containerd[1540]: 2025-03-17 17:56:14.381 [INFO][4791] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Namespace="calico-system" Pod="csi-node-driver-2cntr" WorkloadEndpoint="localhost-k8s-csi--node--driver--2cntr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--2cntr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0d288c4c-94be-4e72-9025-53893bb68385", ResourceVersion:"601", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-2cntr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9002a0e79a5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.426329 containerd[1540]: 2025-03-17 17:56:14.382 [INFO][4791] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Namespace="calico-system" Pod="csi-node-driver-2cntr" WorkloadEndpoint="localhost-k8s-csi--node--driver--2cntr-eth0" Mar 17 17:56:14.426329 containerd[1540]: 2025-03-17 17:56:14.382 [INFO][4791] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9002a0e79a5 ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Namespace="calico-system" Pod="csi-node-driver-2cntr" WorkloadEndpoint="localhost-k8s-csi--node--driver--2cntr-eth0" Mar 17 17:56:14.426329 containerd[1540]: 2025-03-17 17:56:14.393 [INFO][4791] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Namespace="calico-system" Pod="csi-node-driver-2cntr" WorkloadEndpoint="localhost-k8s-csi--node--driver--2cntr-eth0" Mar 17 17:56:14.426329 containerd[1540]: 2025-03-17 17:56:14.394 [INFO][4791] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Namespace="calico-system" Pod="csi-node-driver-2cntr" WorkloadEndpoint="localhost-k8s-csi--node--driver--2cntr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--2cntr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0d288c4c-94be-4e72-9025-53893bb68385", ResourceVersion:"601", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 54, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16", Pod:"csi-node-driver-2cntr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9002a0e79a5", MAC:"ca:47:a9:26:6b:d7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.426329 containerd[1540]: 2025-03-17 17:56:14.418 [INFO][4791] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16" Namespace="calico-system" Pod="csi-node-driver-2cntr" WorkloadEndpoint="localhost-k8s-csi--node--driver--2cntr-eth0" Mar 17 17:56:14.458250 containerd[1540]: time="2025-03-17T17:56:14.450305735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-4ntmw,Uid:f3d38051-dd31-4085-95b4-5054901044b2,Namespace:kube-system,Attempt:6,} returns sandbox id \"061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd\"" Mar 17 17:56:14.458250 containerd[1540]: time="2025-03-17T17:56:14.457775589Z" level=info msg="CreateContainer within sandbox \"061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd\" for container 
&ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:56:14.461119 systemd-networkd[1435]: cali809ba4b7325: Link UP Mar 17 17:56:14.462721 systemd-networkd[1435]: cali809ba4b7325: Gained carrier Mar 17 17:56:14.484503 containerd[1540]: time="2025-03-17T17:56:14.482267954Z" level=info msg="CreateContainer within sandbox \"061ada96e23ce67c80ac9fc6ce7b522419dc8ea9e7d5233f358906dc8f7e96dd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"935a9a3380b05f9e67156ed8d0073d69d1211229b71d33fbe9bee712bd18551c\"" Mar 17 17:56:14.484503 containerd[1540]: time="2025-03-17T17:56:14.483192391Z" level=info msg="StartContainer for \"935a9a3380b05f9e67156ed8d0073d69d1211229b71d33fbe9bee712bd18551c\"" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.062 [INFO][4914] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.071 [INFO][4914] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0 coredns-7db6d8ff4d- kube-system 46b937aa-d1db-4705-9bec-d4bd7aeaeceb 726 0 2025-03-17 17:55:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-v59bf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali809ba4b7325 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Namespace="kube-system" Pod="coredns-7db6d8ff4d-v59bf" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--v59bf-" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.072 [INFO][4914] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Namespace="kube-system" 
Pod="coredns-7db6d8ff4d-v59bf" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.103 [INFO][4944] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" HandleID="k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Workload="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.129 [INFO][4944] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" HandleID="k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Workload="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000120380), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-v59bf", "timestamp":"2025-03-17 17:56:14.103800735 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.130 [INFO][4944] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.375 [INFO][4944] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.375 [INFO][4944] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.379 [INFO][4944] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.392 [INFO][4944] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.402 [INFO][4944] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.405 [INFO][4944] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.416 [INFO][4944] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.416 [INFO][4944] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.419 [INFO][4944] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557 Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.424 [INFO][4944] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.440 [INFO][4944] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.440 [INFO][4944] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" host="localhost" Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.440 [INFO][4944] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:56:14.503060 containerd[1540]: 2025-03-17 17:56:14.440 [INFO][4944] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" HandleID="k8s-pod-network.56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Workload="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" Mar 17 17:56:14.503634 containerd[1540]: 2025-03-17 17:56:14.449 [INFO][4914] cni-plugin/k8s.go 386: Populated endpoint ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Namespace="kube-system" Pod="coredns-7db6d8ff4d-v59bf" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"46b937aa-d1db-4705-9bec-d4bd7aeaeceb", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-v59bf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali809ba4b7325", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.503634 containerd[1540]: 2025-03-17 17:56:14.454 [INFO][4914] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Namespace="kube-system" Pod="coredns-7db6d8ff4d-v59bf" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" Mar 17 17:56:14.503634 containerd[1540]: 2025-03-17 17:56:14.454 [INFO][4914] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali809ba4b7325 ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Namespace="kube-system" Pod="coredns-7db6d8ff4d-v59bf" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" Mar 17 17:56:14.503634 containerd[1540]: 2025-03-17 17:56:14.462 [INFO][4914] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Namespace="kube-system" Pod="coredns-7db6d8ff4d-v59bf" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" Mar 17 
17:56:14.503634 containerd[1540]: 2025-03-17 17:56:14.468 [INFO][4914] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Namespace="kube-system" Pod="coredns-7db6d8ff4d-v59bf" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"46b937aa-d1db-4705-9bec-d4bd7aeaeceb", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 55, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557", Pod:"coredns-7db6d8ff4d-v59bf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali809ba4b7325", MAC:"c2:c2:da:7a:61:56", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:56:14.503634 containerd[1540]: 2025-03-17 17:56:14.481 [INFO][4914] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557" Namespace="kube-system" Pod="coredns-7db6d8ff4d-v59bf" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--v59bf-eth0" Mar 17 17:56:14.506661 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:56:14.507007 containerd[1540]: time="2025-03-17T17:56:14.506262212Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:56:14.507007 containerd[1540]: time="2025-03-17T17:56:14.506938090Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:56:14.507007 containerd[1540]: time="2025-03-17T17:56:14.506951815Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.507481 containerd[1540]: time="2025-03-17T17:56:14.507274938Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.546678 systemd[1]: Started cri-containerd-935a9a3380b05f9e67156ed8d0073d69d1211229b71d33fbe9bee712bd18551c.scope - libcontainer container 935a9a3380b05f9e67156ed8d0073d69d1211229b71d33fbe9bee712bd18551c. Mar 17 17:56:14.547675 systemd[1]: Started cri-containerd-cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16.scope - libcontainer container cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16. 
Mar 17 17:56:14.563022 containerd[1540]: time="2025-03-17T17:56:14.562837163Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:56:14.563105 containerd[1540]: time="2025-03-17T17:56:14.563035902Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:56:14.563348 containerd[1540]: time="2025-03-17T17:56:14.563087462Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.565118 containerd[1540]: time="2025-03-17T17:56:14.564501297Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:14.572924 containerd[1540]: time="2025-03-17T17:56:14.572874502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c4bd45cb8-lpjjh,Uid:4761939f-21ce-4484-88c7-08bcb4f65c5c,Namespace:calico-system,Attempt:6,} returns sandbox id \"550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268\"" Mar 17 17:56:14.608742 systemd[1]: Started cri-containerd-56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557.scope - libcontainer container 56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557. 
Mar 17 17:56:14.631014 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:56:14.633379 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:56:14.704418 containerd[1540]: time="2025-03-17T17:56:14.704258455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-v59bf,Uid:46b937aa-d1db-4705-9bec-d4bd7aeaeceb,Namespace:kube-system,Attempt:7,} returns sandbox id \"56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557\"" Mar 17 17:56:14.704418 containerd[1540]: time="2025-03-17T17:56:14.704302438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8466cfddd6-dpzgd,Uid:411709d2-a807-4f34-9412-d952d186c81f,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce\"" Mar 17 17:56:14.704418 containerd[1540]: time="2025-03-17T17:56:14.704331831Z" level=info msg="StartContainer for \"935a9a3380b05f9e67156ed8d0073d69d1211229b71d33fbe9bee712bd18551c\" returns successfully" Mar 17 17:56:14.704418 containerd[1540]: time="2025-03-17T17:56:14.704377980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2cntr,Uid:0d288c4c-94be-4e72-9025-53893bb68385,Namespace:calico-system,Attempt:6,} returns sandbox id \"cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16\"" Mar 17 17:56:14.729925 containerd[1540]: time="2025-03-17T17:56:14.729898666Z" level=info msg="CreateContainer within sandbox \"56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:56:14.785859 containerd[1540]: time="2025-03-17T17:56:14.784905691Z" level=info msg="CreateContainer within sandbox \"56606dec9a90e993b616129b765bade168389b5a8cbca8c0397d760fe5298557\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns 
container id \"dbf49a32c7d3bc6d36d9db9186b1e7ee3bd550a0724812b64d9f0be7cb641a5c\"" Mar 17 17:56:14.787735 containerd[1540]: time="2025-03-17T17:56:14.787702923Z" level=info msg="StartContainer for \"dbf49a32c7d3bc6d36d9db9186b1e7ee3bd550a0724812b64d9f0be7cb641a5c\"" Mar 17 17:56:14.812737 systemd[1]: Started cri-containerd-dbf49a32c7d3bc6d36d9db9186b1e7ee3bd550a0724812b64d9f0be7cb641a5c.scope - libcontainer container dbf49a32c7d3bc6d36d9db9186b1e7ee3bd550a0724812b64d9f0be7cb641a5c. Mar 17 17:56:14.841968 containerd[1540]: time="2025-03-17T17:56:14.841939748Z" level=info msg="StartContainer for \"dbf49a32c7d3bc6d36d9db9186b1e7ee3bd550a0724812b64d9f0be7cb641a5c\" returns successfully" Mar 17 17:56:14.961630 kernel: bpftool[5447]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 17 17:56:15.012946 kubelet[2850]: I0317 17:56:15.012484 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-v59bf" podStartSLOduration=28.012425034 podStartE2EDuration="28.012425034s" podCreationTimestamp="2025-03-17 17:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:56:15.011601742 +0000 UTC m=+43.852741010" watchObservedRunningTime="2025-03-17 17:56:15.012425034 +0000 UTC m=+43.853564297" Mar 17 17:56:15.012946 kubelet[2850]: I0317 17:56:15.012637 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-4ntmw" podStartSLOduration=28.012632674 podStartE2EDuration="28.012632674s" podCreationTimestamp="2025-03-17 17:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:56:15.003444669 +0000 UTC m=+43.844583947" watchObservedRunningTime="2025-03-17 17:56:15.012632674 +0000 UTC m=+43.853771930" Mar 17 17:56:15.219389 systemd-networkd[1435]: vxlan.calico: Link UP Mar 
17 17:56:15.219402 systemd-networkd[1435]: vxlan.calico: Gained carrier Mar 17 17:56:15.857717 systemd-networkd[1435]: cali7f6e63fe175: Gained IPv6LL Mar 17 17:56:16.050041 systemd-networkd[1435]: cali809ba4b7325: Gained IPv6LL Mar 17 17:56:16.050270 systemd-networkd[1435]: cali35341b5d9d9: Gained IPv6LL Mar 17 17:56:16.051026 systemd-networkd[1435]: cali9002a0e79a5: Gained IPv6LL Mar 17 17:56:16.113648 systemd-networkd[1435]: cali0cf7bfdcde6: Gained IPv6LL Mar 17 17:56:16.114648 systemd-networkd[1435]: calid58474e6d23: Gained IPv6LL Mar 17 17:56:17.073817 systemd-networkd[1435]: vxlan.calico: Gained IPv6LL Mar 17 17:56:17.675261 containerd[1540]: time="2025-03-17T17:56:17.675218297Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:17.676297 containerd[1540]: time="2025-03-17T17:56:17.675806065Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 17 17:56:17.676297 containerd[1540]: time="2025-03-17T17:56:17.676105166Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:17.680069 containerd[1540]: time="2025-03-17T17:56:17.680048105Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:17.680724 containerd[1540]: time="2025-03-17T17:56:17.680702541Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", 
size \"44486324\" in 3.379765811s" Mar 17 17:56:17.680767 containerd[1540]: time="2025-03-17T17:56:17.680729253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 17 17:56:17.682590 containerd[1540]: time="2025-03-17T17:56:17.682362494Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 17 17:56:17.683318 containerd[1540]: time="2025-03-17T17:56:17.683290578Z" level=info msg="CreateContainer within sandbox \"f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 17:56:17.691639 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount194896975.mount: Deactivated successfully. Mar 17 17:56:17.702821 containerd[1540]: time="2025-03-17T17:56:17.702788396Z" level=info msg="CreateContainer within sandbox \"f3af32755809919736e4c581a512b2e74f7ca94d8d6a153f0eed015a45e85d86\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"06ff2512f6d95f7d750aeb0fa0977c909fc27103b9199ca84333089d251872bb\"" Mar 17 17:56:17.703615 containerd[1540]: time="2025-03-17T17:56:17.703483192Z" level=info msg="StartContainer for \"06ff2512f6d95f7d750aeb0fa0977c909fc27103b9199ca84333089d251872bb\"" Mar 17 17:56:17.730794 systemd[1]: Started cri-containerd-06ff2512f6d95f7d750aeb0fa0977c909fc27103b9199ca84333089d251872bb.scope - libcontainer container 06ff2512f6d95f7d750aeb0fa0977c909fc27103b9199ca84333089d251872bb. 
Mar 17 17:56:17.779345 containerd[1540]: time="2025-03-17T17:56:17.779183225Z" level=info msg="StartContainer for \"06ff2512f6d95f7d750aeb0fa0977c909fc27103b9199ca84333089d251872bb\" returns successfully" Mar 17 17:56:19.059508 kubelet[2850]: I0317 17:56:19.058945 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8466cfddd6-28hkf" podStartSLOduration=21.676911293 podStartE2EDuration="25.058924753s" podCreationTimestamp="2025-03-17 17:55:54 +0000 UTC" firstStartedPulling="2025-03-17 17:56:14.299679025 +0000 UTC m=+43.140818289" lastFinishedPulling="2025-03-17 17:56:17.681692489 +0000 UTC m=+46.522831749" observedRunningTime="2025-03-17 17:56:18.0481291 +0000 UTC m=+46.889268368" watchObservedRunningTime="2025-03-17 17:56:19.058924753 +0000 UTC m=+47.900064022" Mar 17 17:56:20.616445 containerd[1540]: time="2025-03-17T17:56:20.616406055Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:20.623069 containerd[1540]: time="2025-03-17T17:56:20.622993184Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 17 17:56:20.635895 containerd[1540]: time="2025-03-17T17:56:20.635833203Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:20.638750 containerd[1540]: time="2025-03-17T17:56:20.638710751Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:20.642961 containerd[1540]: time="2025-03-17T17:56:20.639160393Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id 
\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 2.956775465s" Mar 17 17:56:20.642961 containerd[1540]: time="2025-03-17T17:56:20.639182931Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 17 17:56:20.642961 containerd[1540]: time="2025-03-17T17:56:20.640009080Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 17 17:56:20.682756 containerd[1540]: time="2025-03-17T17:56:20.682722127Z" level=info msg="CreateContainer within sandbox \"550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 17:56:20.692062 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3368150313.mount: Deactivated successfully. Mar 17 17:56:20.695973 containerd[1540]: time="2025-03-17T17:56:20.695947713Z" level=info msg="CreateContainer within sandbox \"550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\"" Mar 17 17:56:20.696412 containerd[1540]: time="2025-03-17T17:56:20.696395486Z" level=info msg="StartContainer for \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\"" Mar 17 17:56:20.746670 systemd[1]: Started cri-containerd-9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67.scope - libcontainer container 9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67. 
Mar 17 17:56:20.786873 containerd[1540]: time="2025-03-17T17:56:20.786846544Z" level=info msg="StartContainer for \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\" returns successfully" Mar 17 17:56:21.117661 kubelet[2850]: I0317 17:56:21.117616 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7c4bd45cb8-lpjjh" podStartSLOduration=21.057212568 podStartE2EDuration="27.117601629s" podCreationTimestamp="2025-03-17 17:55:54 +0000 UTC" firstStartedPulling="2025-03-17 17:56:14.579436481 +0000 UTC m=+43.420575742" lastFinishedPulling="2025-03-17 17:56:20.639825538 +0000 UTC m=+49.480964803" observedRunningTime="2025-03-17 17:56:21.116304905 +0000 UTC m=+49.957444174" watchObservedRunningTime="2025-03-17 17:56:21.117601629 +0000 UTC m=+49.958740898" Mar 17 17:56:22.433167 containerd[1540]: time="2025-03-17T17:56:22.433135955Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:22.434243 containerd[1540]: time="2025-03-17T17:56:22.433880886Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 17 17:56:22.434243 containerd[1540]: time="2025-03-17T17:56:22.434136810Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:22.454331 containerd[1540]: time="2025-03-17T17:56:22.453948799Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:22.455098 containerd[1540]: time="2025-03-17T17:56:22.454814310Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id 
\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 1.81478464s" Mar 17 17:56:22.455098 containerd[1540]: time="2025-03-17T17:56:22.454829318Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 17 17:56:22.456784 containerd[1540]: time="2025-03-17T17:56:22.456258749Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:56:22.457790 containerd[1540]: time="2025-03-17T17:56:22.457094142Z" level=info msg="CreateContainer within sandbox \"cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 17:56:22.496246 containerd[1540]: time="2025-03-17T17:56:22.496220768Z" level=info msg="CreateContainer within sandbox \"cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e468730cfffe1562328f1e358f81cffa53b6ed875777cfdc4ec9a06833f7d433\"" Mar 17 17:56:22.496720 containerd[1540]: time="2025-03-17T17:56:22.496656051Z" level=info msg="StartContainer for \"e468730cfffe1562328f1e358f81cffa53b6ed875777cfdc4ec9a06833f7d433\"" Mar 17 17:56:22.521656 systemd[1]: Started cri-containerd-e468730cfffe1562328f1e358f81cffa53b6ed875777cfdc4ec9a06833f7d433.scope - libcontainer container e468730cfffe1562328f1e358f81cffa53b6ed875777cfdc4ec9a06833f7d433. 
Mar 17 17:56:22.548680 containerd[1540]: time="2025-03-17T17:56:22.548656629Z" level=info msg="StartContainer for \"e468730cfffe1562328f1e358f81cffa53b6ed875777cfdc4ec9a06833f7d433\" returns successfully" Mar 17 17:56:22.975531 containerd[1540]: time="2025-03-17T17:56:22.975502207Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:22.976276 containerd[1540]: time="2025-03-17T17:56:22.976253763Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 17 17:56:22.977225 containerd[1540]: time="2025-03-17T17:56:22.977209607Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 520.936186ms" Mar 17 17:56:22.977270 containerd[1540]: time="2025-03-17T17:56:22.977227126Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 17 17:56:22.977966 containerd[1540]: time="2025-03-17T17:56:22.977793631Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 17 17:56:22.981629 containerd[1540]: time="2025-03-17T17:56:22.981608896Z" level=info msg="CreateContainer within sandbox \"9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 17:56:22.989416 containerd[1540]: time="2025-03-17T17:56:22.989365406Z" level=info msg="CreateContainer within sandbox \"9e87147b2525364aae480badac2128365338a8e189e9c40c79ec6e36ef66a0ce\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"99f8dc8b705c894b28f46deb5e198d1d177e882317d8a7cf163e7fe0f15034a3\"" Mar 17 17:56:22.990511 containerd[1540]: time="2025-03-17T17:56:22.989869172Z" level=info msg="StartContainer for \"99f8dc8b705c894b28f46deb5e198d1d177e882317d8a7cf163e7fe0f15034a3\"" Mar 17 17:56:23.011530 systemd[1]: run-containerd-runc-k8s.io-99f8dc8b705c894b28f46deb5e198d1d177e882317d8a7cf163e7fe0f15034a3-runc.g4m5HB.mount: Deactivated successfully. Mar 17 17:56:23.016649 systemd[1]: Started cri-containerd-99f8dc8b705c894b28f46deb5e198d1d177e882317d8a7cf163e7fe0f15034a3.scope - libcontainer container 99f8dc8b705c894b28f46deb5e198d1d177e882317d8a7cf163e7fe0f15034a3. Mar 17 17:56:23.050345 containerd[1540]: time="2025-03-17T17:56:23.050287276Z" level=info msg="StartContainer for \"99f8dc8b705c894b28f46deb5e198d1d177e882317d8a7cf163e7fe0f15034a3\" returns successfully" Mar 17 17:56:23.069898 kubelet[2850]: I0317 17:56:23.069474 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8466cfddd6-dpzgd" podStartSLOduration=20.797448114 podStartE2EDuration="29.069464899s" podCreationTimestamp="2025-03-17 17:55:54 +0000 UTC" firstStartedPulling="2025-03-17 17:56:14.705634378 +0000 UTC m=+43.546773638" lastFinishedPulling="2025-03-17 17:56:22.977651163 +0000 UTC m=+51.818790423" observedRunningTime="2025-03-17 17:56:23.069222792 +0000 UTC m=+51.910362060" watchObservedRunningTime="2025-03-17 17:56:23.069464899 +0000 UTC m=+51.910604162" Mar 17 17:56:25.761058 containerd[1540]: time="2025-03-17T17:56:25.760934155Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:25.761853 containerd[1540]: time="2025-03-17T17:56:25.761832029Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843" 
Mar 17 17:56:25.762253 containerd[1540]: time="2025-03-17T17:56:25.762237384Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:25.763265 containerd[1540]: time="2025-03-17T17:56:25.763240203Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:56:25.763911 containerd[1540]: time="2025-03-17T17:56:25.763663942Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.785809399s" Mar 17 17:56:25.763911 containerd[1540]: time="2025-03-17T17:56:25.763681699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\"" Mar 17 17:56:25.766323 containerd[1540]: time="2025-03-17T17:56:25.766097034Z" level=info msg="CreateContainer within sandbox \"cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 17:56:25.773054 containerd[1540]: time="2025-03-17T17:56:25.773028608Z" level=info msg="CreateContainer within sandbox \"cac318a3d1725b9349070117348510646b49384db4bb4089dad1a4e988bd2d16\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9d6fa81ef8d37ae0131cafa49a05942e21645d7257a14f8dcf1c14e9942e694a\"" Mar 17 17:56:25.773684 containerd[1540]: 
time="2025-03-17T17:56:25.773627225Z" level=info msg="StartContainer for \"9d6fa81ef8d37ae0131cafa49a05942e21645d7257a14f8dcf1c14e9942e694a\"" Mar 17 17:56:25.828655 systemd[1]: Started cri-containerd-9d6fa81ef8d37ae0131cafa49a05942e21645d7257a14f8dcf1c14e9942e694a.scope - libcontainer container 9d6fa81ef8d37ae0131cafa49a05942e21645d7257a14f8dcf1c14e9942e694a. Mar 17 17:56:25.846341 containerd[1540]: time="2025-03-17T17:56:25.846316689Z" level=info msg="StartContainer for \"9d6fa81ef8d37ae0131cafa49a05942e21645d7257a14f8dcf1c14e9942e694a\" returns successfully" Mar 17 17:56:26.606590 kubelet[2850]: I0317 17:56:26.606420 2850 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 17:56:26.611310 kubelet[2850]: I0317 17:56:26.611290 2850 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 17:56:31.389421 containerd[1540]: time="2025-03-17T17:56:31.389159697Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 17:56:31.389421 containerd[1540]: time="2025-03-17T17:56:31.389368660Z" level=info msg="TearDown network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" successfully" Mar 17 17:56:31.389421 containerd[1540]: time="2025-03-17T17:56:31.389377853Z" level=info msg="StopPodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" returns successfully" Mar 17 17:56:31.438205 containerd[1540]: time="2025-03-17T17:56:31.438070493Z" level=info msg="RemovePodSandbox for \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 17:56:31.473472 containerd[1540]: time="2025-03-17T17:56:31.473319258Z" level=info msg="Forcibly stopping sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\"" Mar 17 
17:56:31.482267 containerd[1540]: time="2025-03-17T17:56:31.473405010Z" level=info msg="TearDown network for sandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" successfully" Mar 17 17:56:31.514325 containerd[1540]: time="2025-03-17T17:56:31.514290677Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:56:31.525756 containerd[1540]: time="2025-03-17T17:56:31.525722121Z" level=info msg="RemovePodSandbox \"3ead86072fd7f99be78a72f6e8faefd81c2d0d1096df3ac1568472f3dd0a439d\" returns successfully" Mar 17 17:56:31.526318 containerd[1540]: time="2025-03-17T17:56:31.526232925Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" Mar 17 17:56:31.526464 containerd[1540]: time="2025-03-17T17:56:31.526415409Z" level=info msg="TearDown network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" successfully" Mar 17 17:56:31.526464 containerd[1540]: time="2025-03-17T17:56:31.526426903Z" level=info msg="StopPodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" returns successfully" Mar 17 17:56:31.526777 containerd[1540]: time="2025-03-17T17:56:31.526729683Z" level=info msg="RemovePodSandbox for \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" Mar 17 17:56:31.526777 containerd[1540]: time="2025-03-17T17:56:31.526745459Z" level=info msg="Forcibly stopping sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\"" Mar 17 17:56:31.526835 containerd[1540]: time="2025-03-17T17:56:31.526781029Z" level=info msg="TearDown network for sandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" successfully" Mar 17 17:56:31.535365 containerd[1540]: time="2025-03-17T17:56:31.535338529Z" 
level=warning msg="Failed to get podSandbox status for container event for sandboxID \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:56:31.535421 containerd[1540]: time="2025-03-17T17:56:31.535374905Z" level=info msg="RemovePodSandbox \"df8d9426051df2fc91ca5e165ab9261ad19cb46caadfba5eb30b43722b86adb7\" returns successfully" Mar 17 17:56:31.535679 containerd[1540]: time="2025-03-17T17:56:31.535667137Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\"" Mar 17 17:56:31.535951 containerd[1540]: time="2025-03-17T17:56:31.535892117Z" level=info msg="TearDown network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" successfully" Mar 17 17:56:31.535951 containerd[1540]: time="2025-03-17T17:56:31.535902077Z" level=info msg="StopPodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" returns successfully" Mar 17 17:56:31.536133 containerd[1540]: time="2025-03-17T17:56:31.536120628Z" level=info msg="RemovePodSandbox for \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\"" Mar 17 17:56:31.548419 containerd[1540]: time="2025-03-17T17:56:31.536136131Z" level=info msg="Forcibly stopping sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\"" Mar 17 17:56:31.548419 containerd[1540]: time="2025-03-17T17:56:31.536174938Z" level=info msg="TearDown network for sandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" successfully" Mar 17 17:56:31.552156 containerd[1540]: time="2025-03-17T17:56:31.552133068Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.552203 containerd[1540]: time="2025-03-17T17:56:31.552175812Z" level=info msg="RemovePodSandbox \"7311c1258818cc1f088fa5a01c89e4977d4c6db6da429fe37ef57d31b3255442\" returns successfully" Mar 17 17:56:31.552511 containerd[1540]: time="2025-03-17T17:56:31.552495923Z" level=info msg="StopPodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\"" Mar 17 17:56:31.552578 containerd[1540]: time="2025-03-17T17:56:31.552553542Z" level=info msg="TearDown network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" successfully" Mar 17 17:56:31.552634 containerd[1540]: time="2025-03-17T17:56:31.552575593Z" level=info msg="StopPodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" returns successfully" Mar 17 17:56:31.557940 containerd[1540]: time="2025-03-17T17:56:31.552750708Z" level=info msg="RemovePodSandbox for \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\"" Mar 17 17:56:31.557940 containerd[1540]: time="2025-03-17T17:56:31.552763540Z" level=info msg="Forcibly stopping sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\"" Mar 17 17:56:31.557940 containerd[1540]: time="2025-03-17T17:56:31.557527356Z" level=info msg="TearDown network for sandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" successfully" Mar 17 17:56:31.569297 containerd[1540]: time="2025-03-17T17:56:31.569273262Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.569503 containerd[1540]: time="2025-03-17T17:56:31.569431402Z" level=info msg="RemovePodSandbox \"d3df737838310cdc0b7a69ffa69dcac8deaffdc31a8897b1f5928975b24fda57\" returns successfully" Mar 17 17:56:31.569752 containerd[1540]: time="2025-03-17T17:56:31.569736723Z" level=info msg="StopPodSandbox for \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\"" Mar 17 17:56:31.569817 containerd[1540]: time="2025-03-17T17:56:31.569804781Z" level=info msg="TearDown network for sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\" successfully" Mar 17 17:56:31.569817 containerd[1540]: time="2025-03-17T17:56:31.569815099Z" level=info msg="StopPodSandbox for \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\" returns successfully" Mar 17 17:56:31.577155 containerd[1540]: time="2025-03-17T17:56:31.569981798Z" level=info msg="RemovePodSandbox for \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\"" Mar 17 17:56:31.577155 containerd[1540]: time="2025-03-17T17:56:31.570003025Z" level=info msg="Forcibly stopping sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\"" Mar 17 17:56:31.577155 containerd[1540]: time="2025-03-17T17:56:31.570043022Z" level=info msg="TearDown network for sandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\" successfully" Mar 17 17:56:31.582802 containerd[1540]: time="2025-03-17T17:56:31.580367866Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.582802 containerd[1540]: time="2025-03-17T17:56:31.580399128Z" level=info msg="RemovePodSandbox \"06b8cf1f491889bf62d31317faf7ba4a9e25819586554f5b8026c827895b8daf\" returns successfully" Mar 17 17:56:31.582802 containerd[1540]: time="2025-03-17T17:56:31.580632424Z" level=info msg="StopPodSandbox for \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\"" Mar 17 17:56:31.582802 containerd[1540]: time="2025-03-17T17:56:31.580698524Z" level=info msg="TearDown network for sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\" successfully" Mar 17 17:56:31.582802 containerd[1540]: time="2025-03-17T17:56:31.580706576Z" level=info msg="StopPodSandbox for \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\" returns successfully" Mar 17 17:56:31.582802 containerd[1540]: time="2025-03-17T17:56:31.580845299Z" level=info msg="RemovePodSandbox for \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\"" Mar 17 17:56:31.582802 containerd[1540]: time="2025-03-17T17:56:31.580856943Z" level=info msg="Forcibly stopping sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\"" Mar 17 17:56:31.582802 containerd[1540]: time="2025-03-17T17:56:31.580892201Z" level=info msg="TearDown network for sandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\" successfully" Mar 17 17:56:31.594494 containerd[1540]: time="2025-03-17T17:56:31.594425664Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.594494 containerd[1540]: time="2025-03-17T17:56:31.594455144Z" level=info msg="RemovePodSandbox \"eb0e1b07713fe49ba18f50f56969757911f0a6809ba41eb1292dc8804ea9f2d2\" returns successfully" Mar 17 17:56:31.594699 containerd[1540]: time="2025-03-17T17:56:31.594683868Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:31.594769 containerd[1540]: time="2025-03-17T17:56:31.594736032Z" level=info msg="TearDown network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" successfully" Mar 17 17:56:31.594769 containerd[1540]: time="2025-03-17T17:56:31.594766740Z" level=info msg="StopPodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" returns successfully" Mar 17 17:56:31.599533 containerd[1540]: time="2025-03-17T17:56:31.594973846Z" level=info msg="RemovePodSandbox for \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:31.599533 containerd[1540]: time="2025-03-17T17:56:31.594985616Z" level=info msg="Forcibly stopping sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\"" Mar 17 17:56:31.599533 containerd[1540]: time="2025-03-17T17:56:31.595017792Z" level=info msg="TearDown network for sandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" successfully" Mar 17 17:56:31.607605 containerd[1540]: time="2025-03-17T17:56:31.607589146Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.607720 containerd[1540]: time="2025-03-17T17:56:31.607620911Z" level=info msg="RemovePodSandbox \"792943efa9abc48aa9966e8bf81ca4273691f4b3c2ec9caec66b6a6e3ad1e68b\" returns successfully" Mar 17 17:56:31.607822 containerd[1540]: time="2025-03-17T17:56:31.607767653Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" Mar 17 17:56:31.607858 containerd[1540]: time="2025-03-17T17:56:31.607845206Z" level=info msg="TearDown network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" successfully" Mar 17 17:56:31.607879 containerd[1540]: time="2025-03-17T17:56:31.607854548Z" level=info msg="StopPodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" returns successfully" Mar 17 17:56:31.608020 containerd[1540]: time="2025-03-17T17:56:31.608007818Z" level=info msg="RemovePodSandbox for \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" Mar 17 17:56:31.608088 containerd[1540]: time="2025-03-17T17:56:31.608075596Z" level=info msg="Forcibly stopping sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\"" Mar 17 17:56:31.608146 containerd[1540]: time="2025-03-17T17:56:31.608115137Z" level=info msg="TearDown network for sandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" successfully" Mar 17 17:56:31.621076 containerd[1540]: time="2025-03-17T17:56:31.621046675Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.621122 containerd[1540]: time="2025-03-17T17:56:31.621075416Z" level=info msg="RemovePodSandbox \"9035eb878bbb9db19ef0164a9e08ed39a54db0d72bc6f88717e64aa32a271402\" returns successfully" Mar 17 17:56:31.621262 containerd[1540]: time="2025-03-17T17:56:31.621246128Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\"" Mar 17 17:56:31.621384 containerd[1540]: time="2025-03-17T17:56:31.621293993Z" level=info msg="TearDown network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" successfully" Mar 17 17:56:31.621384 containerd[1540]: time="2025-03-17T17:56:31.621301232Z" level=info msg="StopPodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" returns successfully" Mar 17 17:56:31.621994 containerd[1540]: time="2025-03-17T17:56:31.621535403Z" level=info msg="RemovePodSandbox for \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\"" Mar 17 17:56:31.621994 containerd[1540]: time="2025-03-17T17:56:31.621547726Z" level=info msg="Forcibly stopping sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\"" Mar 17 17:56:31.621994 containerd[1540]: time="2025-03-17T17:56:31.621620685Z" level=info msg="TearDown network for sandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" successfully" Mar 17 17:56:31.651866 containerd[1540]: time="2025-03-17T17:56:31.651785946Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.651866 containerd[1540]: time="2025-03-17T17:56:31.651828844Z" level=info msg="RemovePodSandbox \"4c15404d298bba092aed5580d558be5aec331fadc65f529591a5c23dceb7e459\" returns successfully"
Mar 17 17:56:31.653064 containerd[1540]: time="2025-03-17T17:56:31.652075916Z" level=info msg="StopPodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\""
Mar 17 17:56:31.653064 containerd[1540]: time="2025-03-17T17:56:31.652172412Z" level=info msg="TearDown network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" successfully"
Mar 17 17:56:31.653064 containerd[1540]: time="2025-03-17T17:56:31.652183249Z" level=info msg="StopPodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" returns successfully"
Mar 17 17:56:31.653064 containerd[1540]: time="2025-03-17T17:56:31.652334255Z" level=info msg="RemovePodSandbox for \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\""
Mar 17 17:56:31.653064 containerd[1540]: time="2025-03-17T17:56:31.652347225Z" level=info msg="Forcibly stopping sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\""
Mar 17 17:56:31.653064 containerd[1540]: time="2025-03-17T17:56:31.652383745Z" level=info msg="TearDown network for sandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" successfully"
Mar 17 17:56:31.675673 containerd[1540]: time="2025-03-17T17:56:31.675596464Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.675673 containerd[1540]: time="2025-03-17T17:56:31.675641026Z" level=info msg="RemovePodSandbox \"268ce4d7d593b5335ad33a787a9d6b061b156506ec1c400a079339aecc1d5a03\" returns successfully"
Mar 17 17:56:31.676659 containerd[1540]: time="2025-03-17T17:56:31.675928936Z" level=info msg="StopPodSandbox for \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\""
Mar 17 17:56:31.676659 containerd[1540]: time="2025-03-17T17:56:31.675999710Z" level=info msg="TearDown network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\" successfully"
Mar 17 17:56:31.676659 containerd[1540]: time="2025-03-17T17:56:31.676031362Z" level=info msg="StopPodSandbox for \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\" returns successfully"
Mar 17 17:56:31.676659 containerd[1540]: time="2025-03-17T17:56:31.676219745Z" level=info msg="RemovePodSandbox for \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\""
Mar 17 17:56:31.676659 containerd[1540]: time="2025-03-17T17:56:31.676298300Z" level=info msg="Forcibly stopping sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\""
Mar 17 17:56:31.676659 containerd[1540]: time="2025-03-17T17:56:31.676357375Z" level=info msg="TearDown network for sandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\" successfully"
Mar 17 17:56:31.703416 containerd[1540]: time="2025-03-17T17:56:31.703389389Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.703471 containerd[1540]: time="2025-03-17T17:56:31.703423383Z" level=info msg="RemovePodSandbox \"1530834fcca4a3a79b1320294144c41b1aa5d1a4bd6e24be0b908aa7b633aeae\" returns successfully"
Mar 17 17:56:31.703639 containerd[1540]: time="2025-03-17T17:56:31.703618185Z" level=info msg="StopPodSandbox for \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\""
Mar 17 17:56:31.703721 containerd[1540]: time="2025-03-17T17:56:31.703679244Z" level=info msg="TearDown network for sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\" successfully"
Mar 17 17:56:31.703721 containerd[1540]: time="2025-03-17T17:56:31.703688857Z" level=info msg="StopPodSandbox for \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\" returns successfully"
Mar 17 17:56:31.704026 containerd[1540]: time="2025-03-17T17:56:31.703891014Z" level=info msg="RemovePodSandbox for \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\""
Mar 17 17:56:31.704026 containerd[1540]: time="2025-03-17T17:56:31.703910564Z" level=info msg="Forcibly stopping sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\""
Mar 17 17:56:31.704026 containerd[1540]: time="2025-03-17T17:56:31.703980636Z" level=info msg="TearDown network for sandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\" successfully"
Mar 17 17:56:31.744068 containerd[1540]: time="2025-03-17T17:56:31.744040405Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.744155 containerd[1540]: time="2025-03-17T17:56:31.744077041Z" level=info msg="RemovePodSandbox \"3e932c96b4ac81c6daeeef34eecd127bd211d26329282e37c64062c39bfcfea3\" returns successfully"
Mar 17 17:56:31.744320 containerd[1540]: time="2025-03-17T17:56:31.744292178Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\""
Mar 17 17:56:31.744385 containerd[1540]: time="2025-03-17T17:56:31.744352762Z" level=info msg="TearDown network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" successfully"
Mar 17 17:56:31.744385 containerd[1540]: time="2025-03-17T17:56:31.744363179Z" level=info msg="StopPodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" returns successfully"
Mar 17 17:56:31.744524 containerd[1540]: time="2025-03-17T17:56:31.744503370Z" level=info msg="RemovePodSandbox for \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\""
Mar 17 17:56:31.744524 containerd[1540]: time="2025-03-17T17:56:31.744517377Z" level=info msg="Forcibly stopping sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\""
Mar 17 17:56:31.744679 containerd[1540]: time="2025-03-17T17:56:31.744601410Z" level=info msg="TearDown network for sandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" successfully"
Mar 17 17:56:31.768250 containerd[1540]: time="2025-03-17T17:56:31.768228003Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.784219 containerd[1540]: time="2025-03-17T17:56:31.768260302Z" level=info msg="RemovePodSandbox \"60c0832caf21b0582e9227930340976601f7b178de8302b4c5750a5b394d25b8\" returns successfully"
Mar 17 17:56:31.784219 containerd[1540]: time="2025-03-17T17:56:31.768518576Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\""
Mar 17 17:56:31.784219 containerd[1540]: time="2025-03-17T17:56:31.768590199Z" level=info msg="TearDown network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" successfully"
Mar 17 17:56:31.784219 containerd[1540]: time="2025-03-17T17:56:31.768601291Z" level=info msg="StopPodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" returns successfully"
Mar 17 17:56:31.784219 containerd[1540]: time="2025-03-17T17:56:31.768830248Z" level=info msg="RemovePodSandbox for \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\""
Mar 17 17:56:31.784219 containerd[1540]: time="2025-03-17T17:56:31.768844148Z" level=info msg="Forcibly stopping sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\""
Mar 17 17:56:31.784219 containerd[1540]: time="2025-03-17T17:56:31.768884738Z" level=info msg="TearDown network for sandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" successfully"
Mar 17 17:56:31.785200 containerd[1540]: time="2025-03-17T17:56:31.785122291Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.785200 containerd[1540]: time="2025-03-17T17:56:31.785156263Z" level=info msg="RemovePodSandbox \"076333e6976f722bcf28e61a97f687046873fec2552d97b6dec22b3a1f58ada4\" returns successfully"
Mar 17 17:56:31.785505 containerd[1540]: time="2025-03-17T17:56:31.785462891Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\""
Mar 17 17:56:31.785597 containerd[1540]: time="2025-03-17T17:56:31.785552156Z" level=info msg="TearDown network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" successfully"
Mar 17 17:56:31.785597 containerd[1540]: time="2025-03-17T17:56:31.785594661Z" level=info msg="StopPodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" returns successfully"
Mar 17 17:56:31.786110 containerd[1540]: time="2025-03-17T17:56:31.785992876Z" level=info msg="RemovePodSandbox for \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\""
Mar 17 17:56:31.786110 containerd[1540]: time="2025-03-17T17:56:31.786008025Z" level=info msg="Forcibly stopping sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\""
Mar 17 17:56:31.786110 containerd[1540]: time="2025-03-17T17:56:31.786090993Z" level=info msg="TearDown network for sandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" successfully"
Mar 17 17:56:31.787619 containerd[1540]: time="2025-03-17T17:56:31.787595921Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.787659 containerd[1540]: time="2025-03-17T17:56:31.787642592Z" level=info msg="RemovePodSandbox \"a81861b04440b70116d3e71575170a0140a481ba7b3f7dfd11424cf2fb440db1\" returns successfully"
Mar 17 17:56:31.787839 containerd[1540]: time="2025-03-17T17:56:31.787820739Z" level=info msg="StopPodSandbox for \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\""
Mar 17 17:56:31.787893 containerd[1540]: time="2025-03-17T17:56:31.787877744Z" level=info msg="TearDown network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" successfully"
Mar 17 17:56:31.787893 containerd[1540]: time="2025-03-17T17:56:31.787888404Z" level=info msg="StopPodSandbox for \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" returns successfully"
Mar 17 17:56:31.788099 containerd[1540]: time="2025-03-17T17:56:31.788081330Z" level=info msg="RemovePodSandbox for \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\""
Mar 17 17:56:31.788186 containerd[1540]: time="2025-03-17T17:56:31.788167881Z" level=info msg="Forcibly stopping sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\""
Mar 17 17:56:31.788263 containerd[1540]: time="2025-03-17T17:56:31.788234318Z" level=info msg="TearDown network for sandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" successfully"
Mar 17 17:56:31.790033 containerd[1540]: time="2025-03-17T17:56:31.790012501Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.790085 containerd[1540]: time="2025-03-17T17:56:31.790043914Z" level=info msg="RemovePodSandbox \"b6e81029d0c41d0248f3e55ccebd350d1d9e8f6f19adb65d37652cb3690a4c13\" returns successfully"
Mar 17 17:56:31.790243 containerd[1540]: time="2025-03-17T17:56:31.790229186Z" level=info msg="StopPodSandbox for \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\""
Mar 17 17:56:31.790483 containerd[1540]: time="2025-03-17T17:56:31.790408585Z" level=info msg="TearDown network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\" successfully"
Mar 17 17:56:31.790483 containerd[1540]: time="2025-03-17T17:56:31.790421005Z" level=info msg="StopPodSandbox for \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\" returns successfully"
Mar 17 17:56:31.791637 containerd[1540]: time="2025-03-17T17:56:31.790677742Z" level=info msg="RemovePodSandbox for \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\""
Mar 17 17:56:31.791637 containerd[1540]: time="2025-03-17T17:56:31.790693764Z" level=info msg="Forcibly stopping sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\""
Mar 17 17:56:31.791637 containerd[1540]: time="2025-03-17T17:56:31.790752754Z" level=info msg="TearDown network for sandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\" successfully"
Mar 17 17:56:31.792095 containerd[1540]: time="2025-03-17T17:56:31.792074940Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.792173 containerd[1540]: time="2025-03-17T17:56:31.792103397Z" level=info msg="RemovePodSandbox \"e7f190f10b426954ee72015fdf60822d3f24a30b27cebfacf6724ea2ed65df9b\" returns successfully"
Mar 17 17:56:31.792407 containerd[1540]: time="2025-03-17T17:56:31.792302038Z" level=info msg="StopPodSandbox for \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\""
Mar 17 17:56:31.792407 containerd[1540]: time="2025-03-17T17:56:31.792364945Z" level=info msg="TearDown network for sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\" successfully"
Mar 17 17:56:31.792407 containerd[1540]: time="2025-03-17T17:56:31.792373486Z" level=info msg="StopPodSandbox for \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\" returns successfully"
Mar 17 17:56:31.792861 containerd[1540]: time="2025-03-17T17:56:31.792741943Z" level=info msg="RemovePodSandbox for \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\""
Mar 17 17:56:31.792861 containerd[1540]: time="2025-03-17T17:56:31.792757511Z" level=info msg="Forcibly stopping sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\""
Mar 17 17:56:31.792861 containerd[1540]: time="2025-03-17T17:56:31.792836057Z" level=info msg="TearDown network for sandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\" successfully"
Mar 17 17:56:31.794587 containerd[1540]: time="2025-03-17T17:56:31.794538538Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.794661 containerd[1540]: time="2025-03-17T17:56:31.794589731Z" level=info msg="RemovePodSandbox \"c1c37863c46b64c1352706266f3e8af61af378070995a11139348afed8c51719\" returns successfully"
Mar 17 17:56:31.794800 containerd[1540]: time="2025-03-17T17:56:31.794781046Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\""
Mar 17 17:56:31.794867 containerd[1540]: time="2025-03-17T17:56:31.794835510Z" level=info msg="TearDown network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" successfully"
Mar 17 17:56:31.794867 containerd[1540]: time="2025-03-17T17:56:31.794863912Z" level=info msg="StopPodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" returns successfully"
Mar 17 17:56:31.796054 containerd[1540]: time="2025-03-17T17:56:31.795052666Z" level=info msg="RemovePodSandbox for \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\""
Mar 17 17:56:31.796054 containerd[1540]: time="2025-03-17T17:56:31.795070353Z" level=info msg="Forcibly stopping sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\""
Mar 17 17:56:31.796054 containerd[1540]: time="2025-03-17T17:56:31.795112085Z" level=info msg="TearDown network for sandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" successfully"
Mar 17 17:56:31.796625 containerd[1540]: time="2025-03-17T17:56:31.796609015Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.796700 containerd[1540]: time="2025-03-17T17:56:31.796688587Z" level=info msg="RemovePodSandbox \"9ed90f5590cd65c4c8a79c34e4be33846b342e63c0c0b0f64c15be99cfec42a1\" returns successfully"
Mar 17 17:56:31.797041 containerd[1540]: time="2025-03-17T17:56:31.797024171Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\""
Mar 17 17:56:31.797163 containerd[1540]: time="2025-03-17T17:56:31.797150746Z" level=info msg="TearDown network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" successfully"
Mar 17 17:56:31.797242 containerd[1540]: time="2025-03-17T17:56:31.797230632Z" level=info msg="StopPodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" returns successfully"
Mar 17 17:56:31.797524 containerd[1540]: time="2025-03-17T17:56:31.797510098Z" level=info msg="RemovePodSandbox for \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\""
Mar 17 17:56:31.797643 containerd[1540]: time="2025-03-17T17:56:31.797632176Z" level=info msg="Forcibly stopping sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\""
Mar 17 17:56:31.797888 containerd[1540]: time="2025-03-17T17:56:31.797857338Z" level=info msg="TearDown network for sandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" successfully"
Mar 17 17:56:31.799369 containerd[1540]: time="2025-03-17T17:56:31.799334090Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.799468 containerd[1540]: time="2025-03-17T17:56:31.799454157Z" level=info msg="RemovePodSandbox \"6075cc49b55e1dda1c0f9a7577484d550319f0cc97ff8cf3f86324644580f8ab\" returns successfully"
Mar 17 17:56:31.799721 containerd[1540]: time="2025-03-17T17:56:31.799707233Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\""
Mar 17 17:56:31.799845 containerd[1540]: time="2025-03-17T17:56:31.799833723Z" level=info msg="TearDown network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" successfully"
Mar 17 17:56:31.799939 containerd[1540]: time="2025-03-17T17:56:31.799920173Z" level=info msg="StopPodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" returns successfully"
Mar 17 17:56:31.800448 containerd[1540]: time="2025-03-17T17:56:31.800174470Z" level=info msg="RemovePodSandbox for \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\""
Mar 17 17:56:31.800448 containerd[1540]: time="2025-03-17T17:56:31.800189226Z" level=info msg="Forcibly stopping sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\""
Mar 17 17:56:31.800448 containerd[1540]: time="2025-03-17T17:56:31.800229106Z" level=info msg="TearDown network for sandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" successfully"
Mar 17 17:56:31.801885 containerd[1540]: time="2025-03-17T17:56:31.801817876Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.801885 containerd[1540]: time="2025-03-17T17:56:31.801845610Z" level=info msg="RemovePodSandbox \"b88cc2a9e0ac64af6e0d46e15fa1a909bb358f2489d1508ade1e0d99790d3ae6\" returns successfully"
Mar 17 17:56:31.802144 containerd[1540]: time="2025-03-17T17:56:31.802128723Z" level=info msg="StopPodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\""
Mar 17 17:56:31.802198 containerd[1540]: time="2025-03-17T17:56:31.802180733Z" level=info msg="TearDown network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" successfully"
Mar 17 17:56:31.802198 containerd[1540]: time="2025-03-17T17:56:31.802193731Z" level=info msg="StopPodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" returns successfully"
Mar 17 17:56:31.802413 containerd[1540]: time="2025-03-17T17:56:31.802396575Z" level=info msg="RemovePodSandbox for \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\""
Mar 17 17:56:31.802459 containerd[1540]: time="2025-03-17T17:56:31.802434557Z" level=info msg="Forcibly stopping sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\""
Mar 17 17:56:31.802496 containerd[1540]: time="2025-03-17T17:56:31.802474370Z" level=info msg="TearDown network for sandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" successfully"
Mar 17 17:56:31.803981 containerd[1540]: time="2025-03-17T17:56:31.803960417Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.804016 containerd[1540]: time="2025-03-17T17:56:31.803989197Z" level=info msg="RemovePodSandbox \"b11563edb88da946a4612a7b86adb9cd15935f211b6d977fa3926d13b1633017\" returns successfully"
Mar 17 17:56:31.804266 containerd[1540]: time="2025-03-17T17:56:31.804246136Z" level=info msg="StopPodSandbox for \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\""
Mar 17 17:56:31.804335 containerd[1540]: time="2025-03-17T17:56:31.804317328Z" level=info msg="TearDown network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\" successfully"
Mar 17 17:56:31.804335 containerd[1540]: time="2025-03-17T17:56:31.804331960Z" level=info msg="StopPodSandbox for \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\" returns successfully"
Mar 17 17:56:31.804488 containerd[1540]: time="2025-03-17T17:56:31.804468779Z" level=info msg="RemovePodSandbox for \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\""
Mar 17 17:56:31.804488 containerd[1540]: time="2025-03-17T17:56:31.804487413Z" level=info msg="Forcibly stopping sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\""
Mar 17 17:56:31.804553 containerd[1540]: time="2025-03-17T17:56:31.804526544Z" level=info msg="TearDown network for sandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\" successfully"
Mar 17 17:56:31.805910 containerd[1540]: time="2025-03-17T17:56:31.805888990Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.805954 containerd[1540]: time="2025-03-17T17:56:31.805916729Z" level=info msg="RemovePodSandbox \"d4a655794371d5e6e747fa49e24bff7cb2061d4063a2290ed255b5721af29f32\" returns successfully"
Mar 17 17:56:31.806122 containerd[1540]: time="2025-03-17T17:56:31.806105090Z" level=info msg="StopPodSandbox for \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\""
Mar 17 17:56:31.806192 containerd[1540]: time="2025-03-17T17:56:31.806175674Z" level=info msg="TearDown network for sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\" successfully"
Mar 17 17:56:31.806192 containerd[1540]: time="2025-03-17T17:56:31.806189031Z" level=info msg="StopPodSandbox for \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\" returns successfully"
Mar 17 17:56:31.806365 containerd[1540]: time="2025-03-17T17:56:31.806349108Z" level=info msg="RemovePodSandbox for \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\""
Mar 17 17:56:31.806398 containerd[1540]: time="2025-03-17T17:56:31.806380958Z" level=info msg="Forcibly stopping sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\""
Mar 17 17:56:31.806454 containerd[1540]: time="2025-03-17T17:56:31.806424418Z" level=info msg="TearDown network for sandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\" successfully"
Mar 17 17:56:31.807919 containerd[1540]: time="2025-03-17T17:56:31.807896725Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.807968 containerd[1540]: time="2025-03-17T17:56:31.807926110Z" level=info msg="RemovePodSandbox \"0d134871dea5c8af2ba9b7995e0a14e01db1596ccfe21db5ba1b426f2ef01423\" returns successfully"
Mar 17 17:56:31.808346 containerd[1540]: time="2025-03-17T17:56:31.808158418Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\""
Mar 17 17:56:31.808346 containerd[1540]: time="2025-03-17T17:56:31.808211126Z" level=info msg="TearDown network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully"
Mar 17 17:56:31.808346 containerd[1540]: time="2025-03-17T17:56:31.808220204Z" level=info msg="StopPodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully"
Mar 17 17:56:31.808449 containerd[1540]: time="2025-03-17T17:56:31.808402004Z" level=info msg="RemovePodSandbox for \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\""
Mar 17 17:56:31.808449 containerd[1540]: time="2025-03-17T17:56:31.808416736Z" level=info msg="Forcibly stopping sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\""
Mar 17 17:56:31.808499 containerd[1540]: time="2025-03-17T17:56:31.808457922Z" level=info msg="TearDown network for sandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" successfully"
Mar 17 17:56:31.809881 containerd[1540]: time="2025-03-17T17:56:31.809861643Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.810003 containerd[1540]: time="2025-03-17T17:56:31.809886771Z" level=info msg="RemovePodSandbox \"b908cca0b67f5ef38c2b78d7e508a45b96f87653bb2000569bd1692c172b29ca\" returns successfully"
Mar 17 17:56:31.810106 containerd[1540]: time="2025-03-17T17:56:31.810090664Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\""
Mar 17 17:56:31.810183 containerd[1540]: time="2025-03-17T17:56:31.810168067Z" level=info msg="TearDown network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" successfully"
Mar 17 17:56:31.810214 containerd[1540]: time="2025-03-17T17:56:31.810181173Z" level=info msg="StopPodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" returns successfully"
Mar 17 17:56:31.810393 containerd[1540]: time="2025-03-17T17:56:31.810355852Z" level=info msg="RemovePodSandbox for \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\""
Mar 17 17:56:31.810393 containerd[1540]: time="2025-03-17T17:56:31.810370467Z" level=info msg="Forcibly stopping sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\""
Mar 17 17:56:31.810446 containerd[1540]: time="2025-03-17T17:56:31.810428468Z" level=info msg="TearDown network for sandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" successfully"
Mar 17 17:56:31.811743 containerd[1540]: time="2025-03-17T17:56:31.811724805Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.811859 containerd[1540]: time="2025-03-17T17:56:31.811750533Z" level=info msg="RemovePodSandbox \"0b841033c212c7bd063dd69b0e3d0e773378280cdfae65719d06acafcb904bfa\" returns successfully"
Mar 17 17:56:31.811943 containerd[1540]: time="2025-03-17T17:56:31.811926425Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\""
Mar 17 17:56:31.811997 containerd[1540]: time="2025-03-17T17:56:31.811980724Z" level=info msg="TearDown network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" successfully"
Mar 17 17:56:31.812028 containerd[1540]: time="2025-03-17T17:56:31.812007184Z" level=info msg="StopPodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" returns successfully"
Mar 17 17:56:31.812703 containerd[1540]: time="2025-03-17T17:56:31.812213940Z" level=info msg="RemovePodSandbox for \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\""
Mar 17 17:56:31.812703 containerd[1540]: time="2025-03-17T17:56:31.812232325Z" level=info msg="Forcibly stopping sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\""
Mar 17 17:56:31.812703 containerd[1540]: time="2025-03-17T17:56:31.812272658Z" level=info msg="TearDown network for sandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" successfully"
Mar 17 17:56:31.813643 containerd[1540]: time="2025-03-17T17:56:31.813624037Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.813767 containerd[1540]: time="2025-03-17T17:56:31.813650508Z" level=info msg="RemovePodSandbox \"78c7e17077f41a0f4a15de908b002eb2a0b2aa401b669c7b9b37bc6861108cef\" returns successfully"
Mar 17 17:56:31.813952 containerd[1540]: time="2025-03-17T17:56:31.813853292Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\""
Mar 17 17:56:31.813952 containerd[1540]: time="2025-03-17T17:56:31.813900699Z" level=info msg="TearDown network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" successfully"
Mar 17 17:56:31.813952 containerd[1540]: time="2025-03-17T17:56:31.813908094Z" level=info msg="StopPodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" returns successfully"
Mar 17 17:56:31.814202 containerd[1540]: time="2025-03-17T17:56:31.814129813Z" level=info msg="RemovePodSandbox for \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\""
Mar 17 17:56:31.814202 containerd[1540]: time="2025-03-17T17:56:31.814144313Z" level=info msg="Forcibly stopping sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\""
Mar 17 17:56:31.814668 containerd[1540]: time="2025-03-17T17:56:31.814299572Z" level=info msg="TearDown network for sandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" successfully"
Mar 17 17:56:31.815605 containerd[1540]: time="2025-03-17T17:56:31.815592360Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:56:31.815675 containerd[1540]: time="2025-03-17T17:56:31.815664661Z" level=info msg="RemovePodSandbox \"f7c46f3d3d760eff4cce983aa2181b6d809d28abb7774821116dcc138cb742d4\" returns successfully" Mar 17 17:56:31.815887 containerd[1540]: time="2025-03-17T17:56:31.815875734Z" level=info msg="StopPodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\"" Mar 17 17:56:31.815988 containerd[1540]: time="2025-03-17T17:56:31.815978200Z" level=info msg="TearDown network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" successfully" Mar 17 17:56:31.816024 containerd[1540]: time="2025-03-17T17:56:31.816017821Z" level=info msg="StopPodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" returns successfully" Mar 17 17:56:31.816178 containerd[1540]: time="2025-03-17T17:56:31.816164595Z" level=info msg="RemovePodSandbox for \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\"" Mar 17 17:56:31.816214 containerd[1540]: time="2025-03-17T17:56:31.816179459Z" level=info msg="Forcibly stopping sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\"" Mar 17 17:56:31.816281 containerd[1540]: time="2025-03-17T17:56:31.816225225Z" level=info msg="TearDown network for sandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" successfully" Mar 17 17:56:31.833587 containerd[1540]: time="2025-03-17T17:56:31.833571283Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.833646 containerd[1540]: time="2025-03-17T17:56:31.833595194Z" level=info msg="RemovePodSandbox \"27f9fd0f8df5605db7400883bbf779cfc62783fde2252aded904f28c693d0911\" returns successfully" Mar 17 17:56:31.834036 containerd[1540]: time="2025-03-17T17:56:31.833902794Z" level=info msg="StopPodSandbox for \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\"" Mar 17 17:56:31.834036 containerd[1540]: time="2025-03-17T17:56:31.833947429Z" level=info msg="TearDown network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\" successfully" Mar 17 17:56:31.834036 containerd[1540]: time="2025-03-17T17:56:31.833953564Z" level=info msg="StopPodSandbox for \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\" returns successfully" Mar 17 17:56:31.834693 containerd[1540]: time="2025-03-17T17:56:31.834147298Z" level=info msg="RemovePodSandbox for \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\"" Mar 17 17:56:31.834693 containerd[1540]: time="2025-03-17T17:56:31.834159236Z" level=info msg="Forcibly stopping sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\"" Mar 17 17:56:31.834693 containerd[1540]: time="2025-03-17T17:56:31.834196510Z" level=info msg="TearDown network for sandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\" successfully" Mar 17 17:56:31.835271 containerd[1540]: time="2025-03-17T17:56:31.835256999Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.835298 containerd[1540]: time="2025-03-17T17:56:31.835278398Z" level=info msg="RemovePodSandbox \"02cb0ff2f9769b7a460411846f0ad514d8edd820a2cf80cc0ce7b02ed99ab8f0\" returns successfully" Mar 17 17:56:31.835555 containerd[1540]: time="2025-03-17T17:56:31.835432797Z" level=info msg="StopPodSandbox for \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\"" Mar 17 17:56:31.835555 containerd[1540]: time="2025-03-17T17:56:31.835469764Z" level=info msg="TearDown network for sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\" successfully" Mar 17 17:56:31.835555 containerd[1540]: time="2025-03-17T17:56:31.835476156Z" level=info msg="StopPodSandbox for \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\" returns successfully" Mar 17 17:56:31.836625 containerd[1540]: time="2025-03-17T17:56:31.835774606Z" level=info msg="RemovePodSandbox for \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\"" Mar 17 17:56:31.836625 containerd[1540]: time="2025-03-17T17:56:31.835785441Z" level=info msg="Forcibly stopping sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\"" Mar 17 17:56:31.836625 containerd[1540]: time="2025-03-17T17:56:31.835864302Z" level=info msg="TearDown network for sandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\" successfully" Mar 17 17:56:31.837075 containerd[1540]: time="2025-03-17T17:56:31.837063857Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.837121 containerd[1540]: time="2025-03-17T17:56:31.837114073Z" level=info msg="RemovePodSandbox \"0bc6431945f6baaa595b0154e3f8ce1aa3efc343f91d50e951c286f256c2c878\" returns successfully" Mar 17 17:56:31.837365 containerd[1540]: time="2025-03-17T17:56:31.837290761Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:31.837365 containerd[1540]: time="2025-03-17T17:56:31.837328096Z" level=info msg="TearDown network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" successfully" Mar 17 17:56:31.837365 containerd[1540]: time="2025-03-17T17:56:31.837333781Z" level=info msg="StopPodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" returns successfully" Mar 17 17:56:31.838140 containerd[1540]: time="2025-03-17T17:56:31.837587632Z" level=info msg="RemovePodSandbox for \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:31.838140 containerd[1540]: time="2025-03-17T17:56:31.837598801Z" level=info msg="Forcibly stopping sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\"" Mar 17 17:56:31.838140 containerd[1540]: time="2025-03-17T17:56:31.837627699Z" level=info msg="TearDown network for sandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" successfully" Mar 17 17:56:31.838765 containerd[1540]: time="2025-03-17T17:56:31.838748875Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.838792 containerd[1540]: time="2025-03-17T17:56:31.838771293Z" level=info msg="RemovePodSandbox \"0af549feee73f1c2fd73f6e749d4ea86572ea39ec3895d97dbd58990e554e8cc\" returns successfully" Mar 17 17:56:31.839063 containerd[1540]: time="2025-03-17T17:56:31.838938736Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" Mar 17 17:56:31.839063 containerd[1540]: time="2025-03-17T17:56:31.838978784Z" level=info msg="TearDown network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" successfully" Mar 17 17:56:31.839063 containerd[1540]: time="2025-03-17T17:56:31.838984761Z" level=info msg="StopPodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" returns successfully" Mar 17 17:56:31.839862 containerd[1540]: time="2025-03-17T17:56:31.839186122Z" level=info msg="RemovePodSandbox for \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" Mar 17 17:56:31.839862 containerd[1540]: time="2025-03-17T17:56:31.839197754Z" level=info msg="Forcibly stopping sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\"" Mar 17 17:56:31.839862 containerd[1540]: time="2025-03-17T17:56:31.839226188Z" level=info msg="TearDown network for sandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" successfully" Mar 17 17:56:31.840373 containerd[1540]: time="2025-03-17T17:56:31.840360817Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.840421 containerd[1540]: time="2025-03-17T17:56:31.840413847Z" level=info msg="RemovePodSandbox \"0ca1425134ac73e645c74504471fb98495eacf875e469580ecf24e22189ab8dd\" returns successfully" Mar 17 17:56:31.840589 containerd[1540]: time="2025-03-17T17:56:31.840574605Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\"" Mar 17 17:56:31.840620 containerd[1540]: time="2025-03-17T17:56:31.840615459Z" level=info msg="TearDown network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" successfully" Mar 17 17:56:31.840641 containerd[1540]: time="2025-03-17T17:56:31.840621929Z" level=info msg="StopPodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" returns successfully" Mar 17 17:56:31.840811 containerd[1540]: time="2025-03-17T17:56:31.840797834Z" level=info msg="RemovePodSandbox for \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\"" Mar 17 17:56:31.840811 containerd[1540]: time="2025-03-17T17:56:31.840811097Z" level=info msg="Forcibly stopping sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\"" Mar 17 17:56:31.841310 containerd[1540]: time="2025-03-17T17:56:31.840840433Z" level=info msg="TearDown network for sandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" successfully" Mar 17 17:56:31.842031 containerd[1540]: time="2025-03-17T17:56:31.842014418Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.842091 containerd[1540]: time="2025-03-17T17:56:31.842034096Z" level=info msg="RemovePodSandbox \"446cc2b9c82f10f08686decea8b081a72c71ff4e62201d1e63c35ed2ddbf44cd\" returns successfully" Mar 17 17:56:31.842389 containerd[1540]: time="2025-03-17T17:56:31.842269793Z" level=info msg="StopPodSandbox for \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\"" Mar 17 17:56:31.842389 containerd[1540]: time="2025-03-17T17:56:31.842310047Z" level=info msg="TearDown network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" successfully" Mar 17 17:56:31.842389 containerd[1540]: time="2025-03-17T17:56:31.842316814Z" level=info msg="StopPodSandbox for \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" returns successfully" Mar 17 17:56:31.842517 containerd[1540]: time="2025-03-17T17:56:31.842503904Z" level=info msg="RemovePodSandbox for \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\"" Mar 17 17:56:31.842517 containerd[1540]: time="2025-03-17T17:56:31.842516975Z" level=info msg="Forcibly stopping sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\"" Mar 17 17:56:31.842568 containerd[1540]: time="2025-03-17T17:56:31.842546821Z" level=info msg="TearDown network for sandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" successfully" Mar 17 17:56:31.843761 containerd[1540]: time="2025-03-17T17:56:31.843745986Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.843855 containerd[1540]: time="2025-03-17T17:56:31.843765006Z" level=info msg="RemovePodSandbox \"d59667e84f96cef08a5d8ab8e8bb7ac9d4b41d86ed7b594f58651ae2fd2cf634\" returns successfully" Mar 17 17:56:31.843899 containerd[1540]: time="2025-03-17T17:56:31.843887027Z" level=info msg="StopPodSandbox for \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\"" Mar 17 17:56:31.844283 containerd[1540]: time="2025-03-17T17:56:31.843922595Z" level=info msg="TearDown network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\" successfully" Mar 17 17:56:31.844283 containerd[1540]: time="2025-03-17T17:56:31.843929898Z" level=info msg="StopPodSandbox for \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\" returns successfully" Mar 17 17:56:31.844283 containerd[1540]: time="2025-03-17T17:56:31.844042629Z" level=info msg="RemovePodSandbox for \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\"" Mar 17 17:56:31.844283 containerd[1540]: time="2025-03-17T17:56:31.844053754Z" level=info msg="Forcibly stopping sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\"" Mar 17 17:56:31.844283 containerd[1540]: time="2025-03-17T17:56:31.844085616Z" level=info msg="TearDown network for sandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\" successfully" Mar 17 17:56:31.849134 containerd[1540]: time="2025-03-17T17:56:31.849122275Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.849187 containerd[1540]: time="2025-03-17T17:56:31.849178875Z" level=info msg="RemovePodSandbox \"87b33971c06161a9843c3dac0914aacc4842ccaaf882aad28e050b4d17a432e4\" returns successfully" Mar 17 17:56:31.849366 containerd[1540]: time="2025-03-17T17:56:31.849332352Z" level=info msg="StopPodSandbox for \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\"" Mar 17 17:56:31.849426 containerd[1540]: time="2025-03-17T17:56:31.849408854Z" level=info msg="TearDown network for sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\" successfully" Mar 17 17:56:31.849426 containerd[1540]: time="2025-03-17T17:56:31.849415958Z" level=info msg="StopPodSandbox for \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\" returns successfully" Mar 17 17:56:31.850322 containerd[1540]: time="2025-03-17T17:56:31.849542153Z" level=info msg="RemovePodSandbox for \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\"" Mar 17 17:56:31.850322 containerd[1540]: time="2025-03-17T17:56:31.849626775Z" level=info msg="Forcibly stopping sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\"" Mar 17 17:56:31.850322 containerd[1540]: time="2025-03-17T17:56:31.849659281Z" level=info msg="TearDown network for sandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\" successfully" Mar 17 17:56:31.850708 containerd[1540]: time="2025-03-17T17:56:31.850695829Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:56:31.850771 containerd[1540]: time="2025-03-17T17:56:31.850761559Z" level=info msg="RemovePodSandbox \"ebbc975ca087e0165267fb7717f950865533c9d0ba7733f465e94cfc674caea7\" returns successfully" Mar 17 17:56:33.012591 kubelet[2850]: I0317 17:56:33.012363 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-2cntr" podStartSLOduration=27.953669822 podStartE2EDuration="39.012349693s" podCreationTimestamp="2025-03-17 17:55:54 +0000 UTC" firstStartedPulling="2025-03-17 17:56:14.705598198 +0000 UTC m=+43.546737458" lastFinishedPulling="2025-03-17 17:56:25.764278069 +0000 UTC m=+54.605417329" observedRunningTime="2025-03-17 17:56:26.0877629 +0000 UTC m=+54.928902169" watchObservedRunningTime="2025-03-17 17:56:33.012349693 +0000 UTC m=+61.853488960" Mar 17 17:56:34.440602 containerd[1540]: time="2025-03-17T17:56:34.440495086Z" level=info msg="StopContainer for \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\" with timeout 300 (s)" Mar 17 17:56:34.444556 containerd[1540]: time="2025-03-17T17:56:34.442603933Z" level=info msg="Stop container \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\" with signal terminated" Mar 17 17:56:34.513580 containerd[1540]: time="2025-03-17T17:56:34.512507450Z" level=info msg="StopContainer for \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\" with timeout 30 (s)" Mar 17 17:56:34.514423 containerd[1540]: time="2025-03-17T17:56:34.514403739Z" level=info msg="Stop container \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\" with signal terminated" Mar 17 17:56:34.535553 systemd[1]: cri-containerd-9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67.scope: Deactivated successfully. Mar 17 17:56:34.578306 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67-rootfs.mount: Deactivated successfully. 
Mar 17 17:56:34.581952 containerd[1540]: time="2025-03-17T17:56:34.576832331Z" level=info msg="shim disconnected" id=9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67 namespace=k8s.io Mar 17 17:56:34.586775 containerd[1540]: time="2025-03-17T17:56:34.586648137Z" level=warning msg="cleaning up after shim disconnected" id=9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67 namespace=k8s.io Mar 17 17:56:34.586775 containerd[1540]: time="2025-03-17T17:56:34.586675623Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:56:34.628397 containerd[1540]: time="2025-03-17T17:56:34.628291022Z" level=info msg="StopContainer for \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\" with timeout 5 (s)" Mar 17 17:56:34.628498 containerd[1540]: time="2025-03-17T17:56:34.628475356Z" level=info msg="Stop container \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\" with signal terminated" Mar 17 17:56:34.672641 systemd[1]: cri-containerd-8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43.scope: Deactivated successfully. Mar 17 17:56:34.672794 systemd[1]: cri-containerd-8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43.scope: Consumed 1.025s CPU time. Mar 17 17:56:34.699188 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43-rootfs.mount: Deactivated successfully. 
Mar 17 17:56:34.701363 containerd[1540]: time="2025-03-17T17:56:34.701314188Z" level=info msg="shim disconnected" id=8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43 namespace=k8s.io Mar 17 17:56:34.701363 containerd[1540]: time="2025-03-17T17:56:34.701362106Z" level=warning msg="cleaning up after shim disconnected" id=8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43 namespace=k8s.io Mar 17 17:56:34.701689 containerd[1540]: time="2025-03-17T17:56:34.701368765Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:56:34.813554 containerd[1540]: time="2025-03-17T17:56:34.813526857Z" level=info msg="StopContainer for \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\" returns successfully" Mar 17 17:56:34.814983 containerd[1540]: time="2025-03-17T17:56:34.814193528Z" level=info msg="StopContainer for \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\" returns successfully" Mar 17 17:56:34.818062 containerd[1540]: time="2025-03-17T17:56:34.818043970Z" level=info msg="StopPodSandbox for \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\"" Mar 17 17:56:34.818130 containerd[1540]: time="2025-03-17T17:56:34.818070821Z" level=info msg="Container to stop \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 17:56:34.818130 containerd[1540]: time="2025-03-17T17:56:34.818093770Z" level=info msg="Container to stop \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 17:56:34.818130 containerd[1540]: time="2025-03-17T17:56:34.818098956Z" level=info msg="Container to stop \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 17:56:34.818389 containerd[1540]: time="2025-03-17T17:56:34.818329893Z" 
level=info msg="StopPodSandbox for \"550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268\"" Mar 17 17:56:34.818389 containerd[1540]: time="2025-03-17T17:56:34.818343217Z" level=info msg="Container to stop \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 17:56:34.820555 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268-shm.mount: Deactivated successfully. Mar 17 17:56:34.822182 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200-shm.mount: Deactivated successfully. Mar 17 17:56:34.827551 systemd[1]: cri-containerd-62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200.scope: Deactivated successfully. Mar 17 17:56:34.835936 systemd[1]: cri-containerd-550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268.scope: Deactivated successfully. Mar 17 17:56:34.853105 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200-rootfs.mount: Deactivated successfully. 
Mar 17 17:56:34.854289 containerd[1540]: time="2025-03-17T17:56:34.854243379Z" level=info msg="shim disconnected" id=62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200 namespace=k8s.io Mar 17 17:56:34.854491 containerd[1540]: time="2025-03-17T17:56:34.854402647Z" level=warning msg="cleaning up after shim disconnected" id=62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200 namespace=k8s.io Mar 17 17:56:34.854678 containerd[1540]: time="2025-03-17T17:56:34.854586781Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:56:34.871596 containerd[1540]: time="2025-03-17T17:56:34.871391571Z" level=info msg="shim disconnected" id=550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268 namespace=k8s.io Mar 17 17:56:34.872092 containerd[1540]: time="2025-03-17T17:56:34.872077664Z" level=warning msg="cleaning up after shim disconnected" id=550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268 namespace=k8s.io Mar 17 17:56:34.872159 containerd[1540]: time="2025-03-17T17:56:34.872133118Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:56:34.890999 containerd[1540]: time="2025-03-17T17:56:34.890975111Z" level=info msg="TearDown network for sandbox \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\" successfully" Mar 17 17:56:34.891421 containerd[1540]: time="2025-03-17T17:56:34.891130542Z" level=info msg="StopPodSandbox for \"62a2d7bf00bd582733981ddca695350a9b6eb1ac19723da52e91feaba6337200\" returns successfully" Mar 17 17:56:34.941993 kubelet[2850]: I0317 17:56:34.941887 2850 topology_manager.go:215] "Topology Admit Handler" podUID="18c3a204-9c2a-4cb4-a4fa-03324e848cd4" podNamespace="calico-system" podName="calico-node-r5scn" Mar 17 17:56:34.969158 kubelet[2850]: I0317 17:56:34.968805 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-xtables-lock\") pod 
\"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.969158 kubelet[2850]: I0317 17:56:34.968861 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-var-lib-calico\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.969158 kubelet[2850]: I0317 17:56:34.969053 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-lib-modules\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.969158 kubelet[2850]: I0317 17:56:34.969066 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-log-dir\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.969158 kubelet[2850]: I0317 17:56:34.969076 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-policysync\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.969158 kubelet[2850]: I0317 17:56:34.969101 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6a8252-f745-4277-ab3b-8e3e7085b489-tigera-ca-bundle\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.970130 kubelet[2850]: I0317 17:56:34.969757 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsld9\" 
(UniqueName: \"kubernetes.io/projected/9c6a8252-f745-4277-ab3b-8e3e7085b489-kube-api-access-hsld9\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.970130 kubelet[2850]: I0317 17:56:34.969777 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9c6a8252-f745-4277-ab3b-8e3e7085b489-node-certs\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.970130 kubelet[2850]: I0317 17:56:34.969793 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-net-dir\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.970130 kubelet[2850]: I0317 17:56:34.969802 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-flexvol-driver-host\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.970130 kubelet[2850]: I0317 17:56:34.969824 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-bin-dir\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.970130 kubelet[2850]: I0317 17:56:34.969836 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-var-run-calico\") pod \"9c6a8252-f745-4277-ab3b-8e3e7085b489\" (UID: \"9c6a8252-f745-4277-ab3b-8e3e7085b489\") " Mar 17 17:56:34.972078 kubelet[2850]: I0317 17:56:34.970437 
2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:34.972078 kubelet[2850]: I0317 17:56:34.971895 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:34.972078 kubelet[2850]: I0317 17:56:34.971921 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:34.972078 kubelet[2850]: I0317 17:56:34.971938 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:34.972078 kubelet[2850]: I0317 17:56:34.971948 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:34.972627 kubelet[2850]: I0317 17:56:34.971957 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-policysync" (OuterVolumeSpecName: "policysync") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:34.973723 kubelet[2850]: E0317 17:56:34.973701 2850 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="9c6a8252-f745-4277-ab3b-8e3e7085b489" containerName="flexvol-driver" Mar 17 17:56:34.973770 kubelet[2850]: E0317 17:56:34.973727 2850 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="9c6a8252-f745-4277-ab3b-8e3e7085b489" containerName="calico-node" Mar 17 17:56:34.973770 kubelet[2850]: E0317 17:56:34.973736 2850 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="9c6a8252-f745-4277-ab3b-8e3e7085b489" containerName="install-cni" Mar 17 17:56:34.980748 kubelet[2850]: I0317 17:56:34.980721 2850 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c6a8252-f745-4277-ab3b-8e3e7085b489" containerName="calico-node" Mar 17 17:56:35.019899 kubelet[2850]: I0317 17:56:35.019845 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" 
(UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:35.020106 kubelet[2850]: I0317 17:56:35.019961 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:35.020106 kubelet[2850]: I0317 17:56:35.019981 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 17:56:35.063359 containerd[1540]: time="2025-03-17T17:56:35.062524730Z" level=error msg="ExecSync for \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\" failed" error="failed to exec in container: container is in CONTAINER_EXITED state" Mar 17 17:56:35.075200 kubelet[2850]: I0317 17:56:35.073664 2850 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-net-dir\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.075200 kubelet[2850]: I0317 17:56:35.073688 2850 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-flexvol-driver-host\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.075200 kubelet[2850]: I0317 17:56:35.073695 2850 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: 
\"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-bin-dir\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.075200 kubelet[2850]: I0317 17:56:35.073702 2850 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-var-run-calico\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.075200 kubelet[2850]: I0317 17:56:35.073707 2850 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-xtables-lock\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.075200 kubelet[2850]: I0317 17:56:35.073711 2850 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-var-lib-calico\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.075200 kubelet[2850]: I0317 17:56:35.073717 2850 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-lib-modules\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.075200 kubelet[2850]: I0317 17:56:35.073722 2850 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-cni-log-dir\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.075576 kubelet[2850]: I0317 17:56:35.073727 2850 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9c6a8252-f745-4277-ab3b-8e3e7085b489-policysync\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.079674 kubelet[2850]: E0317 17:56:35.079633 2850 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" 
containerID="9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67" cmd=["/usr/bin/check-status","-r"] Mar 17 17:56:35.081634 kubelet[2850]: I0317 17:56:35.081015 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6a8252-f745-4277-ab3b-8e3e7085b489-kube-api-access-hsld9" (OuterVolumeSpecName: "kube-api-access-hsld9") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "kube-api-access-hsld9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 17:56:35.082465 containerd[1540]: time="2025-03-17T17:56:35.082387667Z" level=error msg="ExecSync for \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\" failed" error="failed to exec in container: container is in CONTAINER_EXITED state" Mar 17 17:56:35.082516 kubelet[2850]: E0317 17:56:35.082495 2850 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67" cmd=["/usr/bin/check-status","-r"] Mar 17 17:56:35.082703 containerd[1540]: time="2025-03-17T17:56:35.082683628Z" level=error msg="ExecSync for \"9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67\" failed" error="failed to exec in container: container is in CONTAINER_EXITED state" Mar 17 17:56:35.084617 kubelet[2850]: E0317 17:56:35.083087 2850 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="9ae6a56a95239f4c0a8a2229e2e3025402dd243a6d43704c7200da6cbba1fa67" cmd=["/usr/bin/check-status","-r"] Mar 17 17:56:35.087682 kubelet[2850]: I0317 17:56:35.087456 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9c6a8252-f745-4277-ab3b-8e3e7085b489-node-certs" (OuterVolumeSpecName: "node-certs") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 17:56:35.089677 kubelet[2850]: I0317 17:56:35.089176 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6a8252-f745-4277-ab3b-8e3e7085b489-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "9c6a8252-f745-4277-ab3b-8e3e7085b489" (UID: "9c6a8252-f745-4277-ab3b-8e3e7085b489"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 17:56:35.102607 systemd[1]: Created slice kubepods-besteffort-pod18c3a204_9c2a_4cb4_a4fa_03324e848cd4.slice - libcontainer container kubepods-besteffort-pod18c3a204_9c2a_4cb4_a4fa_03324e848cd4.slice. Mar 17 17:56:35.111278 kubelet[2850]: I0317 17:56:35.111218 2850 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Mar 17 17:56:35.113088 kubelet[2850]: I0317 17:56:35.113000 2850 scope.go:117] "RemoveContainer" containerID="8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43" Mar 17 17:56:35.115373 containerd[1540]: time="2025-03-17T17:56:35.114361321Z" level=info msg="RemoveContainer for \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\"" Mar 17 17:56:35.116589 containerd[1540]: time="2025-03-17T17:56:35.116568236Z" level=info msg="RemoveContainer for \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\" returns successfully" Mar 17 17:56:35.122927 kubelet[2850]: I0317 17:56:35.122908 2850 scope.go:117] "RemoveContainer" containerID="672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12" Mar 17 17:56:35.127387 containerd[1540]: time="2025-03-17T17:56:35.126496745Z" level=info 
msg="RemoveContainer for \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\"" Mar 17 17:56:35.127103 systemd[1]: Removed slice kubepods-besteffort-pod9c6a8252_f745_4277_ab3b_8e3e7085b489.slice - libcontainer container kubepods-besteffort-pod9c6a8252_f745_4277_ab3b_8e3e7085b489.slice. Mar 17 17:56:35.127205 systemd[1]: kubepods-besteffort-pod9c6a8252_f745_4277_ab3b_8e3e7085b489.slice: Consumed 1.305s CPU time. Mar 17 17:56:35.129315 containerd[1540]: time="2025-03-17T17:56:35.128876150Z" level=info msg="RemoveContainer for \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\" returns successfully" Mar 17 17:56:35.130160 kubelet[2850]: I0317 17:56:35.129985 2850 scope.go:117] "RemoveContainer" containerID="08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7" Mar 17 17:56:35.132660 containerd[1540]: time="2025-03-17T17:56:35.132555322Z" level=info msg="RemoveContainer for \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\"" Mar 17 17:56:35.135522 containerd[1540]: time="2025-03-17T17:56:35.135498854Z" level=info msg="RemoveContainer for \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\" returns successfully" Mar 17 17:56:35.137088 kubelet[2850]: I0317 17:56:35.136990 2850 scope.go:117] "RemoveContainer" containerID="8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43" Mar 17 17:56:35.137370 containerd[1540]: time="2025-03-17T17:56:35.137287716Z" level=error msg="ContainerStatus for \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\": not found" Mar 17 17:56:35.145726 kubelet[2850]: E0317 17:56:35.145546 2850 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container 
\"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\": not found" containerID="8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43" Mar 17 17:56:35.149108 kubelet[2850]: I0317 17:56:35.145591 2850 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43"} err="failed to get container status \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\": rpc error: code = NotFound desc = an error occurred when try to find container \"8871dd9541bd855dc0fec13435f355006156b0de9e38c52055306dd85b5f1a43\": not found" Mar 17 17:56:35.149108 kubelet[2850]: I0317 17:56:35.149033 2850 scope.go:117] "RemoveContainer" containerID="672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12" Mar 17 17:56:35.149581 containerd[1540]: time="2025-03-17T17:56:35.149315852Z" level=error msg="ContainerStatus for \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\": not found" Mar 17 17:56:35.149820 kubelet[2850]: E0317 17:56:35.149808 2850 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\": not found" containerID="672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12" Mar 17 17:56:35.150196 kubelet[2850]: I0317 17:56:35.150134 2850 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12"} err="failed to get container status \"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\": rpc error: code = NotFound desc = an error occurred when try to find container 
\"672a65a6f441ddde295bee70d8f2b61ceedc765456f42a6841c276c5d23beb12\": not found" Mar 17 17:56:35.150196 kubelet[2850]: I0317 17:56:35.150154 2850 scope.go:117] "RemoveContainer" containerID="08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7" Mar 17 17:56:35.150362 containerd[1540]: time="2025-03-17T17:56:35.150279388Z" level=error msg="ContainerStatus for \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\": not found" Mar 17 17:56:35.150427 kubelet[2850]: E0317 17:56:35.150418 2850 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\": not found" containerID="08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7" Mar 17 17:56:35.150656 kubelet[2850]: I0317 17:56:35.150545 2850 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7"} err="failed to get container status \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\": rpc error: code = NotFound desc = an error occurred when try to find container \"08f2d2e068b47c6f21c42c2989a73959f07c9912494c93f80e6f1f2e2d927cb7\": not found" Mar 17 17:56:35.174204 kubelet[2850]: I0317 17:56:35.173974 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-tigera-ca-bundle\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.174204 kubelet[2850]: I0317 17:56:35.174002 2850 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-lib-modules\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.174204 kubelet[2850]: I0317 17:56:35.174017 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-policysync\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.174204 kubelet[2850]: I0317 17:56:35.174028 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv4kf\" (UniqueName: \"kubernetes.io/projected/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-kube-api-access-lv4kf\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.174204 kubelet[2850]: I0317 17:56:35.174040 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-var-run-calico\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.181060 kubelet[2850]: I0317 17:56:35.174049 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-var-lib-calico\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.181060 kubelet[2850]: I0317 17:56:35.174060 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-xtables-lock\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.181060 kubelet[2850]: I0317 17:56:35.174071 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-node-certs\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.181060 kubelet[2850]: I0317 17:56:35.174080 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-cni-bin-dir\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.181060 kubelet[2850]: I0317 17:56:35.174089 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-cni-log-dir\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.181178 kubelet[2850]: I0317 17:56:35.174098 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-cni-net-dir\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.181178 kubelet[2850]: I0317 17:56:35.174108 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/18c3a204-9c2a-4cb4-a4fa-03324e848cd4-flexvol-driver-host\") pod \"calico-node-r5scn\" (UID: \"18c3a204-9c2a-4cb4-a4fa-03324e848cd4\") " pod="calico-system/calico-node-r5scn" Mar 17 17:56:35.181178 kubelet[2850]: I0317 17:56:35.174121 2850 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6a8252-f745-4277-ab3b-8e3e7085b489-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.181178 kubelet[2850]: I0317 17:56:35.174127 2850 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-hsld9\" (UniqueName: \"kubernetes.io/projected/9c6a8252-f745-4277-ab3b-8e3e7085b489-kube-api-access-hsld9\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.181178 kubelet[2850]: I0317 17:56:35.174133 2850 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9c6a8252-f745-4277-ab3b-8e3e7085b489-node-certs\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.319517 kubelet[2850]: I0317 17:56:35.319444 2850 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c6a8252-f745-4277-ab3b-8e3e7085b489" path="/var/lib/kubelet/pods/9c6a8252-f745-4277-ab3b-8e3e7085b489/volumes" Mar 17 17:56:35.348770 systemd-networkd[1435]: cali7f6e63fe175: Link DOWN Mar 17 17:56:35.348775 systemd-networkd[1435]: cali7f6e63fe175: Lost carrier Mar 17 17:56:35.406743 containerd[1540]: time="2025-03-17T17:56:35.406718704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r5scn,Uid:18c3a204-9c2a-4cb4-a4fa-03324e848cd4,Namespace:calico-system,Attempt:0,}" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.346 [INFO][5998] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.347 [INFO][5998] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" iface="eth0" netns="/var/run/netns/cni-d7fa0e9b-67de-67cd-7e02-5592e37734c1" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.347 [INFO][5998] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" iface="eth0" netns="/var/run/netns/cni-d7fa0e9b-67de-67cd-7e02-5592e37734c1" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.356 [INFO][5998] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" after=9.164766ms iface="eth0" netns="/var/run/netns/cni-d7fa0e9b-67de-67cd-7e02-5592e37734c1" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.356 [INFO][5998] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.356 [INFO][5998] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.394 [INFO][6012] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" HandleID="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Workload="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.394 [INFO][6012] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.394 [INFO][6012] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.425 [INFO][6012] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" HandleID="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Workload="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.425 [INFO][6012] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" HandleID="k8s-pod-network.550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Workload="localhost-k8s-calico--kube--controllers--7c4bd45cb8--lpjjh-eth0" Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.426 [INFO][6012] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:56:35.434043 containerd[1540]: 2025-03-17 17:56:35.431 [INFO][5998] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268" Mar 17 17:56:35.436254 containerd[1540]: time="2025-03-17T17:56:35.436113724Z" level=info msg="TearDown network for sandbox \"550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268\" successfully" Mar 17 17:56:35.436328 containerd[1540]: time="2025-03-17T17:56:35.436254145Z" level=info msg="StopPodSandbox for \"550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268\" returns successfully" Mar 17 17:56:35.443546 containerd[1540]: time="2025-03-17T17:56:35.443408518Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:56:35.443546 containerd[1540]: time="2025-03-17T17:56:35.443444191Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:56:35.443546 containerd[1540]: time="2025-03-17T17:56:35.443467672Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:35.443546 containerd[1540]: time="2025-03-17T17:56:35.443516997Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:56:35.459665 systemd[1]: Started cri-containerd-ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb.scope - libcontainer container ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb. Mar 17 17:56:35.477089 kubelet[2850]: I0317 17:56:35.476205 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7j64\" (UniqueName: \"kubernetes.io/projected/4761939f-21ce-4484-88c7-08bcb4f65c5c-kube-api-access-t7j64\") pod \"4761939f-21ce-4484-88c7-08bcb4f65c5c\" (UID: \"4761939f-21ce-4484-88c7-08bcb4f65c5c\") " Mar 17 17:56:35.477089 kubelet[2850]: I0317 17:56:35.476295 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4761939f-21ce-4484-88c7-08bcb4f65c5c-tigera-ca-bundle\") pod \"4761939f-21ce-4484-88c7-08bcb4f65c5c\" (UID: \"4761939f-21ce-4484-88c7-08bcb4f65c5c\") " Mar 17 17:56:35.481190 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-550f9b2f7e16841231cd9e8942b424b39990c5987425c92a10129af2e9bc5268-rootfs.mount: Deactivated successfully. Mar 17 17:56:35.482350 systemd[1]: run-netns-cni\x2dd7fa0e9b\x2d67de\x2d67cd\x2d7e02\x2d5592e37734c1.mount: Deactivated successfully. Mar 17 17:56:35.482484 systemd[1]: var-lib-kubelet-pods-9c6a8252\x2df745\x2d4277\x2dab3b\x2d8e3e7085b489-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. 
Mar 17 17:56:35.482688 systemd[1]: var-lib-kubelet-pods-9c6a8252\x2df745\x2d4277\x2dab3b\x2d8e3e7085b489-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dhsld9.mount: Deactivated successfully. Mar 17 17:56:35.482737 systemd[1]: var-lib-kubelet-pods-9c6a8252\x2df745\x2d4277\x2dab3b\x2d8e3e7085b489-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Mar 17 17:56:35.485009 kubelet[2850]: I0317 17:56:35.484894 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4761939f-21ce-4484-88c7-08bcb4f65c5c-kube-api-access-t7j64" (OuterVolumeSpecName: "kube-api-access-t7j64") pod "4761939f-21ce-4484-88c7-08bcb4f65c5c" (UID: "4761939f-21ce-4484-88c7-08bcb4f65c5c"). InnerVolumeSpecName "kube-api-access-t7j64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 17:56:35.486007 kubelet[2850]: I0317 17:56:35.485926 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4761939f-21ce-4484-88c7-08bcb4f65c5c-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "4761939f-21ce-4484-88c7-08bcb4f65c5c" (UID: "4761939f-21ce-4484-88c7-08bcb4f65c5c"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 17:56:35.487286 containerd[1540]: time="2025-03-17T17:56:35.487245752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r5scn,Uid:18c3a204-9c2a-4cb4-a4fa-03324e848cd4,Namespace:calico-system,Attempt:0,} returns sandbox id \"ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb\"" Mar 17 17:56:35.488162 systemd[1]: var-lib-kubelet-pods-4761939f\x2d21ce\x2d4484\x2d88c7\x2d08bcb4f65c5c-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. 
Mar 17 17:56:35.488232 systemd[1]: var-lib-kubelet-pods-4761939f\x2d21ce\x2d4484\x2d88c7\x2d08bcb4f65c5c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dt7j64.mount: Deactivated successfully. Mar 17 17:56:35.490475 containerd[1540]: time="2025-03-17T17:56:35.490358106Z" level=info msg="CreateContainer within sandbox \"ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 17:56:35.496936 containerd[1540]: time="2025-03-17T17:56:35.496915544Z" level=info msg="CreateContainer within sandbox \"ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0\"" Mar 17 17:56:35.498209 containerd[1540]: time="2025-03-17T17:56:35.498098665Z" level=info msg="StartContainer for \"e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0\"" Mar 17 17:56:35.518661 systemd[1]: Started cri-containerd-e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0.scope - libcontainer container e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0. 
Mar 17 17:56:35.538976 containerd[1540]: time="2025-03-17T17:56:35.538675502Z" level=info msg="StartContainer for \"e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0\" returns successfully" Mar 17 17:56:35.577446 kubelet[2850]: I0317 17:56:35.577315 2850 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4761939f-21ce-4484-88c7-08bcb4f65c5c-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.577446 kubelet[2850]: I0317 17:56:35.577341 2850 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-t7j64\" (UniqueName: \"kubernetes.io/projected/4761939f-21ce-4484-88c7-08bcb4f65c5c-kube-api-access-t7j64\") on node \"localhost\" DevicePath \"\"" Mar 17 17:56:35.629930 systemd[1]: cri-containerd-e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0.scope: Deactivated successfully. Mar 17 17:56:35.649193 systemd[1]: cri-containerd-0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b.scope: Deactivated successfully. 
Mar 17 17:56:35.658481 containerd[1540]: time="2025-03-17T17:56:35.658429353Z" level=info msg="shim disconnected" id=e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0 namespace=k8s.io Mar 17 17:56:35.658481 containerd[1540]: time="2025-03-17T17:56:35.658474377Z" level=warning msg="cleaning up after shim disconnected" id=e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0 namespace=k8s.io Mar 17 17:56:35.658481 containerd[1540]: time="2025-03-17T17:56:35.658479909Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:56:35.672670 containerd[1540]: time="2025-03-17T17:56:35.672622325Z" level=warning msg="cleanup warnings time=\"2025-03-17T17:56:35Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Mar 17 17:56:35.675555 containerd[1540]: time="2025-03-17T17:56:35.675512856Z" level=info msg="shim disconnected" id=0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b namespace=k8s.io Mar 17 17:56:35.675555 containerd[1540]: time="2025-03-17T17:56:35.675547642Z" level=warning msg="cleaning up after shim disconnected" id=0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b namespace=k8s.io Mar 17 17:56:35.675555 containerd[1540]: time="2025-03-17T17:56:35.675553667Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:56:35.688401 containerd[1540]: time="2025-03-17T17:56:35.688353584Z" level=info msg="StopContainer for \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\" returns successfully" Mar 17 17:56:35.688766 containerd[1540]: time="2025-03-17T17:56:35.688688752Z" level=info msg="StopPodSandbox for \"d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b\"" Mar 17 17:56:35.688766 containerd[1540]: time="2025-03-17T17:56:35.688714437Z" level=info msg="Container to stop \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\" must be 
in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 17 17:56:35.692424 systemd[1]: cri-containerd-d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b.scope: Deactivated successfully.
Mar 17 17:56:35.702729 containerd[1540]: time="2025-03-17T17:56:35.702495412Z" level=info msg="shim disconnected" id=d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b namespace=k8s.io
Mar 17 17:56:35.702729 containerd[1540]: time="2025-03-17T17:56:35.702523586Z" level=warning msg="cleaning up after shim disconnected" id=d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b namespace=k8s.io
Mar 17 17:56:35.702729 containerd[1540]: time="2025-03-17T17:56:35.702528966Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 17:56:35.712607 containerd[1540]: time="2025-03-17T17:56:35.712585129Z" level=info msg="TearDown network for sandbox \"d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b\" successfully"
Mar 17 17:56:35.712607 containerd[1540]: time="2025-03-17T17:56:35.712602136Z" level=info msg="StopPodSandbox for \"d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b\" returns successfully"
Mar 17 17:56:35.779952 kubelet[2850]: I0317 17:56:35.779929 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6b6ca053-adf6-44c0-bdfd-693c6b378420-typha-certs\") pod \"6b6ca053-adf6-44c0-bdfd-693c6b378420\" (UID: \"6b6ca053-adf6-44c0-bdfd-693c6b378420\") "
Mar 17 17:56:35.780157 kubelet[2850]: I0317 17:56:35.780080 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b6ca053-adf6-44c0-bdfd-693c6b378420-tigera-ca-bundle\") pod \"6b6ca053-adf6-44c0-bdfd-693c6b378420\" (UID: \"6b6ca053-adf6-44c0-bdfd-693c6b378420\") "
Mar 17 17:56:35.780157 kubelet[2850]: I0317 17:56:35.780100 2850 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rxlv\" (UniqueName: \"kubernetes.io/projected/6b6ca053-adf6-44c0-bdfd-693c6b378420-kube-api-access-7rxlv\") pod \"6b6ca053-adf6-44c0-bdfd-693c6b378420\" (UID: \"6b6ca053-adf6-44c0-bdfd-693c6b378420\") "
Mar 17 17:56:35.785150 kubelet[2850]: I0317 17:56:35.785124 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6ca053-adf6-44c0-bdfd-693c6b378420-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "6b6ca053-adf6-44c0-bdfd-693c6b378420" (UID: "6b6ca053-adf6-44c0-bdfd-693c6b378420"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 17:56:35.785335 kubelet[2850]: I0317 17:56:35.785325 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b6ca053-adf6-44c0-bdfd-693c6b378420-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "6b6ca053-adf6-44c0-bdfd-693c6b378420" (UID: "6b6ca053-adf6-44c0-bdfd-693c6b378420"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 17:56:35.786208 kubelet[2850]: I0317 17:56:35.786191 2850 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6ca053-adf6-44c0-bdfd-693c6b378420-kube-api-access-7rxlv" (OuterVolumeSpecName: "kube-api-access-7rxlv") pod "6b6ca053-adf6-44c0-bdfd-693c6b378420" (UID: "6b6ca053-adf6-44c0-bdfd-693c6b378420"). InnerVolumeSpecName "kube-api-access-7rxlv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 17:56:35.818636 kubelet[2850]: I0317 17:56:35.818608 2850 topology_manager.go:215] "Topology Admit Handler" podUID="44f8584e-117b-4f5b-918e-d374f3dd0976" podNamespace="calico-system" podName="calico-typha-9968985f7-9nl7x"
Mar 17 17:56:35.818737 kubelet[2850]: E0317 17:56:35.818682 2850 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" containerName="calico-kube-controllers"
Mar 17 17:56:35.818737 kubelet[2850]: E0317 17:56:35.818694 2850 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6b6ca053-adf6-44c0-bdfd-693c6b378420" containerName="calico-typha"
Mar 17 17:56:35.818789 kubelet[2850]: I0317 17:56:35.818740 2850 memory_manager.go:354] "RemoveStaleState removing state" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" containerName="calico-kube-controllers"
Mar 17 17:56:35.818789 kubelet[2850]: I0317 17:56:35.818747 2850 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6ca053-adf6-44c0-bdfd-693c6b378420" containerName="calico-typha"
Mar 17 17:56:35.830260 systemd[1]: Created slice kubepods-besteffort-pod44f8584e_117b_4f5b_918e_d374f3dd0976.slice - libcontainer container kubepods-besteffort-pod44f8584e_117b_4f5b_918e_d374f3dd0976.slice.
Mar 17 17:56:35.880544 kubelet[2850]: I0317 17:56:35.880453 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f8584e-117b-4f5b-918e-d374f3dd0976-tigera-ca-bundle\") pod \"calico-typha-9968985f7-9nl7x\" (UID: \"44f8584e-117b-4f5b-918e-d374f3dd0976\") " pod="calico-system/calico-typha-9968985f7-9nl7x"
Mar 17 17:56:35.880544 kubelet[2850]: I0317 17:56:35.880481 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/44f8584e-117b-4f5b-918e-d374f3dd0976-typha-certs\") pod \"calico-typha-9968985f7-9nl7x\" (UID: \"44f8584e-117b-4f5b-918e-d374f3dd0976\") " pod="calico-system/calico-typha-9968985f7-9nl7x"
Mar 17 17:56:35.880544 kubelet[2850]: I0317 17:56:35.880495 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvzvb\" (UniqueName: \"kubernetes.io/projected/44f8584e-117b-4f5b-918e-d374f3dd0976-kube-api-access-pvzvb\") pod \"calico-typha-9968985f7-9nl7x\" (UID: \"44f8584e-117b-4f5b-918e-d374f3dd0976\") " pod="calico-system/calico-typha-9968985f7-9nl7x"
Mar 17 17:56:35.880544 kubelet[2850]: I0317 17:56:35.880512 2850 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6b6ca053-adf6-44c0-bdfd-693c6b378420-typha-certs\") on node \"localhost\" DevicePath \"\""
Mar 17 17:56:35.880544 kubelet[2850]: I0317 17:56:35.880519 2850 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b6ca053-adf6-44c0-bdfd-693c6b378420-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\""
Mar 17 17:56:35.880544 kubelet[2850]: I0317 17:56:35.880526 2850 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-7rxlv\" (UniqueName: \"kubernetes.io/projected/6b6ca053-adf6-44c0-bdfd-693c6b378420-kube-api-access-7rxlv\") on node \"localhost\" DevicePath \"\""
Mar 17 17:56:36.116969 kubelet[2850]: I0317 17:56:36.116888 2850 scope.go:117] "RemoveContainer" containerID="0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b"
Mar 17 17:56:36.118655 containerd[1540]: time="2025-03-17T17:56:36.118464697Z" level=info msg="RemoveContainer for \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\""
Mar 17 17:56:36.122634 containerd[1540]: time="2025-03-17T17:56:36.122066684Z" level=info msg="RemoveContainer for \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\" returns successfully"
Mar 17 17:56:36.122736 kubelet[2850]: I0317 17:56:36.122676 2850 scope.go:117] "RemoveContainer" containerID="0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b"
Mar 17 17:56:36.123364 containerd[1540]: time="2025-03-17T17:56:36.123314352Z" level=error msg="ContainerStatus for \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\": not found"
Mar 17 17:56:36.123486 kubelet[2850]: E0317 17:56:36.123472 2850 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\": not found" containerID="0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b"
Mar 17 17:56:36.123554 kubelet[2850]: I0317 17:56:36.123539 2850 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b"} err="failed to get container status \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\": rpc error: code = NotFound desc = an error occurred when try to find container \"0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b\": not found"
Mar 17 17:56:36.124875 systemd[1]: Removed slice kubepods-besteffort-pod6b6ca053_adf6_44c0_bdfd_693c6b378420.slice - libcontainer container kubepods-besteffort-pod6b6ca053_adf6_44c0_bdfd_693c6b378420.slice.
Mar 17 17:56:36.142333 containerd[1540]: time="2025-03-17T17:56:36.142138775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9968985f7-9nl7x,Uid:44f8584e-117b-4f5b-918e-d374f3dd0976,Namespace:calico-system,Attempt:0,}"
Mar 17 17:56:36.150909 systemd[1]: Removed slice kubepods-besteffort-pod4761939f_21ce_4484_88c7_08bcb4f65c5c.slice - libcontainer container kubepods-besteffort-pod4761939f_21ce_4484_88c7_08bcb4f65c5c.slice.
Mar 17 17:56:36.156992 containerd[1540]: time="2025-03-17T17:56:36.150905684Z" level=info msg="CreateContainer within sandbox \"ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 17 17:56:36.173982 containerd[1540]: time="2025-03-17T17:56:36.173059497Z" level=info msg="CreateContainer within sandbox \"ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7\""
Mar 17 17:56:36.174741 containerd[1540]: time="2025-03-17T17:56:36.174641035Z" level=info msg="StartContainer for \"6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7\""
Mar 17 17:56:36.180721 containerd[1540]: time="2025-03-17T17:56:36.180548090Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:56:36.180808 containerd[1540]: time="2025-03-17T17:56:36.180743080Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:56:36.180808 containerd[1540]: time="2025-03-17T17:56:36.180760222Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:56:36.180870 containerd[1540]: time="2025-03-17T17:56:36.180808847Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:56:36.214675 systemd[1]: Started cri-containerd-9748cacb4bbe836ce203699f142564d7a2760599474bb2002f7192358e01ecf0.scope - libcontainer container 9748cacb4bbe836ce203699f142564d7a2760599474bb2002f7192358e01ecf0.
Mar 17 17:56:36.221629 systemd[1]: Started cri-containerd-6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7.scope - libcontainer container 6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7.
Mar 17 17:56:36.290750 containerd[1540]: time="2025-03-17T17:56:36.290682185Z" level=info msg="StartContainer for \"6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7\" returns successfully"
Mar 17 17:56:36.301439 containerd[1540]: time="2025-03-17T17:56:36.301413636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9968985f7-9nl7x,Uid:44f8584e-117b-4f5b-918e-d374f3dd0976,Namespace:calico-system,Attempt:0,} returns sandbox id \"9748cacb4bbe836ce203699f142564d7a2760599474bb2002f7192358e01ecf0\""
Mar 17 17:56:36.306522 containerd[1540]: time="2025-03-17T17:56:36.306499136Z" level=info msg="CreateContainer within sandbox \"9748cacb4bbe836ce203699f142564d7a2760599474bb2002f7192358e01ecf0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 17 17:56:36.311966 containerd[1540]: time="2025-03-17T17:56:36.311947980Z" level=info msg="CreateContainer within sandbox \"9748cacb4bbe836ce203699f142564d7a2760599474bb2002f7192358e01ecf0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9381a5f685f0ac4ec4f3e0f5bc85e2eeb663df48718b421b15ffb358f8ab9f11\""
Mar 17 17:56:36.312551 containerd[1540]: time="2025-03-17T17:56:36.312224750Z" level=info msg="StartContainer for \"9381a5f685f0ac4ec4f3e0f5bc85e2eeb663df48718b421b15ffb358f8ab9f11\""
Mar 17 17:56:36.332794 systemd[1]: Started cri-containerd-9381a5f685f0ac4ec4f3e0f5bc85e2eeb663df48718b421b15ffb358f8ab9f11.scope - libcontainer container 9381a5f685f0ac4ec4f3e0f5bc85e2eeb663df48718b421b15ffb358f8ab9f11.
Mar 17 17:56:36.371338 containerd[1540]: time="2025-03-17T17:56:36.371254771Z" level=info msg="StartContainer for \"9381a5f685f0ac4ec4f3e0f5bc85e2eeb663df48718b421b15ffb358f8ab9f11\" returns successfully"
Mar 17 17:56:36.479597 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e84f927a849dc690bf25d8287312dde6145852125df315625f1a2bf496b669d0-rootfs.mount: Deactivated successfully.
Mar 17 17:56:36.479703 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0a93fb85df1f047474597b6f3aa1e65b6000f23352ff1e8e9e00959773b22a8b-rootfs.mount: Deactivated successfully.
Mar 17 17:56:36.479740 systemd[1]: var-lib-kubelet-pods-6b6ca053\x2dadf6\x2d44c0\x2dbdfd\x2d693c6b378420-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully.
Mar 17 17:56:36.479776 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b-rootfs.mount: Deactivated successfully.
Mar 17 17:56:36.479808 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d5234b7313f495dde6f98b88c9f7e58b63a4975c3f4cdd573f5eba8224e9bb3b-shm.mount: Deactivated successfully.
Mar 17 17:56:36.479843 systemd[1]: var-lib-kubelet-pods-6b6ca053\x2dadf6\x2d44c0\x2dbdfd\x2d693c6b378420-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d7rxlv.mount: Deactivated successfully.
Mar 17 17:56:36.479878 systemd[1]: var-lib-kubelet-pods-6b6ca053\x2dadf6\x2d44c0\x2dbdfd\x2d693c6b378420-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully.
Mar 17 17:56:37.148757 kubelet[2850]: I0317 17:56:37.148713 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-9968985f7-9nl7x" podStartSLOduration=3.148695415 podStartE2EDuration="3.148695415s" podCreationTimestamp="2025-03-17 17:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:56:37.138244113 +0000 UTC m=+65.979383387" watchObservedRunningTime="2025-03-17 17:56:37.148695415 +0000 UTC m=+65.989834683"
Mar 17 17:56:37.313737 kubelet[2850]: I0317 17:56:37.313644 2850 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4761939f-21ce-4484-88c7-08bcb4f65c5c" path="/var/lib/kubelet/pods/4761939f-21ce-4484-88c7-08bcb4f65c5c/volumes"
Mar 17 17:56:37.314691 kubelet[2850]: I0317 17:56:37.314641 2850 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6ca053-adf6-44c0-bdfd-693c6b378420" path="/var/lib/kubelet/pods/6b6ca053-adf6-44c0-bdfd-693c6b378420/volumes"
Mar 17 17:56:38.192486 systemd[1]: cri-containerd-6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7.scope: Deactivated successfully.
Mar 17 17:56:38.300983 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7-rootfs.mount: Deactivated successfully.
Mar 17 17:56:38.315049 containerd[1540]: time="2025-03-17T17:56:38.311381467Z" level=info msg="shim disconnected" id=6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7 namespace=k8s.io
Mar 17 17:56:38.315049 containerd[1540]: time="2025-03-17T17:56:38.315046646Z" level=warning msg="cleaning up after shim disconnected" id=6f2fcb755fe47ac7eb5925304e5e1494904eb0ff4d00a64aa8fd0ba71d43b5d7 namespace=k8s.io
Mar 17 17:56:38.315049 containerd[1540]: time="2025-03-17T17:56:38.315053776Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 17:56:38.330063 containerd[1540]: time="2025-03-17T17:56:38.329648625Z" level=warning msg="cleanup warnings time=\"2025-03-17T17:56:38Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Mar 17 17:56:39.609924 containerd[1540]: time="2025-03-17T17:56:39.609779368Z" level=info msg="CreateContainer within sandbox \"ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 17 17:56:39.629354 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3536228606.mount: Deactivated successfully.
Mar 17 17:56:39.632233 containerd[1540]: time="2025-03-17T17:56:39.632210461Z" level=info msg="CreateContainer within sandbox \"ef7c722f3a73419c42f85358278ddf94b2fb21b6b480733089a6b7f6cdbdcebb\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ec6d8f09e472dfaf8c3b9662dd55304920a500f0011beea0235b7614aa450325\""
Mar 17 17:56:39.632948 containerd[1540]: time="2025-03-17T17:56:39.632719152Z" level=info msg="StartContainer for \"ec6d8f09e472dfaf8c3b9662dd55304920a500f0011beea0235b7614aa450325\""
Mar 17 17:56:39.654644 systemd[1]: Started cri-containerd-ec6d8f09e472dfaf8c3b9662dd55304920a500f0011beea0235b7614aa450325.scope - libcontainer container ec6d8f09e472dfaf8c3b9662dd55304920a500f0011beea0235b7614aa450325.
Mar 17 17:56:39.699877 containerd[1540]: time="2025-03-17T17:56:39.699852351Z" level=info msg="StartContainer for \"ec6d8f09e472dfaf8c3b9662dd55304920a500f0011beea0235b7614aa450325\" returns successfully"
Mar 17 17:56:39.938709 kubelet[2850]: I0317 17:56:39.938652 2850 topology_manager.go:215] "Topology Admit Handler" podUID="b7ac741c-01d0-4518-8804-4278bbebf46f" podNamespace="calico-system" podName="calico-kube-controllers-7444d64bbc-9ksk2"
Mar 17 17:56:40.005283 kubelet[2850]: I0317 17:56:40.005187 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdrs\" (UniqueName: \"kubernetes.io/projected/b7ac741c-01d0-4518-8804-4278bbebf46f-kube-api-access-9bdrs\") pod \"calico-kube-controllers-7444d64bbc-9ksk2\" (UID: \"b7ac741c-01d0-4518-8804-4278bbebf46f\") " pod="calico-system/calico-kube-controllers-7444d64bbc-9ksk2"
Mar 17 17:56:40.005283 kubelet[2850]: I0317 17:56:40.005221 2850 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7ac741c-01d0-4518-8804-4278bbebf46f-tigera-ca-bundle\") pod \"calico-kube-controllers-7444d64bbc-9ksk2\" (UID: \"b7ac741c-01d0-4518-8804-4278bbebf46f\") " pod="calico-system/calico-kube-controllers-7444d64bbc-9ksk2"
Mar 17 17:56:40.050078 systemd[1]: Created slice kubepods-besteffort-podb7ac741c_01d0_4518_8804_4278bbebf46f.slice - libcontainer container kubepods-besteffort-podb7ac741c_01d0_4518_8804_4278bbebf46f.slice.
Mar 17 17:56:40.369720 systemd[1]: Started sshd@7-139.178.70.104:22-147.75.109.163:35326.service - OpenSSH per-connection server daemon (147.75.109.163:35326).
Mar 17 17:56:40.385241 containerd[1540]: time="2025-03-17T17:56:40.385134182Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7444d64bbc-9ksk2,Uid:b7ac741c-01d0-4518-8804-4278bbebf46f,Namespace:calico-system,Attempt:0,}"
Mar 17 17:56:40.524837 systemd-networkd[1435]: cali812b20a41f2: Link UP
Mar 17 17:56:40.525797 systemd-networkd[1435]: cali812b20a41f2: Gained carrier
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.462 [INFO][6400] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0 calico-kube-controllers-7444d64bbc- calico-system b7ac741c-01d0-4518-8804-4278bbebf46f 1119 0 2025-03-17 17:56:36 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7444d64bbc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7444d64bbc-9ksk2 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali812b20a41f2 [] []}} ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Namespace="calico-system" Pod="calico-kube-controllers-7444d64bbc-9ksk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.463 [INFO][6400] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Namespace="calico-system" Pod="calico-kube-controllers-7444d64bbc-9ksk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.490 [INFO][6434] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" HandleID="k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Workload="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.498 [INFO][6434] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" HandleID="k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Workload="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000384ae0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7444d64bbc-9ksk2", "timestamp":"2025-03-17 17:56:40.490698698 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.498 [INFO][6434] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.498 [INFO][6434] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.498 [INFO][6434] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.499 [INFO][6434] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.501 [INFO][6434] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.504 [INFO][6434] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.505 [INFO][6434] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.507 [INFO][6434] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.507 [INFO][6434] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.508 [INFO][6434] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.512 [INFO][6434] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.518 [INFO][6434] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.518 [INFO][6434] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" host="localhost"
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.518 [INFO][6434] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 17 17:56:40.538666 containerd[1540]: 2025-03-17 17:56:40.518 [INFO][6434] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" HandleID="k8s-pod-network.fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Workload="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0"
Mar 17 17:56:40.547112 containerd[1540]: 2025-03-17 17:56:40.521 [INFO][6400] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Namespace="calico-system" Pod="calico-kube-controllers-7444d64bbc-9ksk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0", GenerateName:"calico-kube-controllers-7444d64bbc-", Namespace:"calico-system", SelfLink:"", UID:"b7ac741c-01d0-4518-8804-4278bbebf46f", ResourceVersion:"1119", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 56, 36, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7444d64bbc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7444d64bbc-9ksk2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali812b20a41f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 17 17:56:40.547112 containerd[1540]: 2025-03-17 17:56:40.521 [INFO][6400] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.135/32] ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Namespace="calico-system" Pod="calico-kube-controllers-7444d64bbc-9ksk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0"
Mar 17 17:56:40.547112 containerd[1540]: 2025-03-17 17:56:40.522 [INFO][6400] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali812b20a41f2 ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Namespace="calico-system" Pod="calico-kube-controllers-7444d64bbc-9ksk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0"
Mar 17 17:56:40.547112 containerd[1540]: 2025-03-17 17:56:40.525 [INFO][6400] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Namespace="calico-system" Pod="calico-kube-controllers-7444d64bbc-9ksk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0"
Mar 17 17:56:40.547112 containerd[1540]: 2025-03-17 17:56:40.526 [INFO][6400] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Namespace="calico-system" Pod="calico-kube-controllers-7444d64bbc-9ksk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0", GenerateName:"calico-kube-controllers-7444d64bbc-", Namespace:"calico-system", SelfLink:"", UID:"b7ac741c-01d0-4518-8804-4278bbebf46f", ResourceVersion:"1119", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 56, 36, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7444d64bbc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e", Pod:"calico-kube-controllers-7444d64bbc-9ksk2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali812b20a41f2", MAC:"0a:3e:4f:b7:f1:83", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 17 17:56:40.547112 containerd[1540]: 2025-03-17 17:56:40.535 [INFO][6400] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e" Namespace="calico-system" Pod="calico-kube-controllers-7444d64bbc-9ksk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7444d64bbc--9ksk2-eth0"
Mar 17 17:56:40.547309 sshd[6399]: Accepted publickey for core from 147.75.109.163 port 35326 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:56:40.548307 sshd-session[6399]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:56:40.554489 systemd-logind[1520]: New session 10 of user core.
Mar 17 17:56:40.558740 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 17 17:56:40.560748 kubelet[2850]: I0317 17:56:40.556090 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-r5scn" podStartSLOduration=6.5492908530000005 podStartE2EDuration="6.549290853s" podCreationTimestamp="2025-03-17 17:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:56:40.395844097 +0000 UTC m=+69.236983360" watchObservedRunningTime="2025-03-17 17:56:40.549290853 +0000 UTC m=+69.390430124"
Mar 17 17:56:40.609318 containerd[1540]: time="2025-03-17T17:56:40.608030572Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:56:40.609318 containerd[1540]: time="2025-03-17T17:56:40.608064358Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:56:40.609318 containerd[1540]: time="2025-03-17T17:56:40.608106912Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:56:40.609318 containerd[1540]: time="2025-03-17T17:56:40.608153720Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:56:40.631704 systemd[1]: Started cri-containerd-fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e.scope - libcontainer container fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e.
Mar 17 17:56:40.642357 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Mar 17 17:56:40.668661 containerd[1540]: time="2025-03-17T17:56:40.668634072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7444d64bbc-9ksk2,Uid:b7ac741c-01d0-4518-8804-4278bbebf46f,Namespace:calico-system,Attempt:0,} returns sandbox id \"fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e\""
Mar 17 17:56:40.767627 containerd[1540]: time="2025-03-17T17:56:40.767171752Z" level=info msg="CreateContainer within sandbox \"fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 17 17:56:40.778029 containerd[1540]: time="2025-03-17T17:56:40.778009014Z" level=info msg="CreateContainer within sandbox \"fddec8ff83d65aff2dd526a761f8fda3add8ba3e645638076adedca532b6cf4e\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"c244d67b04aba0699f611278fcbe64d65f794055eb6d03045bad64374ecfa226\""
Mar 17 17:56:40.779052 containerd[1540]: time="2025-03-17T17:56:40.779030471Z" level=info msg="StartContainer for \"c244d67b04aba0699f611278fcbe64d65f794055eb6d03045bad64374ecfa226\""
Mar 17 17:56:40.800650 systemd[1]: Started cri-containerd-c244d67b04aba0699f611278fcbe64d65f794055eb6d03045bad64374ecfa226.scope - libcontainer container c244d67b04aba0699f611278fcbe64d65f794055eb6d03045bad64374ecfa226.
Mar 17 17:56:40.837712 containerd[1540]: time="2025-03-17T17:56:40.837687965Z" level=info msg="StartContainer for \"c244d67b04aba0699f611278fcbe64d65f794055eb6d03045bad64374ecfa226\" returns successfully" Mar 17 17:56:41.272506 sshd[6455]: Connection closed by 147.75.109.163 port 35326 Mar 17 17:56:41.272445 sshd-session[6399]: pam_unix(sshd:session): session closed for user core Mar 17 17:56:41.276615 systemd[1]: sshd@7-139.178.70.104:22-147.75.109.163:35326.service: Deactivated successfully. Mar 17 17:56:41.278998 systemd[1]: session-10.scope: Deactivated successfully. Mar 17 17:56:41.280032 systemd-logind[1520]: Session 10 logged out. Waiting for processes to exit. Mar 17 17:56:41.280612 systemd-logind[1520]: Removed session 10. Mar 17 17:56:41.628287 systemd[1]: run-containerd-runc-k8s.io-c244d67b04aba0699f611278fcbe64d65f794055eb6d03045bad64374ecfa226-runc.ayfSTw.mount: Deactivated successfully. Mar 17 17:56:41.650635 systemd-networkd[1435]: cali812b20a41f2: Gained IPv6LL Mar 17 17:56:41.712343 kubelet[2850]: I0317 17:56:41.712275 2850 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7444d64bbc-9ksk2" podStartSLOduration=5.7122521630000005 podStartE2EDuration="5.712252163s" podCreationTimestamp="2025-03-17 17:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:56:41.710655929 +0000 UTC m=+70.551795189" watchObservedRunningTime="2025-03-17 17:56:41.712252163 +0000 UTC m=+70.553391424" Mar 17 17:56:46.282083 systemd[1]: Started sshd@8-139.178.70.104:22-147.75.109.163:39456.service - OpenSSH per-connection server daemon (147.75.109.163:39456). 
Mar 17 17:56:46.369312 sshd[6817]: Accepted publickey for core from 147.75.109.163 port 39456 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:56:46.370741 sshd-session[6817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:56:46.373822 systemd-logind[1520]: New session 11 of user core.
Mar 17 17:56:46.378635 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 17 17:56:46.709436 sshd[6819]: Connection closed by 147.75.109.163 port 39456
Mar 17 17:56:46.709855 sshd-session[6817]: pam_unix(sshd:session): session closed for user core
Mar 17 17:56:46.713249 systemd[1]: sshd@8-139.178.70.104:22-147.75.109.163:39456.service: Deactivated successfully.
Mar 17 17:56:46.715972 systemd[1]: session-11.scope: Deactivated successfully.
Mar 17 17:56:46.718657 systemd-logind[1520]: Session 11 logged out. Waiting for processes to exit.
Mar 17 17:56:46.719453 systemd-logind[1520]: Removed session 11.
Mar 17 17:56:51.719103 systemd[1]: Started sshd@9-139.178.70.104:22-147.75.109.163:39472.service - OpenSSH per-connection server daemon (147.75.109.163:39472).
Mar 17 17:56:51.835783 sshd[6842]: Accepted publickey for core from 147.75.109.163 port 39472 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:56:51.836551 sshd-session[6842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:56:51.839699 systemd-logind[1520]: New session 12 of user core.
Mar 17 17:56:51.845313 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 17 17:56:52.008580 sshd[6845]: Connection closed by 147.75.109.163 port 39472
Mar 17 17:56:52.015613 systemd[1]: sshd@9-139.178.70.104:22-147.75.109.163:39472.service: Deactivated successfully.
Mar 17 17:56:52.009105 sshd-session[6842]: pam_unix(sshd:session): session closed for user core
Mar 17 17:56:52.016521 systemd[1]: session-12.scope: Deactivated successfully.
Mar 17 17:56:52.017309 systemd-logind[1520]: Session 12 logged out. Waiting for processes to exit.
Mar 17 17:56:52.020847 systemd[1]: Started sshd@10-139.178.70.104:22-147.75.109.163:39486.service - OpenSSH per-connection server daemon (147.75.109.163:39486).
Mar 17 17:56:52.022872 systemd-logind[1520]: Removed session 12.
Mar 17 17:56:52.071133 sshd[6857]: Accepted publickey for core from 147.75.109.163 port 39486 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:56:52.071974 sshd-session[6857]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:56:52.074328 systemd-logind[1520]: New session 13 of user core.
Mar 17 17:56:52.078643 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 17 17:56:52.237252 sshd[6859]: Connection closed by 147.75.109.163 port 39486
Mar 17 17:56:52.244847 systemd[1]: Started sshd@11-139.178.70.104:22-147.75.109.163:39500.service - OpenSSH per-connection server daemon (147.75.109.163:39500).
Mar 17 17:56:52.263270 sshd-session[6857]: pam_unix(sshd:session): session closed for user core
Mar 17 17:56:52.281909 systemd[1]: sshd@10-139.178.70.104:22-147.75.109.163:39486.service: Deactivated successfully.
Mar 17 17:56:52.284608 systemd[1]: session-13.scope: Deactivated successfully.
Mar 17 17:56:52.288172 systemd-logind[1520]: Session 13 logged out. Waiting for processes to exit.
Mar 17 17:56:52.291327 systemd-logind[1520]: Removed session 13.
Mar 17 17:56:52.365832 sshd[6874]: Accepted publickey for core from 147.75.109.163 port 39500 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:56:52.368694 sshd-session[6874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:56:52.371956 systemd-logind[1520]: New session 14 of user core.
Mar 17 17:56:52.380675 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 17 17:56:52.508979 sshd[6878]: Connection closed by 147.75.109.163 port 39500
Mar 17 17:56:52.509707 sshd-session[6874]: pam_unix(sshd:session): session closed for user core
Mar 17 17:56:52.511776 systemd-logind[1520]: Session 14 logged out. Waiting for processes to exit.
Mar 17 17:56:52.511846 systemd[1]: sshd@11-139.178.70.104:22-147.75.109.163:39500.service: Deactivated successfully.
Mar 17 17:56:52.512915 systemd[1]: session-14.scope: Deactivated successfully.
Mar 17 17:56:52.513825 systemd-logind[1520]: Removed session 14.
Mar 17 17:56:57.518635 systemd[1]: Started sshd@12-139.178.70.104:22-147.75.109.163:35710.service - OpenSSH per-connection server daemon (147.75.109.163:35710).
Mar 17 17:56:57.708664 sshd[6893]: Accepted publickey for core from 147.75.109.163 port 35710 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:56:57.709668 sshd-session[6893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:56:57.712988 systemd-logind[1520]: New session 15 of user core.
Mar 17 17:56:57.716659 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 17 17:56:57.876136 sshd[6895]: Connection closed by 147.75.109.163 port 35710
Mar 17 17:56:57.878894 systemd[1]: sshd@12-139.178.70.104:22-147.75.109.163:35710.service: Deactivated successfully.
Mar 17 17:56:57.877040 sshd-session[6893]: pam_unix(sshd:session): session closed for user core
Mar 17 17:56:57.879955 systemd[1]: session-15.scope: Deactivated successfully.
Mar 17 17:56:57.880339 systemd-logind[1520]: Session 15 logged out. Waiting for processes to exit.
Mar 17 17:56:57.880910 systemd-logind[1520]: Removed session 15.
Mar 17 17:57:02.884739 systemd[1]: Started sshd@13-139.178.70.104:22-147.75.109.163:35726.service - OpenSSH per-connection server daemon (147.75.109.163:35726).
Mar 17 17:57:02.995866 sshd[6916]: Accepted publickey for core from 147.75.109.163 port 35726 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:57:02.996963 sshd-session[6916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:57:02.999736 systemd-logind[1520]: New session 16 of user core.
Mar 17 17:57:03.008652 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 17 17:57:03.157843 sshd[6918]: Connection closed by 147.75.109.163 port 35726
Mar 17 17:57:03.158256 sshd-session[6916]: pam_unix(sshd:session): session closed for user core
Mar 17 17:57:03.165057 systemd[1]: sshd@13-139.178.70.104:22-147.75.109.163:35726.service: Deactivated successfully.
Mar 17 17:57:03.166375 systemd[1]: session-16.scope: Deactivated successfully.
Mar 17 17:57:03.166888 systemd-logind[1520]: Session 16 logged out. Waiting for processes to exit.
Mar 17 17:57:03.170721 systemd[1]: Started sshd@14-139.178.70.104:22-147.75.109.163:35728.service - OpenSSH per-connection server daemon (147.75.109.163:35728).
Mar 17 17:57:03.171679 systemd-logind[1520]: Removed session 16.
Mar 17 17:57:03.202814 sshd[6929]: Accepted publickey for core from 147.75.109.163 port 35728 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:57:03.203778 sshd-session[6929]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:57:03.206242 systemd-logind[1520]: New session 17 of user core.
Mar 17 17:57:03.211637 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 17 17:57:03.636331 sshd[6931]: Connection closed by 147.75.109.163 port 35728
Mar 17 17:57:03.638152 sshd-session[6929]: pam_unix(sshd:session): session closed for user core
Mar 17 17:57:03.643119 systemd[1]: sshd@14-139.178.70.104:22-147.75.109.163:35728.service: Deactivated successfully.
Mar 17 17:57:03.644056 systemd[1]: session-17.scope: Deactivated successfully.
Mar 17 17:57:03.644462 systemd-logind[1520]: Session 17 logged out. Waiting for processes to exit.
Mar 17 17:57:03.649714 systemd[1]: Started sshd@15-139.178.70.104:22-147.75.109.163:35730.service - OpenSSH per-connection server daemon (147.75.109.163:35730).
Mar 17 17:57:03.651556 systemd-logind[1520]: Removed session 17.
Mar 17 17:57:03.695409 sshd[6940]: Accepted publickey for core from 147.75.109.163 port 35730 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:57:03.696301 sshd-session[6940]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:57:03.698969 systemd-logind[1520]: New session 18 of user core.
Mar 17 17:57:03.702649 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 17 17:57:05.261300 sshd[6942]: Connection closed by 147.75.109.163 port 35730
Mar 17 17:57:05.263943 sshd-session[6940]: pam_unix(sshd:session): session closed for user core
Mar 17 17:57:05.290142 systemd[1]: Started sshd@16-139.178.70.104:22-147.75.109.163:54642.service - OpenSSH per-connection server daemon (147.75.109.163:54642).
Mar 17 17:57:05.290511 systemd[1]: sshd@15-139.178.70.104:22-147.75.109.163:35730.service: Deactivated successfully.
Mar 17 17:57:05.295546 systemd[1]: session-18.scope: Deactivated successfully.
Mar 17 17:57:05.297488 systemd-logind[1520]: Session 18 logged out. Waiting for processes to exit.
Mar 17 17:57:05.299861 systemd-logind[1520]: Removed session 18.
Mar 17 17:57:05.386580 sshd[6963]: Accepted publickey for core from 147.75.109.163 port 54642 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:57:05.387870 sshd-session[6963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:57:05.392608 systemd-logind[1520]: New session 19 of user core.
Mar 17 17:57:05.397710 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 17 17:57:05.531295 systemd[1]: run-containerd-runc-k8s.io-ec6d8f09e472dfaf8c3b9662dd55304920a500f0011beea0235b7614aa450325-runc.kIUS3G.mount: Deactivated successfully.
Mar 17 17:57:06.138752 sshd[6971]: Connection closed by 147.75.109.163 port 54642
Mar 17 17:57:06.139171 sshd-session[6963]: pam_unix(sshd:session): session closed for user core
Mar 17 17:57:06.146220 systemd[1]: sshd@16-139.178.70.104:22-147.75.109.163:54642.service: Deactivated successfully.
Mar 17 17:57:06.147547 systemd[1]: session-19.scope: Deactivated successfully.
Mar 17 17:57:06.148613 systemd-logind[1520]: Session 19 logged out. Waiting for processes to exit.
Mar 17 17:57:06.151715 systemd[1]: Started sshd@17-139.178.70.104:22-147.75.109.163:54648.service - OpenSSH per-connection server daemon (147.75.109.163:54648).
Mar 17 17:57:06.152830 systemd-logind[1520]: Removed session 19.
Mar 17 17:57:06.220144 sshd[7003]: Accepted publickey for core from 147.75.109.163 port 54648 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:57:06.220927 sshd-session[7003]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:57:06.223777 systemd-logind[1520]: New session 20 of user core.
Mar 17 17:57:06.233736 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 17 17:57:06.987047 sshd[7005]: Connection closed by 147.75.109.163 port 54648
Mar 17 17:57:06.987843 sshd-session[7003]: pam_unix(sshd:session): session closed for user core
Mar 17 17:57:06.990653 systemd[1]: sshd@17-139.178.70.104:22-147.75.109.163:54648.service: Deactivated successfully.
Mar 17 17:57:06.991722 systemd[1]: session-20.scope: Deactivated successfully.
Mar 17 17:57:06.992407 systemd-logind[1520]: Session 20 logged out. Waiting for processes to exit.
Mar 17 17:57:06.993121 systemd-logind[1520]: Removed session 20.
Mar 17 17:57:11.996696 systemd[1]: Started sshd@18-139.178.70.104:22-147.75.109.163:54652.service - OpenSSH per-connection server daemon (147.75.109.163:54652).
Mar 17 17:57:12.136758 sshd[7037]: Accepted publickey for core from 147.75.109.163 port 54652 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:57:12.138264 sshd-session[7037]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:57:12.141175 systemd-logind[1520]: New session 21 of user core.
Mar 17 17:57:12.146661 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 17 17:57:12.372730 sshd[7039]: Connection closed by 147.75.109.163 port 54652
Mar 17 17:57:12.373682 sshd-session[7037]: pam_unix(sshd:session): session closed for user core
Mar 17 17:57:12.381665 systemd-logind[1520]: Session 21 logged out. Waiting for processes to exit.
Mar 17 17:57:12.381848 systemd[1]: sshd@18-139.178.70.104:22-147.75.109.163:54652.service: Deactivated successfully.
Mar 17 17:57:12.383300 systemd[1]: session-21.scope: Deactivated successfully.
Mar 17 17:57:12.384201 systemd-logind[1520]: Removed session 21.
Mar 17 17:57:13.456138 systemd[1]: Started sshd@19-139.178.70.104:22-116.120.58.72:54284.service - OpenSSH per-connection server daemon (116.120.58.72:54284).
Mar 17 17:57:13.575323 sshd[7050]: banner exchange: Connection from 116.120.58.72 port 54284: invalid format
Mar 17 17:57:13.575866 systemd[1]: sshd@19-139.178.70.104:22-116.120.58.72:54284.service: Deactivated successfully.
Mar 17 17:57:17.383070 systemd[1]: Started sshd@20-139.178.70.104:22-147.75.109.163:37824.service - OpenSSH per-connection server daemon (147.75.109.163:37824).
Mar 17 17:57:17.427059 sshd[7056]: Accepted publickey for core from 147.75.109.163 port 37824 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:57:17.427820 sshd-session[7056]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:57:17.430593 systemd-logind[1520]: New session 22 of user core.
Mar 17 17:57:17.436749 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 17 17:57:17.553659 sshd[7058]: Connection closed by 147.75.109.163 port 37824
Mar 17 17:57:17.554110 sshd-session[7056]: pam_unix(sshd:session): session closed for user core
Mar 17 17:57:17.556142 systemd[1]: sshd@20-139.178.70.104:22-147.75.109.163:37824.service: Deactivated successfully.
Mar 17 17:57:17.557142 systemd[1]: session-22.scope: Deactivated successfully.
Mar 17 17:57:17.557514 systemd-logind[1520]: Session 22 logged out. Waiting for processes to exit.
Mar 17 17:57:17.558237 systemd-logind[1520]: Removed session 22.
Mar 17 17:57:22.564107 systemd[1]: Started sshd@21-139.178.70.104:22-147.75.109.163:37826.service - OpenSSH per-connection server daemon (147.75.109.163:37826).
Mar 17 17:57:22.611001 sshd[7079]: Accepted publickey for core from 147.75.109.163 port 37826 ssh2: RSA SHA256:L4fE5H9Usx39heaDVSj6Bx08oayWm2hEtQ3lEZ1U1tM
Mar 17 17:57:22.611833 sshd-session[7079]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:57:22.614593 systemd-logind[1520]: New session 23 of user core.
Mar 17 17:57:22.621663 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 17 17:57:22.755014 sshd[7081]: Connection closed by 147.75.109.163 port 37826
Mar 17 17:57:22.755502 sshd-session[7079]: pam_unix(sshd:session): session closed for user core
Mar 17 17:57:22.757600 systemd[1]: sshd@21-139.178.70.104:22-147.75.109.163:37826.service: Deactivated successfully.
Mar 17 17:57:22.758731 systemd[1]: session-23.scope: Deactivated successfully.
Mar 17 17:57:22.759243 systemd-logind[1520]: Session 23 logged out. Waiting for processes to exit.
Mar 17 17:57:22.759813 systemd-logind[1520]: Removed session 23.