Mar 20 21:31:51.726344 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 20 19:36:47 -00 2025 Mar 20 21:31:51.726360 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019 Mar 20 21:31:51.726367 kernel: Disabled fast string operations Mar 20 21:31:51.726371 kernel: BIOS-provided physical RAM map: Mar 20 21:31:51.726375 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Mar 20 21:31:51.726379 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Mar 20 21:31:51.726385 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Mar 20 21:31:51.726389 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Mar 20 21:31:51.726394 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Mar 20 21:31:51.726398 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Mar 20 21:31:51.726402 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Mar 20 21:31:51.726406 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Mar 20 21:31:51.726410 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Mar 20 21:31:51.726415 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Mar 20 21:31:51.726421 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Mar 20 21:31:51.726426 kernel: NX (Execute Disable) protection: active Mar 20 21:31:51.726431 kernel: APIC: Static calls initialized Mar 20 21:31:51.726435 kernel: 
SMBIOS 2.7 present. Mar 20 21:31:51.726440 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Mar 20 21:31:51.726445 kernel: vmware: hypercall mode: 0x00 Mar 20 21:31:51.726450 kernel: Hypervisor detected: VMware Mar 20 21:31:51.726455 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Mar 20 21:31:51.726461 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Mar 20 21:31:51.726465 kernel: vmware: using clock offset of 2557252637 ns Mar 20 21:31:51.726470 kernel: tsc: Detected 3408.000 MHz processor Mar 20 21:31:51.726475 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Mar 20 21:31:51.726481 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Mar 20 21:31:51.726485 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Mar 20 21:31:51.726490 kernel: total RAM covered: 3072M Mar 20 21:31:51.726495 kernel: Found optimal setting for mtrr clean up Mar 20 21:31:51.726501 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Mar 20 21:31:51.726506 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Mar 20 21:31:51.726512 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Mar 20 21:31:51.726516 kernel: Using GB pages for direct mapping Mar 20 21:31:51.726521 kernel: ACPI: Early table checksum verification disabled Mar 20 21:31:51.726526 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Mar 20 21:31:51.726531 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Mar 20 21:31:51.726536 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Mar 20 21:31:51.726541 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Mar 20 21:31:51.726546 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Mar 20 21:31:51.726553 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Mar 20 21:31:51.726558 kernel: ACPI: BOOT 
0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Mar 20 21:31:51.726564 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000) Mar 20 21:31:51.726569 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Mar 20 21:31:51.726574 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Mar 20 21:31:51.726579 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Mar 20 21:31:51.726585 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Mar 20 21:31:51.726590 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Mar 20 21:31:51.726595 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Mar 20 21:31:51.726601 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Mar 20 21:31:51.726605 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Mar 20 21:31:51.726611 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Mar 20 21:31:51.726616 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Mar 20 21:31:51.726621 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Mar 20 21:31:51.726626 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Mar 20 21:31:51.726632 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Mar 20 21:31:51.726637 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Mar 20 21:31:51.726642 kernel: system APIC only can use physical flat Mar 20 21:31:51.726647 kernel: APIC: Switched APIC routing to: physical flat Mar 20 21:31:51.726652 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Mar 20 21:31:51.726657 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0 Mar 20 21:31:51.726662 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0 Mar 20 21:31:51.726667 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0 Mar 
20 21:31:51.726672 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0 Mar 20 21:31:51.726677 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0 Mar 20 21:31:51.726683 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0 Mar 20 21:31:51.726688 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0 Mar 20 21:31:51.726693 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0 Mar 20 21:31:51.726698 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0 Mar 20 21:31:51.726703 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0 Mar 20 21:31:51.726708 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0 Mar 20 21:31:51.726713 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0 Mar 20 21:31:51.726717 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0 Mar 20 21:31:51.726722 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0 Mar 20 21:31:51.726727 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0 Mar 20 21:31:51.726733 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0 Mar 20 21:31:51.726738 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0 Mar 20 21:31:51.726743 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0 Mar 20 21:31:51.726748 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0 Mar 20 21:31:51.726753 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0 Mar 20 21:31:51.726758 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0 Mar 20 21:31:51.726763 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0 Mar 20 21:31:51.726768 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0 Mar 20 21:31:51.726773 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0 Mar 20 21:31:51.726778 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0 Mar 20 21:31:51.726791 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0 Mar 20 21:31:51.726798 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0 Mar 20 21:31:51.726803 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0 Mar 20 21:31:51.726808 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0 Mar 20 21:31:51.726813 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0 Mar 20 21:31:51.726818 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0 Mar 20 21:31:51.726823 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0 Mar 20 21:31:51.726828 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0 Mar 20 21:31:51.726833 
kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0 Mar 20 21:31:51.726838 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0 Mar 20 21:31:51.726843 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0 Mar 20 21:31:51.726850 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0 Mar 20 21:31:51.726855 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0 Mar 20 21:31:51.726860 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0 Mar 20 21:31:51.726865 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0 Mar 20 21:31:51.726885 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0 Mar 20 21:31:51.726890 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0 Mar 20 21:31:51.726894 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0 Mar 20 21:31:51.726899 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0 Mar 20 21:31:51.726904 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0 Mar 20 21:31:51.726909 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0 Mar 20 21:31:51.726915 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0 Mar 20 21:31:51.726920 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0 Mar 20 21:31:51.726924 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0 Mar 20 21:31:51.726929 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0 Mar 20 21:31:51.726934 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0 Mar 20 21:31:51.726939 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0 Mar 20 21:31:51.726944 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0 Mar 20 21:31:51.726949 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0 Mar 20 21:31:51.726953 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0 Mar 20 21:31:51.726958 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0 Mar 20 21:31:51.726964 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0 Mar 20 21:31:51.726969 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0 Mar 20 21:31:51.726977 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0 Mar 20 21:31:51.726983 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0 Mar 20 21:31:51.726988 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0 Mar 20 21:31:51.726994 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0 Mar 20 21:31:51.726999 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0 Mar 20 21:31:51.727004 kernel: SRAT: PXM 0 
-> APIC 0x80 -> Node 0 Mar 20 21:31:51.727010 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0 Mar 20 21:31:51.727015 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0 Mar 20 21:31:51.727020 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0 Mar 20 21:31:51.727026 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0 Mar 20 21:31:51.727031 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0 Mar 20 21:31:51.727036 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0 Mar 20 21:31:51.727041 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0 Mar 20 21:31:51.727047 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0 Mar 20 21:31:51.727052 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0 Mar 20 21:31:51.727057 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0 Mar 20 21:31:51.727062 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0 Mar 20 21:31:51.727068 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0 Mar 20 21:31:51.727073 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0 Mar 20 21:31:51.727078 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0 Mar 20 21:31:51.727083 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0 Mar 20 21:31:51.727089 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0 Mar 20 21:31:51.727094 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0 Mar 20 21:31:51.727099 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0 Mar 20 21:31:51.727104 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0 Mar 20 21:31:51.727109 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0 Mar 20 21:31:51.727114 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0 Mar 20 21:31:51.727120 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0 Mar 20 21:31:51.727126 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0 Mar 20 21:31:51.727131 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0 Mar 20 21:31:51.727136 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0 Mar 20 21:31:51.727141 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0 Mar 20 21:31:51.727146 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0 Mar 20 21:31:51.727151 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0 Mar 20 21:31:51.727156 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0 Mar 20 21:31:51.727161 kernel: SRAT: PXM 0 -> APIC 0xbc -> 
Node 0 Mar 20 21:31:51.727167 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0 Mar 20 21:31:51.727173 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0 Mar 20 21:31:51.727178 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0 Mar 20 21:31:51.727183 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0 Mar 20 21:31:51.727188 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0 Mar 20 21:31:51.727194 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0 Mar 20 21:31:51.727199 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0 Mar 20 21:31:51.727204 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0 Mar 20 21:31:51.727209 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0 Mar 20 21:31:51.727214 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0 Mar 20 21:31:51.727219 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0 Mar 20 21:31:51.727225 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0 Mar 20 21:31:51.727231 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0 Mar 20 21:31:51.727236 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0 Mar 20 21:31:51.727241 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0 Mar 20 21:31:51.727246 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0 Mar 20 21:31:51.727251 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0 Mar 20 21:31:51.727257 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0 Mar 20 21:31:51.727262 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0 Mar 20 21:31:51.727267 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0 Mar 20 21:31:51.727272 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0 Mar 20 21:31:51.727277 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0 Mar 20 21:31:51.727284 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0 Mar 20 21:31:51.727289 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0 Mar 20 21:31:51.727294 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0 Mar 20 21:31:51.727299 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0 Mar 20 21:31:51.727304 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0 Mar 20 21:31:51.727309 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0 Mar 20 21:31:51.727314 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0 Mar 20 21:31:51.727320 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0 Mar 20 
21:31:51.727325 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0 Mar 20 21:31:51.727330 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0 Mar 20 21:31:51.727336 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0 Mar 20 21:31:51.727341 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Mar 20 21:31:51.727347 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Mar 20 21:31:51.727352 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Mar 20 21:31:51.727357 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] Mar 20 21:31:51.727363 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff] Mar 20 21:31:51.727368 kernel: Zone ranges: Mar 20 21:31:51.727374 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Mar 20 21:31:51.727379 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Mar 20 21:31:51.727385 kernel: Normal empty Mar 20 21:31:51.727391 kernel: Movable zone start for each node Mar 20 21:31:51.727396 kernel: Early memory node ranges Mar 20 21:31:51.727401 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Mar 20 21:31:51.727407 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Mar 20 21:31:51.727412 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Mar 20 21:31:51.727417 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Mar 20 21:31:51.727422 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Mar 20 21:31:51.727428 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Mar 20 21:31:51.727433 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Mar 20 21:31:51.727440 kernel: ACPI: PM-Timer IO Port: 0x1008 Mar 20 21:31:51.727445 kernel: system APIC only can use physical flat Mar 20 21:31:51.727450 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Mar 20 21:31:51.727455 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Mar 20 21:31:51.727461 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge 
lint[0x1]) Mar 20 21:31:51.727466 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Mar 20 21:31:51.727471 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Mar 20 21:31:51.727476 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Mar 20 21:31:51.727481 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Mar 20 21:31:51.727488 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Mar 20 21:31:51.727493 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Mar 20 21:31:51.727498 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Mar 20 21:31:51.727503 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Mar 20 21:31:51.727508 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Mar 20 21:31:51.727514 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Mar 20 21:31:51.727519 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Mar 20 21:31:51.727524 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Mar 20 21:31:51.727529 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Mar 20 21:31:51.727535 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Mar 20 21:31:51.727541 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Mar 20 21:31:51.727546 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Mar 20 21:31:51.727551 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Mar 20 21:31:51.727557 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Mar 20 21:31:51.727562 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Mar 20 21:31:51.727567 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Mar 20 21:31:51.727572 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Mar 20 21:31:51.727577 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Mar 20 21:31:51.727583 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Mar 20 21:31:51.727588 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge 
lint[0x1]) Mar 20 21:31:51.727594 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Mar 20 21:31:51.727600 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Mar 20 21:31:51.727605 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Mar 20 21:31:51.727610 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Mar 20 21:31:51.727615 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Mar 20 21:31:51.727620 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Mar 20 21:31:51.727626 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Mar 20 21:31:51.727631 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Mar 20 21:31:51.727636 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Mar 20 21:31:51.727642 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Mar 20 21:31:51.727647 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Mar 20 21:31:51.727653 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Mar 20 21:31:51.727658 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Mar 20 21:31:51.727663 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Mar 20 21:31:51.727668 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Mar 20 21:31:51.727673 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Mar 20 21:31:51.727678 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Mar 20 21:31:51.727684 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Mar 20 21:31:51.727689 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Mar 20 21:31:51.727695 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Mar 20 21:31:51.727700 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Mar 20 21:31:51.727706 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Mar 20 21:31:51.727711 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Mar 20 21:31:51.727716 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge 
lint[0x1]) Mar 20 21:31:51.727721 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Mar 20 21:31:51.727727 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Mar 20 21:31:51.727732 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Mar 20 21:31:51.727737 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Mar 20 21:31:51.727742 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Mar 20 21:31:51.727748 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Mar 20 21:31:51.727754 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Mar 20 21:31:51.727759 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Mar 20 21:31:51.727764 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Mar 20 21:31:51.727769 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Mar 20 21:31:51.727775 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Mar 20 21:31:51.727780 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Mar 20 21:31:51.728729 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Mar 20 21:31:51.728736 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Mar 20 21:31:51.728744 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Mar 20 21:31:51.728749 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Mar 20 21:31:51.728755 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Mar 20 21:31:51.728760 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Mar 20 21:31:51.728765 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Mar 20 21:31:51.728770 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Mar 20 21:31:51.728775 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Mar 20 21:31:51.728781 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Mar 20 21:31:51.728793 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Mar 20 21:31:51.728799 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge 
lint[0x1]) Mar 20 21:31:51.728806 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Mar 20 21:31:51.728811 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Mar 20 21:31:51.728816 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Mar 20 21:31:51.728822 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Mar 20 21:31:51.728827 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Mar 20 21:31:51.728832 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Mar 20 21:31:51.728837 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Mar 20 21:31:51.728843 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Mar 20 21:31:51.728848 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Mar 20 21:31:51.728853 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Mar 20 21:31:51.728859 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Mar 20 21:31:51.728865 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Mar 20 21:31:51.728870 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Mar 20 21:31:51.728875 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Mar 20 21:31:51.728880 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Mar 20 21:31:51.728885 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Mar 20 21:31:51.728891 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Mar 20 21:31:51.728896 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Mar 20 21:31:51.728901 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Mar 20 21:31:51.728906 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Mar 20 21:31:51.728913 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Mar 20 21:31:51.728918 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Mar 20 21:31:51.728923 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Mar 20 21:31:51.728928 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge 
lint[0x1]) Mar 20 21:31:51.728934 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Mar 20 21:31:51.728939 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Mar 20 21:31:51.728944 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Mar 20 21:31:51.728949 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Mar 20 21:31:51.728954 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Mar 20 21:31:51.728961 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Mar 20 21:31:51.728966 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Mar 20 21:31:51.728971 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Mar 20 21:31:51.728976 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Mar 20 21:31:51.728982 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Mar 20 21:31:51.728987 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Mar 20 21:31:51.728992 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Mar 20 21:31:51.728998 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Mar 20 21:31:51.729003 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Mar 20 21:31:51.729008 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Mar 20 21:31:51.729014 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Mar 20 21:31:51.729020 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Mar 20 21:31:51.729025 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Mar 20 21:31:51.729030 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Mar 20 21:31:51.729036 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Mar 20 21:31:51.729041 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Mar 20 21:31:51.729046 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Mar 20 21:31:51.729051 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Mar 20 21:31:51.729056 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge 
lint[0x1]) Mar 20 21:31:51.729062 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Mar 20 21:31:51.729068 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Mar 20 21:31:51.729073 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Mar 20 21:31:51.729079 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Mar 20 21:31:51.729084 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Mar 20 21:31:51.729089 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Mar 20 21:31:51.729095 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Mar 20 21:31:51.729100 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Mar 20 21:31:51.729105 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Mar 20 21:31:51.729111 kernel: TSC deadline timer available Mar 20 21:31:51.729117 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Mar 20 21:31:51.729122 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Mar 20 21:31:51.729128 kernel: Booting paravirtualized kernel on VMware hypervisor Mar 20 21:31:51.729133 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Mar 20 21:31:51.729139 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Mar 20 21:31:51.729145 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Mar 20 21:31:51.729150 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Mar 20 21:31:51.729155 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Mar 20 21:31:51.729161 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Mar 20 21:31:51.729167 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Mar 20 21:31:51.729172 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Mar 20 21:31:51.729177 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Mar 20 21:31:51.729189 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Mar 20 21:31:51.729195 
kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Mar 20 21:31:51.729201 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Mar 20 21:31:51.729207 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Mar 20 21:31:51.729212 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Mar 20 21:31:51.729219 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Mar 20 21:31:51.729224 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Mar 20 21:31:51.729230 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Mar 20 21:31:51.729235 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Mar 20 21:31:51.729241 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Mar 20 21:31:51.729246 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Mar 20 21:31:51.729253 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019 Mar 20 21:31:51.729259 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Mar 20 21:31:51.729265 kernel: random: crng init done Mar 20 21:31:51.729271 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Mar 20 21:31:51.729277 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Mar 20 21:31:51.729282 kernel: printk: log_buf_len min size: 262144 bytes Mar 20 21:31:51.729288 kernel: printk: log_buf_len: 1048576 bytes Mar 20 21:31:51.729293 kernel: printk: early log buf free: 239648(91%) Mar 20 21:31:51.729299 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 20 21:31:51.729305 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Mar 20 21:31:51.729310 kernel: Fallback order for Node 0: 0 Mar 20 21:31:51.729316 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Mar 20 21:31:51.729323 kernel: Policy zone: DMA32 Mar 20 21:31:51.729328 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 20 21:31:51.729335 kernel: Memory: 1932276K/2096628K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 164092K reserved, 0K cma-reserved) Mar 20 21:31:51.729341 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Mar 20 21:31:51.729347 kernel: ftrace: allocating 37985 entries in 149 pages Mar 20 21:31:51.729354 kernel: ftrace: allocated 149 pages with 4 groups Mar 20 21:31:51.729360 kernel: Dynamic Preempt: voluntary Mar 20 21:31:51.729365 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 20 21:31:51.729371 kernel: rcu: RCU event tracing is enabled. Mar 20 21:31:51.729377 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Mar 20 21:31:51.729383 kernel: Trampoline variant of Tasks RCU enabled. Mar 20 21:31:51.729388 kernel: Rude variant of Tasks RCU enabled. Mar 20 21:31:51.729394 kernel: Tracing variant of Tasks RCU enabled. Mar 20 21:31:51.729400 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 20 21:31:51.729406 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Mar 20 21:31:51.729412 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Mar 20 21:31:51.729418 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Mar 20 21:31:51.729423 kernel: Console: colour VGA+ 80x25
Mar 20 21:31:51.729430 kernel: printk: console [tty0] enabled
Mar 20 21:31:51.729436 kernel: printk: console [ttyS0] enabled
Mar 20 21:31:51.729441 kernel: ACPI: Core revision 20230628
Mar 20 21:31:51.729447 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Mar 20 21:31:51.729453 kernel: APIC: Switch to symmetric I/O mode setup
Mar 20 21:31:51.729459 kernel: x2apic enabled
Mar 20 21:31:51.729465 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 20 21:31:51.729471 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 20 21:31:51.729477 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Mar 20 21:31:51.729483 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Mar 20 21:31:51.729488 kernel: Disabled fast string operations
Mar 20 21:31:51.729494 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Mar 20 21:31:51.729500 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Mar 20 21:31:51.729506 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 20 21:31:51.729511 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Mar 20 21:31:51.729518 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Mar 20 21:31:51.729524 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 20 21:31:51.729530 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 20 21:31:51.729535 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Mar 20 21:31:51.729541 kernel: RETBleed: Mitigation: Enhanced IBRS
Mar 20 21:31:51.729547 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 20 21:31:51.729553 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 20 21:31:51.729558 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 20 21:31:51.729564 kernel: SRBDS: Unknown: Dependent on hypervisor status
Mar 20 21:31:51.729571 kernel: GDS: Unknown: Dependent on hypervisor status
Mar 20 21:31:51.729577 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 20 21:31:51.729583 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 20 21:31:51.729588 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 20 21:31:51.729594 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 20 21:31:51.729600 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Mar 20 21:31:51.729605 kernel: Freeing SMP alternatives memory: 32K
Mar 20 21:31:51.729611 kernel: pid_max: default: 131072 minimum: 1024
Mar 20 21:31:51.729617 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 20 21:31:51.729623 kernel: landlock: Up and running.
Mar 20 21:31:51.729629 kernel: SELinux: Initializing.
Mar 20 21:31:51.729635 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 20 21:31:51.729641 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 20 21:31:51.729647 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Mar 20 21:31:51.729652 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Mar 20 21:31:51.729658 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Mar 20 21:31:51.729664 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Mar 20 21:31:51.729671 kernel: Performance Events: Skylake events, core PMU driver.
Mar 20 21:31:51.729676 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Mar 20 21:31:51.729682 kernel: core: CPUID marked event: 'instructions' unavailable
Mar 20 21:31:51.729688 kernel: core: CPUID marked event: 'bus cycles' unavailable
Mar 20 21:31:51.729693 kernel: core: CPUID marked event: 'cache references' unavailable
Mar 20 21:31:51.729699 kernel: core: CPUID marked event: 'cache misses' unavailable
Mar 20 21:31:51.729704 kernel: core: CPUID marked event: 'branch instructions' unavailable
Mar 20 21:31:51.729710 kernel: core: CPUID marked event: 'branch misses' unavailable
Mar 20 21:31:51.729715 kernel: ... version:                1
Mar 20 21:31:51.729722 kernel: ... bit width:              48
Mar 20 21:31:51.729728 kernel: ... generic registers:      4
Mar 20 21:31:51.729733 kernel: ... value mask:             0000ffffffffffff
Mar 20 21:31:51.729739 kernel: ... max period:             000000007fffffff
Mar 20 21:31:51.729744 kernel: ... fixed-purpose events:   0
Mar 20 21:31:51.729750 kernel: ... event mask:             000000000000000f
Mar 20 21:31:51.729755 kernel: signal: max sigframe size: 1776
Mar 20 21:31:51.729761 kernel: rcu: Hierarchical SRCU implementation.
Mar 20 21:31:51.729767 kernel: rcu: Max phase no-delay instances is 400.
Mar 20 21:31:51.729774 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 20 21:31:51.729780 kernel: smp: Bringing up secondary CPUs ...
Mar 20 21:31:51.730169 kernel: smpboot: x86: Booting SMP configuration:
Mar 20 21:31:51.730178 kernel: .... node #0, CPUs: #1
Mar 20 21:31:51.730184 kernel: Disabled fast string operations
Mar 20 21:31:51.730190 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1
Mar 20 21:31:51.730196 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Mar 20 21:31:51.730202 kernel: smp: Brought up 1 node, 2 CPUs
Mar 20 21:31:51.730210 kernel: smpboot: Max logical packages: 128
Mar 20 21:31:51.730216 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Mar 20 21:31:51.730223 kernel: devtmpfs: initialized
Mar 20 21:31:51.730229 kernel: x86/mm: Memory block size: 128MB
Mar 20 21:31:51.730235 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Mar 20 21:31:51.730241 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 20 21:31:51.730247 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Mar 20 21:31:51.730253 kernel: pinctrl core: initialized pinctrl subsystem
Mar 20 21:31:51.730259 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 20 21:31:51.730265 kernel: audit: initializing netlink subsys (disabled)
Mar 20 21:31:51.730270 kernel: audit: type=2000 audit(1742506310.064:1): state=initialized audit_enabled=0 res=1
Mar 20 21:31:51.730277 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 20 21:31:51.730283 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 20 21:31:51.730289 kernel: cpuidle: using governor menu
Mar 20 21:31:51.730295 kernel: Simple Boot Flag at 0x36 set to 0x80
Mar 20 21:31:51.730301 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 20 21:31:51.730306 kernel: dca service started, version 1.12.1
Mar 20 21:31:51.730312 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000)
Mar 20 21:31:51.730318 kernel: PCI: Using configuration type 1 for base access
Mar 20 21:31:51.730324 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 20 21:31:51.730331 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 20 21:31:51.730337 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 20 21:31:51.730343 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 20 21:31:51.730349 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 20 21:31:51.730355 kernel: ACPI: Added _OSI(Module Device)
Mar 20 21:31:51.730361 kernel: ACPI: Added _OSI(Processor Device)
Mar 20 21:31:51.730367 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 20 21:31:51.730372 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 20 21:31:51.730378 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 20 21:31:51.730385 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Mar 20 21:31:51.730391 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 20 21:31:51.730397 kernel: ACPI: Interpreter enabled
Mar 20 21:31:51.730403 kernel: ACPI: PM: (supports S0 S1 S5)
Mar 20 21:31:51.730409 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 20 21:31:51.730415 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 20 21:31:51.730421 kernel: PCI: Using E820 reservations for host bridge windows
Mar 20 21:31:51.730427 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Mar 20 21:31:51.732428 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Mar 20 21:31:51.732516 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 20 21:31:51.732571 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Mar 20 21:31:51.734027 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Mar 20 21:31:51.734038 kernel: PCI host bridge to bus 0000:00
Mar 20 21:31:51.734097 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 20 21:31:51.734146 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Mar 20 21:31:51.734194 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 20 21:31:51.734240 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 20 21:31:51.734285 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Mar 20 21:31:51.734330 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Mar 20 21:31:51.734391 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000
Mar 20 21:31:51.734454 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400
Mar 20 21:31:51.734513 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100
Mar 20 21:31:51.734570 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a
Mar 20 21:31:51.734622 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f]
Mar 20 21:31:51.734673 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Mar 20 21:31:51.734723 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Mar 20 21:31:51.734774 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Mar 20 21:31:51.734882 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Mar 20 21:31:51.734942 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000
Mar 20 21:31:51.734993 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Mar 20 21:31:51.735046 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Mar 20 21:31:51.735101 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000
Mar 20 21:31:51.735153 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf]
Mar 20 21:31:51.735203 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit]
Mar 20 21:31:51.735260 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000
Mar 20 21:31:51.735310 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f]
Mar 20 21:31:51.735359 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref]
Mar 20 21:31:51.735409 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff]
Mar 20 21:31:51.735459 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref]
Mar 20 21:31:51.735509 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 20 21:31:51.735562 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401
Mar 20 21:31:51.735620 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.735718 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.735779 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.735855 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.735913 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.735964 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.736022 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.736074 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.736129 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.736181 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.736237 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.736288 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.736342 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.736397 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.736452 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.736505 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.736561 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.736614 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.736671 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.736724 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.736781 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737031 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737088 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737140 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737199 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737251 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737306 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737358 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737414 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737465 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737522 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737574 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737629 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737681 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737737 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737796 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737854 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.737909 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.737964 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.738016 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.738071 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.738122 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.738177 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.738232 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.738296 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.738356 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.738412 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.738464 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.739978 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.740038 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.740094 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.740146 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.740200 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.740253 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.740311 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.740365 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.740419 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.740471 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.740525 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.740576 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.740632 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.740684 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.740741 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400
Mar 20 21:31:51.740837 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.740893 kernel: pci_bus 0000:01: extended config space not accessible
Mar 20 21:31:51.740946 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Mar 20 21:31:51.740998 kernel: pci_bus 0000:02: extended config space not accessible
Mar 20 21:31:51.741007 kernel: acpiphp: Slot [32] registered
Mar 20 21:31:51.741016 kernel: acpiphp: Slot [33] registered
Mar 20 21:31:51.741021 kernel: acpiphp: Slot [34] registered
Mar 20 21:31:51.741027 kernel: acpiphp: Slot [35] registered
Mar 20 21:31:51.741033 kernel: acpiphp: Slot [36] registered
Mar 20 21:31:51.741039 kernel: acpiphp: Slot [37] registered
Mar 20 21:31:51.741045 kernel: acpiphp: Slot [38] registered
Mar 20 21:31:51.741051 kernel: acpiphp: Slot [39] registered
Mar 20 21:31:51.741056 kernel: acpiphp: Slot [40] registered
Mar 20 21:31:51.741062 kernel: acpiphp: Slot [41] registered
Mar 20 21:31:51.741068 kernel: acpiphp: Slot [42] registered
Mar 20 21:31:51.741075 kernel: acpiphp: Slot [43] registered
Mar 20 21:31:51.741081 kernel: acpiphp: Slot [44] registered
Mar 20 21:31:51.741087 kernel: acpiphp: Slot [45] registered
Mar 20 21:31:51.741093 kernel: acpiphp: Slot [46] registered
Mar 20 21:31:51.741099 kernel: acpiphp: Slot [47] registered
Mar 20 21:31:51.741104 kernel: acpiphp: Slot [48] registered
Mar 20 21:31:51.741110 kernel: acpiphp: Slot [49] registered
Mar 20 21:31:51.741116 kernel: acpiphp: Slot [50] registered
Mar 20 21:31:51.741122 kernel: acpiphp: Slot [51] registered
Mar 20 21:31:51.741129 kernel: acpiphp: Slot [52] registered
Mar 20 21:31:51.741135 kernel: acpiphp: Slot [53] registered
Mar 20 21:31:51.741140 kernel: acpiphp: Slot [54] registered
Mar 20 21:31:51.741146 kernel: acpiphp: Slot [55] registered
Mar 20 21:31:51.741152 kernel: acpiphp: Slot [56] registered
Mar 20 21:31:51.741158 kernel: acpiphp: Slot [57] registered
Mar 20 21:31:51.741163 kernel: acpiphp: Slot [58] registered
Mar 20 21:31:51.741169 kernel: acpiphp: Slot [59] registered
Mar 20 21:31:51.741175 kernel: acpiphp: Slot [60] registered
Mar 20 21:31:51.741181 kernel: acpiphp: Slot [61] registered
Mar 20 21:31:51.741187 kernel: acpiphp: Slot [62] registered
Mar 20 21:31:51.741193 kernel: acpiphp: Slot [63] registered
Mar 20 21:31:51.741243 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Mar 20 21:31:51.741295 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Mar 20 21:31:51.741345 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Mar 20 21:31:51.741395 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Mar 20 21:31:51.741446 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode)
Mar 20 21:31:51.741499 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode)
Mar 20 21:31:51.741550 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode)
Mar 20 21:31:51.741601 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode)
Mar 20 21:31:51.741651 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode)
Mar 20 21:31:51.741708 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700
Mar 20 21:31:51.741761 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007]
Mar 20 21:31:51.741860 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit]
Mar 20 21:31:51.741949 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
Mar 20 21:31:51.742023 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Mar 20 21:31:51.742077 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device.  You can enable it with 'pcie_aspm=force'
Mar 20 21:31:51.742144 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Mar 20 21:31:51.742195 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Mar 20 21:31:51.742245 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Mar 20 21:31:51.742294 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Mar 20 21:31:51.742345 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Mar 20 21:31:51.742398 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Mar 20 21:31:51.742448 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Mar 20 21:31:51.742500 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Mar 20 21:31:51.742550 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Mar 20 21:31:51.742599 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Mar 20 21:31:51.742648 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Mar 20 21:31:51.742716 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Mar 20 21:31:51.742797 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Mar 20 21:31:51.742853 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Mar 20 21:31:51.742903 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Mar 20 21:31:51.742954 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Mar 20 21:31:51.743003 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Mar 20 21:31:51.743057 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Mar 20 21:31:51.743108 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Mar 20 21:31:51.743158 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Mar 20 21:31:51.743210 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Mar 20 21:31:51.743271 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Mar 20 21:31:51.743344 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Mar 20 21:31:51.743396 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Mar 20 21:31:51.743446 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Mar 20 21:31:51.743500 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Mar 20 21:31:51.743559 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000
Mar 20 21:31:51.743632 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff]
Mar 20 21:31:51.743685 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff]
Mar 20 21:31:51.743737 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff]
Mar 20 21:31:51.743796 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f]
Mar 20 21:31:51.743849 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
Mar 20 21:31:51.743903 kernel: pci 0000:0b:00.0: supports D1 D2
Mar 20 21:31:51.743958 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Mar 20 21:31:51.744017 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device.  You can enable it with 'pcie_aspm=force'
Mar 20 21:31:51.744070 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Mar 20 21:31:51.744122 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Mar 20 21:31:51.744174 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Mar 20 21:31:51.744225 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Mar 20 21:31:51.744275 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Mar 20 21:31:51.744330 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Mar 20 21:31:51.744380 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Mar 20 21:31:51.744431 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Mar 20 21:31:51.744482 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Mar 20 21:31:51.744533 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Mar 20 21:31:51.744584 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Mar 20 21:31:51.744635 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Mar 20 21:31:51.744686 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Mar 20 21:31:51.744740 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Mar 20 21:31:51.744861 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Mar 20 21:31:51.744915 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Mar 20 21:31:51.744966 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Mar 20 21:31:51.745017 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Mar 20 21:31:51.745068 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Mar 20 21:31:51.745119 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Mar 20 21:31:51.745170 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Mar 20 21:31:51.745224 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Mar 20 21:31:51.745276 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Mar 20 21:31:51.745327 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Mar 20 21:31:51.745377 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Mar 20 21:31:51.745428 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Mar 20 21:31:51.745480 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Mar 20 21:31:51.745531 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Mar 20 21:31:51.745581 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Mar 20 21:31:51.745634 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Mar 20 21:31:51.745686 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Mar 20 21:31:51.745736 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Mar 20 21:31:51.745801 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Mar 20 21:31:51.745858 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Mar 20 21:31:51.745910 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Mar 20 21:31:51.745961 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Mar 20 21:31:51.746014 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Mar 20 21:31:51.746065 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Mar 20 21:31:51.746116 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Mar 20 21:31:51.746167 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Mar 20 21:31:51.746218 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Mar 20 21:31:51.746270 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Mar 20 21:31:51.746321 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Mar 20 21:31:51.746372 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Mar 20 21:31:51.746426 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Mar 20 21:31:51.746477 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Mar 20 21:31:51.746528 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Mar 20 21:31:51.746579 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Mar 20 21:31:51.746630 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Mar 20 21:31:51.746680 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Mar 20 21:31:51.746731 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Mar 20 21:31:51.746782 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Mar 20 21:31:51.746847 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Mar 20 21:31:51.746897 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Mar 20 21:31:51.746947 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Mar 20 21:31:51.747001 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Mar 20 21:31:51.747053 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Mar 20 21:31:51.747104 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Mar 20 21:31:51.747155 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Mar 20 21:31:51.747206 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Mar 20 21:31:51.747259 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Mar 20 21:31:51.747310 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Mar 20 21:31:51.747362 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Mar 20 21:31:51.747414 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Mar 20 21:31:51.747465 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Mar 20 21:31:51.747516 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Mar 20 21:31:51.747566 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Mar 20 21:31:51.747619 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Mar 20 21:31:51.747673 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Mar 20 21:31:51.747723 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Mar 20 21:31:51.747775 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Mar 20 21:31:51.747921 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Mar 20 21:31:51.747974 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Mar 20 21:31:51.748025 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Mar 20 21:31:51.748076 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Mar 20 21:31:51.748126 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Mar 20 21:31:51.748180 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Mar 20 21:31:51.748229 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Mar 20 21:31:51.748279 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Mar 20 21:31:51.748288 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Mar 20 21:31:51.748295 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Mar 20 21:31:51.748301 kernel: ACPI: PCI: Interrupt link LNKB disabled
Mar 20 21:31:51.748307 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 20 21:31:51.748313 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Mar 20 21:31:51.748320 kernel: iommu: Default domain type: Translated
Mar 20 21:31:51.748326 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 20 21:31:51.748332 kernel: PCI: Using ACPI for IRQ routing
Mar 20 21:31:51.748338 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 20 21:31:51.748344 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Mar 20 21:31:51.748350 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Mar 20 21:31:51.748399 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Mar 20 21:31:51.748449 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Mar 20 21:31:51.748498 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 20 21:31:51.748509 kernel: vgaarb: loaded
Mar 20 21:31:51.748515 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Mar 20 21:31:51.748521 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Mar 20 21:31:51.748527 kernel: clocksource: Switched to clocksource tsc-early
Mar 20 21:31:51.748532 kernel: VFS: Disk quotas dquot_6.6.0
Mar 20 21:31:51.748538 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 20 21:31:51.748544 kernel: pnp: PnP ACPI init
Mar 20 21:31:51.748596 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Mar 20 21:31:51.748646 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Mar 20 21:31:51.748693 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Mar 20 21:31:51.748743 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Mar 20 21:31:51.748832 kernel: pnp 00:06: [dma 2]
Mar 20 21:31:51.748886 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Mar 20 21:31:51.748933 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Mar 20 21:31:51.748980 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Mar 20 21:31:51.748990 kernel: pnp: PnP ACPI: found 8 devices
Mar 20 21:31:51.748997 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 20 21:31:51.749003 kernel: NET: Registered PF_INET protocol family
Mar 20 21:31:51.749009 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 20 21:31:51.749015 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 20 21:31:51.749021 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 20 21:31:51.749027 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 20 21:31:51.749033 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 20 21:31:51.749040 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 20 21:31:51.749046 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 20 21:31:51.749052 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 20 21:31:51.749057 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 20 21:31:51.749063 kernel: NET: Registered PF_XDP protocol family
Mar 20 21:31:51.749114 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Mar 20 21:31:51.749167 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 20 21:31:51.749218 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 20 21:31:51.749272 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 20 21:31:51.749323 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 20 21:31:51.749375 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Mar 20 21:31:51.749427 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Mar 20 21:31:51.749479 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Mar 20 21:31:51.749531 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Mar 20 21:31:51.749585 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Mar 20 21:31:51.749637 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Mar 20 21:31:51.749687 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Mar 20 21:31:51.749739 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Mar 20 21:31:51.749796 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Mar 20 21:31:51.749849 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Mar 20 21:31:51.749902 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Mar 20 21:31:51.749952 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Mar 20 21:31:51.750470 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Mar 20 21:31:51.750534 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Mar 20 21:31:51.750589 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Mar 20 21:31:51.750646 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Mar 20 21:31:51.750699 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Mar 20 21:31:51.750751 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Mar 20 21:31:51.750816 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref]
Mar 20 21:31:51.750870 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref]
Mar 20 21:31:51.750923 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Mar 20 21:31:51.750975 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Mar 20 21:31:51.751028 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Mar 20 21:31:51.751080 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Mar 20 21:31:51.751132 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Mar 20 21:31:51.751184 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Mar 20 21:31:51.751235 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Mar 20 21:31:51.751286 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Mar 
20 21:31:51.751338 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.751388 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.751441 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.751493 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.751544 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.751594 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.751645 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.751696 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.751746 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.751844 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.751900 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.751950 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752001 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752050 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752101 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752152 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752202 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752252 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752306 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752358 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752409 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752480 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752543 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752595 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752645 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752696 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752749 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752810 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752868 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.752930 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.752987 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.753051 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.753103 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.753154 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.753209 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.753259 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.753317 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.753375 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.753427 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.753478 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.753528 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.753601 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.753655 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.753706 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.753760 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Mar 20 21:31:51.753841 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.753894 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.753946 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754008 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754061 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754111 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754163 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754214 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754266 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754320 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754371 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754423 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754474 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754525 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754576 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754626 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754676 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754726 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754779 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754894 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.754946 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.754997 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.755047 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.755107 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.757534 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.757592 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.757646 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.757701 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.757751 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.757897 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Mar 20 21:31:51.757950 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Mar 20 21:31:51.758001 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Mar 20 21:31:51.758052 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Mar 20 21:31:51.758102 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Mar 20 21:31:51.758152 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Mar 20 21:31:51.758205 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Mar 20 21:31:51.758286 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Mar 20 21:31:51.758338 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Mar 20 21:31:51.758388 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Mar 20 21:31:51.758438 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Mar 20 21:31:51.758488 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Mar 20 21:31:51.758539 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Mar 20 21:31:51.758589 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Mar 20 21:31:51.758639 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Mar 20 21:31:51.758689 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Mar 20 
21:31:51.758743 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Mar 20 21:31:51.758805 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Mar 20 21:31:51.758859 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Mar 20 21:31:51.758910 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Mar 20 21:31:51.758961 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Mar 20 21:31:51.759011 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Mar 20 21:31:51.759062 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Mar 20 21:31:51.759113 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Mar 20 21:31:51.759163 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Mar 20 21:31:51.759216 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Mar 20 21:31:51.759269 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Mar 20 21:31:51.759319 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Mar 20 21:31:51.759369 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Mar 20 21:31:51.759418 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Mar 20 21:31:51.759469 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Mar 20 21:31:51.759522 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Mar 20 21:31:51.759573 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Mar 20 21:31:51.759623 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Mar 20 21:31:51.759674 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Mar 20 21:31:51.759727 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Mar 20 21:31:51.759779 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Mar 20 21:31:51.759846 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Mar 20 21:31:51.759897 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Mar 20 21:31:51.759949 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Mar 20 21:31:51.760008 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Mar 20 21:31:51.760059 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Mar 20 21:31:51.760111 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Mar 20 21:31:51.760162 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Mar 20 21:31:51.760213 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Mar 20 21:31:51.760264 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Mar 20 21:31:51.760315 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Mar 20 21:31:51.760366 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Mar 20 21:31:51.760417 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Mar 20 21:31:51.760468 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Mar 20 21:31:51.760521 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Mar 20 21:31:51.760571 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Mar 20 21:31:51.760622 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Mar 20 21:31:51.760673 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Mar 20 21:31:51.760723 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Mar 20 21:31:51.760775 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Mar 20 21:31:51.760844 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Mar 20 21:31:51.760896 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Mar 20 21:31:51.760948 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Mar 20 21:31:51.761003 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Mar 20 21:31:51.761053 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Mar 20 21:31:51.761104 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Mar 20 21:31:51.761155 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Mar 20 21:31:51.761205 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Mar 20 21:31:51.761256 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Mar 20 21:31:51.761306 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Mar 20 21:31:51.761358 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Mar 20 21:31:51.761573 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Mar 20 21:31:51.761742 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Mar 20 21:31:51.762114 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Mar 20 21:31:51.762172 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Mar 20 21:31:51.762227 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Mar 20 21:31:51.762279 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Mar 20 21:31:51.762330 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Mar 20 21:31:51.762382 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Mar 20 21:31:51.762432 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Mar 20 21:31:51.762483 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Mar 20 21:31:51.762534 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Mar 20 21:31:51.762588 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Mar 20 21:31:51.762640 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Mar 20 21:31:51.762690 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Mar 20 21:31:51.762741 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Mar 20 21:31:51.762824 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Mar 20 21:31:51.762878 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Mar 20 
21:31:51.762930 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Mar 20 21:31:51.762980 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Mar 20 21:31:51.763031 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Mar 20 21:31:51.763083 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Mar 20 21:31:51.763137 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Mar 20 21:31:51.763187 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Mar 20 21:31:51.763239 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Mar 20 21:31:51.763289 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Mar 20 21:31:51.763340 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Mar 20 21:31:51.763390 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Mar 20 21:31:51.763443 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Mar 20 21:31:51.763494 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Mar 20 21:31:51.763544 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Mar 20 21:31:51.763597 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Mar 20 21:31:51.764286 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Mar 20 21:31:51.764346 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Mar 20 21:31:51.764399 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Mar 20 21:31:51.764450 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Mar 20 21:31:51.764501 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Mar 20 21:31:51.764551 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Mar 20 21:31:51.764602 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Mar 20 21:31:51.764651 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Mar 20 21:31:51.764700 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Mar 20 21:31:51.764754 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Mar 20 21:31:51.764817 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Mar 20 21:31:51.764869 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Mar 20 21:31:51.764919 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Mar 20 21:31:51.764970 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Mar 20 21:31:51.765021 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Mar 20 21:31:51.765072 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Mar 20 21:31:51.765122 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Mar 20 21:31:51.765172 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Mar 20 21:31:51.765225 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Mar 20 21:31:51.765270 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Mar 20 21:31:51.765315 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Mar 20 21:31:51.765360 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Mar 20 21:31:51.765403 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Mar 20 21:31:51.765451 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Mar 20 21:31:51.765498 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Mar 20 21:31:51.765544 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Mar 20 21:31:51.765593 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Mar 20 21:31:51.765639 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Mar 20 21:31:51.765684 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Mar 20 21:31:51.765730 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Mar 20 21:31:51.765775 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Mar 20 21:31:51.768045 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Mar 20 21:31:51.768099 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Mar 20 21:31:51.768150 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Mar 20 21:31:51.768202 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Mar 20 21:31:51.768249 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Mar 20 21:31:51.768294 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Mar 20 21:31:51.768343 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Mar 20 21:31:51.768389 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Mar 20 21:31:51.768434 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Mar 20 21:31:51.768485 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Mar 20 21:31:51.768530 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Mar 20 21:31:51.768588 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Mar 20 21:31:51.768634 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Mar 20 21:31:51.768683 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Mar 20 21:31:51.768730 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Mar 20 21:31:51.768781 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Mar 20 21:31:51.768837 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Mar 20 21:31:51.768889 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Mar 20 21:31:51.768943 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Mar 20 21:31:51.768998 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Mar 20 21:31:51.769047 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Mar 20 21:31:51.769092 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Mar 20 21:31:51.769141 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Mar 20 21:31:51.769187 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Mar 20 21:31:51.769233 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Mar 20 21:31:51.769281 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Mar 20 21:31:51.769327 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Mar 20 21:31:51.769375 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Mar 20 21:31:51.769425 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Mar 20 21:31:51.769470 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Mar 20 21:31:51.769520 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Mar 20 21:31:51.769566 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Mar 20 21:31:51.769616 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Mar 20 21:31:51.769662 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Mar 20 21:31:51.769714 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Mar 20 21:31:51.769761 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Mar 20 21:31:51.770097 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Mar 20 21:31:51.770149 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Mar 20 21:31:51.770202 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Mar 20 21:31:51.770249 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Mar 20 21:31:51.770299 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Mar 20 21:31:51.770348 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Mar 20 21:31:51.770394 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Mar 20 21:31:51.770440 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Mar 20 21:31:51.770489 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Mar 20 21:31:51.770534 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Mar 20 21:31:51.770579 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Mar 20 21:31:51.770631 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Mar 20 21:31:51.770677 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Mar 20 21:31:51.770725 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Mar 20 21:31:51.770771 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Mar 20 21:31:51.770935 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Mar 20 21:31:51.770983 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Mar 20 21:31:51.771034 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Mar 20 21:31:51.771080 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Mar 20 21:31:51.771129 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Mar 20 21:31:51.771175 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Mar 20 21:31:51.771226 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Mar 20 21:31:51.771273 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Mar 20 21:31:51.771318 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Mar 20 21:31:51.771369 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Mar 20 21:31:51.771414 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Mar 20 21:31:51.771458 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Mar 20 21:31:51.771507 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Mar 20 21:31:51.771552 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Mar 20 21:31:51.771602 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Mar 20 21:31:51.771647 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Mar 20 21:31:51.771697 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Mar 20 21:31:51.771742 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Mar 20 21:31:51.771798 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Mar 20 21:31:51.771845 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Mar 20 21:31:51.771897 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Mar 20 21:31:51.771944 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Mar 20 21:31:51.772001 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Mar 20 21:31:51.772084 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Mar 20 21:31:51.772139 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Mar 20 21:31:51.772149 kernel: PCI: CLS 32 bytes, default 64 Mar 20 21:31:51.772155 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 20 21:31:51.772164 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Mar 20 21:31:51.772170 kernel: clocksource: Switched to clocksource tsc Mar 20 21:31:51.772176 kernel: Initialise system trusted keyrings Mar 20 21:31:51.772182 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 20 21:31:51.772188 kernel: Key type asymmetric registered Mar 20 21:31:51.772194 kernel: Asymmetric key parser 'x509' registered Mar 20 21:31:51.772200 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 20 21:31:51.772206 kernel: io scheduler mq-deadline registered Mar 20 21:31:51.772212 kernel: io scheduler kyber registered Mar 20 21:31:51.772219 kernel: io scheduler bfq registered Mar 20 21:31:51.772271 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Mar 20 21:31:51.772323 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.772374 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Mar 20 21:31:51.772424 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.772475 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Mar 20 21:31:51.772525 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.772576 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Mar 20 21:31:51.772628 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.772679 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Mar 20 21:31:51.772729 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.772778 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Mar 20 21:31:51.774032 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.774092 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Mar 20 21:31:51.774144 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.774195 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Mar 20 21:31:51.774244 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.774293 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Mar 20 21:31:51.774343 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.774396 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Mar 20 21:31:51.774446 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.774496 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Mar 20 21:31:51.774546 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.774596 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Mar 20 21:31:51.774645 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.774700 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Mar 20 21:31:51.774751 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.775831 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Mar 20 21:31:51.775897 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.775954 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Mar 20 21:31:51.776056 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.776113 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Mar 20 21:31:51.776166 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.776218 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Mar 20 21:31:51.776270 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.776322 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Mar 20 21:31:51.776373 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.776427 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Mar 20 21:31:51.776478 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.776529 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Mar 20 21:31:51.776579 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.776630 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Mar 20 21:31:51.776681 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.776734 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Mar 20 21:31:51.778245 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.778318 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Mar 20 21:31:51.778378 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.778448 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Mar 20 21:31:51.778506 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.778560 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Mar 20 21:31:51.778612 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.778665 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Mar 20 21:31:51.778717 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.778769 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Mar 20 21:31:51.778829 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.778886 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Mar 20 21:31:51.778938 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.778991 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Mar 20 21:31:51.779044 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.779097 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Mar 20 21:31:51.779151 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.779221 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Mar 20 21:31:51.779288 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.779341 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Mar 20 21:31:51.779393 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Mar 20 21:31:51.779404 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Mar 20 21:31:51.779411 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 20 21:31:51.779417 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 20 21:31:51.779423 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Mar 20 21:31:51.779430 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 20 21:31:51.779436 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 20 21:31:51.779489 kernel: rtc_cmos 00:01: registered as rtc0 Mar 20 21:31:51.779538 kernel: rtc_cmos 00:01: setting system clock to 2025-03-20T21:31:51 UTC (1742506311) Mar 20 21:31:51.779588 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Mar 20 21:31:51.779597 kernel: intel_pstate: CPU model not supported Mar 20 21:31:51.779604 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 20 21:31:51.779610 kernel: NET: Registered PF_INET6 protocol family Mar 20 21:31:51.779616 kernel: Segment Routing with IPv6 Mar 20 21:31:51.779622 kernel: In-situ OAM (IOAM) with IPv6 Mar 20 21:31:51.779629 kernel: NET: Registered PF_PACKET protocol family Mar 20 21:31:51.779635 kernel: Key type dns_resolver registered Mar 20 21:31:51.779642 kernel: IPI shorthand broadcast: enabled Mar 20 21:31:51.779650 kernel: sched_clock: Marking stable (855058991, 218163634)->(1127478028, -54255403) Mar 20 21:31:51.779656 kernel: registered taskstats version 1 Mar 20 21:31:51.779662 kernel: Loading compiled-in X.509 certificates Mar 20 21:31:51.779668 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 9e7923b67df1c6f0613bc4380f7ea8de9ce851ac' Mar 20 21:31:51.779675 kernel: Key type .fscrypt registered Mar 20 21:31:51.779682 kernel: Key type fscrypt-provisioning registered Mar 20 21:31:51.779688 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 20 21:31:51.779694 kernel: ima: Allocated hash algorithm: sha1 Mar 20 21:31:51.779700 kernel: ima: No architecture policies found Mar 20 21:31:51.779707 kernel: clk: Disabling unused clocks Mar 20 21:31:51.779713 kernel: Freeing unused kernel image (initmem) memory: 43592K Mar 20 21:31:51.779720 kernel: Write protecting the kernel read-only data: 40960k Mar 20 21:31:51.779727 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K Mar 20 21:31:51.779733 kernel: Run /init as init process Mar 20 21:31:51.779739 kernel: with arguments: Mar 20 21:31:51.779745 kernel: /init Mar 20 21:31:51.779751 kernel: with environment: Mar 20 21:31:51.779757 kernel: HOME=/ Mar 20 21:31:51.779764 kernel: TERM=linux Mar 20 21:31:51.779770 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 20 21:31:51.779777 systemd[1]: Successfully made /usr/ read-only. Mar 20 21:31:51.779798 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 20 21:31:51.779806 systemd[1]: Detected virtualization vmware. Mar 20 21:31:51.779812 systemd[1]: Detected architecture x86-64. Mar 20 21:31:51.779818 systemd[1]: Running in initrd. Mar 20 21:31:51.779824 systemd[1]: No hostname configured, using default hostname. Mar 20 21:31:51.779832 systemd[1]: Hostname set to . Mar 20 21:31:51.779838 systemd[1]: Initializing machine ID from random generator. Mar 20 21:31:51.779844 systemd[1]: Queued start job for default target initrd.target. Mar 20 21:31:51.779851 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 20 21:31:51.779857 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Mar 20 21:31:51.779864 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 20 21:31:51.779870 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 20 21:31:51.779878 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 20 21:31:51.779885 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 20 21:31:51.779892 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 20 21:31:51.779898 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 20 21:31:51.779905 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 20 21:31:51.779925 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 20 21:31:51.779931 systemd[1]: Reached target paths.target - Path Units. Mar 20 21:31:51.779939 systemd[1]: Reached target slices.target - Slice Units. Mar 20 21:31:51.779945 systemd[1]: Reached target swap.target - Swaps. Mar 20 21:31:51.779951 systemd[1]: Reached target timers.target - Timer Units. Mar 20 21:31:51.779962 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 20 21:31:51.779969 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 20 21:31:51.779975 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 20 21:31:51.779981 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 20 21:31:51.779988 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 20 21:31:51.779994 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 20 21:31:51.780002 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 20 21:31:51.780008 systemd[1]: Reached target sockets.target - Socket Units. Mar 20 21:31:51.780014 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 20 21:31:51.780021 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 20 21:31:51.780027 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 20 21:31:51.780034 systemd[1]: Starting systemd-fsck-usr.service... Mar 20 21:31:51.780040 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 20 21:31:51.780048 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 20 21:31:51.780054 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 20 21:31:51.780062 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 20 21:31:51.780082 systemd-journald[218]: Collecting audit messages is disabled. Mar 20 21:31:51.780099 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 20 21:31:51.780107 systemd[1]: Finished systemd-fsck-usr.service. Mar 20 21:31:51.780114 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 20 21:31:51.780121 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 20 21:31:51.780127 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 20 21:31:51.780134 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 20 21:31:51.780141 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 20 21:31:51.780148 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Mar 20 21:31:51.780154 kernel: Bridge firewalling registered Mar 20 21:31:51.780160 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 20 21:31:51.780167 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 20 21:31:51.780173 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 20 21:31:51.780180 systemd-journald[218]: Journal started Mar 20 21:31:51.780195 systemd-journald[218]: Runtime Journal (/run/log/journal/9b98b1d486e64b9f9d7328be60465662) is 4.8M, max 38.6M, 33.7M free. Mar 20 21:31:51.741988 systemd-modules-load[219]: Inserted module 'overlay' Mar 20 21:31:51.768571 systemd-modules-load[219]: Inserted module 'br_netfilter' Mar 20 21:31:51.784950 systemd[1]: Started systemd-journald.service - Journal Service. Mar 20 21:31:51.786434 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 20 21:31:51.787216 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 20 21:31:51.788665 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 20 21:31:51.790741 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 20 21:31:51.800057 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 20 21:31:51.800897 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Mar 20 21:31:51.806219 dracut-cmdline[249]: dracut-dracut-053 Mar 20 21:31:51.810095 dracut-cmdline[249]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019 Mar 20 21:31:51.828297 systemd-resolved[257]: Positive Trust Anchors: Mar 20 21:31:51.828489 systemd-resolved[257]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 20 21:31:51.828513 systemd-resolved[257]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 20 21:31:51.830847 systemd-resolved[257]: Defaulting to hostname 'linux'. Mar 20 21:31:51.831386 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 20 21:31:51.831511 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 20 21:31:51.852798 kernel: SCSI subsystem initialized Mar 20 21:31:51.858797 kernel: Loading iSCSI transport class v2.0-870. 
Mar 20 21:31:51.864796 kernel: iscsi: registered transport (tcp) Mar 20 21:31:51.877802 kernel: iscsi: registered transport (qla4xxx) Mar 20 21:31:51.877820 kernel: QLogic iSCSI HBA Driver Mar 20 21:31:51.896286 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 20 21:31:51.897067 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 20 21:31:51.915514 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 20 21:31:51.915545 kernel: device-mapper: uevent: version 1.0.3 Mar 20 21:31:51.915554 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 20 21:31:51.946795 kernel: raid6: avx2x4 gen() 49558 MB/s Mar 20 21:31:51.962795 kernel: raid6: avx2x2 gen() 55671 MB/s Mar 20 21:31:51.979917 kernel: raid6: avx2x1 gen() 46647 MB/s Mar 20 21:31:51.979933 kernel: raid6: using algorithm avx2x2 gen() 55671 MB/s Mar 20 21:31:51.997932 kernel: raid6: .... xor() 32899 MB/s, rmw enabled Mar 20 21:31:51.997957 kernel: raid6: using avx2x2 recovery algorithm Mar 20 21:31:52.010796 kernel: xor: automatically using best checksumming function avx Mar 20 21:31:52.096808 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 20 21:31:52.101411 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 20 21:31:52.102282 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 20 21:31:52.115599 systemd-udevd[436]: Using default interface naming scheme 'v255'. Mar 20 21:31:52.118392 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 20 21:31:52.120914 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 20 21:31:52.134051 dracut-pre-trigger[441]: rd.md=0: removing MD RAID activation Mar 20 21:31:52.147982 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Mar 20 21:31:52.148670 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 20 21:31:52.222017 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 20 21:31:52.223468 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 20 21:31:52.241178 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 20 21:31:52.241940 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 20 21:31:52.242330 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 20 21:31:52.242612 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 20 21:31:52.243565 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 20 21:31:52.253113 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 20 21:31:52.286795 kernel: VMware PVSCSI driver - version 1.0.7.0-k Mar 20 21:31:52.289092 kernel: vmw_pvscsi: using 64bit dma Mar 20 21:31:52.289110 kernel: vmw_pvscsi: max_id: 16 Mar 20 21:31:52.289121 kernel: vmw_pvscsi: setting ring_pages to 8 Mar 20 21:31:52.291838 kernel: vmw_pvscsi: enabling reqCallThreshold Mar 20 21:31:52.291855 kernel: vmw_pvscsi: driver-based request coalescing enabled Mar 20 21:31:52.291863 kernel: vmw_pvscsi: using MSI-X Mar 20 21:31:52.295966 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Mar 20 21:31:52.297795 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Mar 20 21:31:52.301092 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Mar 20 21:31:52.307799 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Mar 20 21:31:52.307824 kernel: libata version 3.00 loaded. 
Mar 20 21:31:52.310807 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Mar 20 21:31:52.313475 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Mar 20 21:31:52.315795 kernel: ata_piix 0000:00:07.1: version 2.13 Mar 20 21:31:52.321257 kernel: scsi host1: ata_piix Mar 20 21:31:52.321329 kernel: scsi host2: ata_piix Mar 20 21:31:52.321390 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Mar 20 21:31:52.321399 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Mar 20 21:31:52.328807 kernel: cryptd: max_cpu_qlen set to 1000 Mar 20 21:31:52.332797 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Mar 20 21:31:52.335516 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 20 21:31:52.335586 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 20 21:31:52.336838 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 20 21:31:52.337089 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 20 21:31:52.337165 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 20 21:31:52.337560 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 20 21:31:52.338855 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 20 21:31:52.352887 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 20 21:31:52.353465 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 20 21:31:52.373065 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Mar 20 21:31:52.488811 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Mar 20 21:31:52.492839 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Mar 20 21:31:52.503353 kernel: AVX2 version of gcm_enc/dec engaged. Mar 20 21:31:52.503385 kernel: AES CTR mode by8 optimization enabled Mar 20 21:31:52.513348 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Mar 20 21:31:52.519494 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 20 21:31:52.519566 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Mar 20 21:31:52.519629 kernel: sd 0:0:0:0: [sda] Cache data unavailable Mar 20 21:31:52.519691 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Mar 20 21:31:52.519751 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 20 21:31:52.519760 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 20 21:31:52.521806 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Mar 20 21:31:52.529647 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 20 21:31:52.529660 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Mar 20 21:31:52.560300 kernel: BTRFS: device fsid 48a514e8-9ecc-46c2-935b-caca347f921e devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (491) Mar 20 21:31:52.564797 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by (udev-worker) (485) Mar 20 21:31:52.570726 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Mar 20 21:31:52.576999 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Mar 20 21:31:52.581103 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Mar 20 21:31:52.581220 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Mar 20 21:31:52.586694 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. 
Mar 20 21:31:52.587231 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 20 21:31:52.646821 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 20 21:31:53.658884 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 20 21:31:53.659353 disk-uuid[595]: The operation has completed successfully. Mar 20 21:31:53.695318 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 20 21:31:53.695383 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 20 21:31:53.710728 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 20 21:31:53.724025 sh[611]: Success Mar 20 21:31:53.733803 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Mar 20 21:31:53.768125 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 20 21:31:53.769836 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 20 21:31:53.782700 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 20 21:31:53.797799 kernel: BTRFS info (device dm-0): first mount of filesystem 48a514e8-9ecc-46c2-935b-caca347f921e Mar 20 21:31:53.797829 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 20 21:31:53.797840 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 20 21:31:53.798148 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 20 21:31:53.799103 kernel: BTRFS info (device dm-0): using free space tree Mar 20 21:31:53.806798 kernel: BTRFS info (device dm-0): enabling ssd optimizations Mar 20 21:31:53.808602 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 20 21:31:53.809426 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Mar 20 21:31:53.811908 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Mar 20 21:31:53.840530 kernel: BTRFS info (device sda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e Mar 20 21:31:53.840567 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 20 21:31:53.840576 kernel: BTRFS info (device sda6): using free space tree Mar 20 21:31:53.847799 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 20 21:31:53.851819 kernel: BTRFS info (device sda6): last unmount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e Mar 20 21:31:53.854355 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 20 21:31:53.855463 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 20 21:31:53.882262 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Mar 20 21:31:53.885907 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 20 21:31:53.936830 ignition[667]: Ignition 2.20.0 Mar 20 21:31:53.936840 ignition[667]: Stage: fetch-offline Mar 20 21:31:53.936857 ignition[667]: no configs at "/usr/lib/ignition/base.d" Mar 20 21:31:53.936862 ignition[667]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Mar 20 21:31:53.936921 ignition[667]: parsed url from cmdline: "" Mar 20 21:31:53.936925 ignition[667]: no config URL provided Mar 20 21:31:53.936928 ignition[667]: reading system config file "/usr/lib/ignition/user.ign" Mar 20 21:31:53.936932 ignition[667]: no config at "/usr/lib/ignition/user.ign" Mar 20 21:31:53.937331 ignition[667]: config successfully fetched Mar 20 21:31:53.937349 ignition[667]: parsing config with SHA512: 156d68f0364c6b8d5792727129b589eee1e7018d7d6f13952acff34e6510542afeff24d13728729a86a528b0660d9bcfe62118dadd8ef844278501d0e44ab05e Mar 20 21:31:53.940576 unknown[667]: fetched base config from "system" Mar 20 21:31:53.940583 unknown[667]: fetched user config from "vmware" Mar 20 21:31:53.940810 ignition[667]: fetch-offline: 
fetch-offline passed Mar 20 21:31:53.940851 ignition[667]: Ignition finished successfully Mar 20 21:31:53.941994 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 20 21:31:53.961419 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 20 21:31:53.962455 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 20 21:31:53.987018 systemd-networkd[801]: lo: Link UP Mar 20 21:31:53.987024 systemd-networkd[801]: lo: Gained carrier Mar 20 21:31:53.987834 systemd-networkd[801]: Enumeration completed Mar 20 21:31:53.987984 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 20 21:31:53.988078 systemd-networkd[801]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Mar 20 21:31:53.988128 systemd[1]: Reached target network.target - Network. Mar 20 21:31:53.988208 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 20 21:31:53.992012 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Mar 20 21:31:53.992129 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Mar 20 21:31:53.991652 systemd-networkd[801]: ens192: Link UP Mar 20 21:31:53.991654 systemd-networkd[801]: ens192: Gained carrier Mar 20 21:31:53.991871 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 20 21:31:54.003977 ignition[804]: Ignition 2.20.0 Mar 20 21:31:54.003989 ignition[804]: Stage: kargs Mar 20 21:31:54.004113 ignition[804]: no configs at "/usr/lib/ignition/base.d" Mar 20 21:31:54.004120 ignition[804]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Mar 20 21:31:54.004657 ignition[804]: kargs: kargs passed Mar 20 21:31:54.004681 ignition[804]: Ignition finished successfully Mar 20 21:31:54.006129 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Mar 20 21:31:54.006894 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 20 21:31:54.028370 ignition[811]: Ignition 2.20.0 Mar 20 21:31:54.028392 ignition[811]: Stage: disks Mar 20 21:31:54.028551 ignition[811]: no configs at "/usr/lib/ignition/base.d" Mar 20 21:31:54.028560 ignition[811]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Mar 20 21:31:54.029241 ignition[811]: disks: disks passed Mar 20 21:31:54.029284 ignition[811]: Ignition finished successfully Mar 20 21:31:54.029929 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 20 21:31:54.030443 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 20 21:31:54.030595 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 20 21:31:54.030827 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 20 21:31:54.031042 systemd[1]: Reached target sysinit.target - System Initialization. Mar 20 21:31:54.031243 systemd[1]: Reached target basic.target - Basic System. Mar 20 21:31:54.032018 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 20 21:31:54.052714 systemd-fsck[819]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 20 21:31:54.054304 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 20 21:31:54.055466 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 20 21:31:54.124835 kernel: EXT4-fs (sda9): mounted filesystem 79cdbe74-6884-4c57-b04d-c9a431509f16 r/w with ordered data mode. Quota mode: none. Mar 20 21:31:54.124983 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 20 21:31:54.125436 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 20 21:31:54.126409 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 20 21:31:54.128736 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Mar 20 21:31:54.129035 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 20 21:31:54.129064 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 20 21:31:54.129080 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 20 21:31:54.136889 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 20 21:31:54.137833 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 20 21:31:54.144799 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (827) Mar 20 21:31:54.147050 kernel: BTRFS info (device sda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e Mar 20 21:31:54.147076 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 20 21:31:54.147085 kernel: BTRFS info (device sda6): using free space tree Mar 20 21:31:54.150809 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 20 21:31:54.151524 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 20 21:31:54.166665 initrd-setup-root[851]: cut: /sysroot/etc/passwd: No such file or directory Mar 20 21:31:54.169280 initrd-setup-root[858]: cut: /sysroot/etc/group: No such file or directory Mar 20 21:31:54.171211 initrd-setup-root[865]: cut: /sysroot/etc/shadow: No such file or directory Mar 20 21:31:54.173578 initrd-setup-root[872]: cut: /sysroot/etc/gshadow: No such file or directory Mar 20 21:31:54.225728 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 20 21:31:54.226493 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 20 21:31:54.228866 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Mar 20 21:31:54.242801 kernel: BTRFS info (device sda6): last unmount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e
Mar 20 21:31:54.256956 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 20 21:31:54.260835 ignition[940]: INFO : Ignition 2.20.0
Mar 20 21:31:54.261055 ignition[940]: INFO : Stage: mount
Mar 20 21:31:54.261274 ignition[940]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 21:31:54.261429 ignition[940]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 20 21:31:54.262123 ignition[940]: INFO : mount: mount passed
Mar 20 21:31:54.262253 ignition[940]: INFO : Ignition finished successfully
Mar 20 21:31:54.262906 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 20 21:31:54.263519 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 20 21:31:54.794631 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 20 21:31:54.795513 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 20 21:31:54.823864 kernel: BTRFS: device label OEM devid 1 transid 18 /dev/sda6 scanned by mount (951)
Mar 20 21:31:54.825822 kernel: BTRFS info (device sda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e
Mar 20 21:31:54.825853 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 21:31:54.827868 kernel: BTRFS info (device sda6): using free space tree
Mar 20 21:31:54.831801 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 20 21:31:54.832429 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 20 21:31:54.850640 ignition[968]: INFO : Ignition 2.20.0
Mar 20 21:31:54.850980 ignition[968]: INFO : Stage: files
Mar 20 21:31:54.850980 ignition[968]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 21:31:54.850980 ignition[968]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 20 21:31:54.851575 ignition[968]: DEBUG : files: compiled without relabeling support, skipping
Mar 20 21:31:54.852209 ignition[968]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 20 21:31:54.852209 ignition[968]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 20 21:31:54.854389 ignition[968]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 20 21:31:54.854588 ignition[968]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 20 21:31:54.854752 ignition[968]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 20 21:31:54.854721 unknown[968]: wrote ssh authorized keys file for user: core
Mar 20 21:31:54.856058 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 20 21:31:54.856328 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 20 21:31:54.923630 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 20 21:31:55.063441 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 20 21:31:55.063441 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 20 21:31:55.064124 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 20 21:31:55.064124 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 20 21:31:55.064124 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 20 21:31:55.064124 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 20 21:31:55.064124 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 20 21:31:55.064124 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 20 21:31:55.064124 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 20 21:31:55.065681 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 20 21:31:55.065681 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 20 21:31:55.065681 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 20 21:31:55.065681 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 20 21:31:55.065681 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 20 21:31:55.065681 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1
Mar 20 21:31:55.552588 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 20 21:31:55.914954 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 20 21:31:55.914954 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Mar 20 21:31:55.915610 ignition[968]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Mar 20 21:31:55.961886 systemd-networkd[801]: ens192: Gained IPv6LL
Mar 20 21:31:55.980993 ignition[968]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Mar 20 21:31:55.984413 ignition[968]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Mar 20 21:31:55.984608 ignition[968]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Mar 20 21:31:55.984608 ignition[968]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Mar 20 21:31:55.984608 ignition[968]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Mar 20 21:31:55.985111 ignition[968]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 20 21:31:55.985111 ignition[968]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 20 21:31:55.985111 ignition[968]: INFO : files: files passed
Mar 20 21:31:55.985111 ignition[968]: INFO : Ignition finished successfully
Mar 20 21:31:55.985562 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 20 21:31:55.986589 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 20 21:31:55.988856 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 20 21:31:55.999468 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 20 21:31:55.999686 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
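The Ignition "files" stage above writes files, a symlink, and systemd units, then sets unit presets. The config that drove it was supplied by the VMware platform and is not shown in the log; a hypothetical Butane fragment (transpiled to Ignition JSON with the `butane` tool) that would produce a similar sequence of operations might look roughly like:

```yaml
# Hypothetical sketch only -- reconstructed from the logged operations,
# not the actual provisioning config used on this host.
variant: flatcar
version: 1.0.0
storage:
  files:
    - path: /opt/helm-v3.13.2-linux-amd64.tar.gz          # op(3) in the log
      contents:
        source: https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz
    - path: /opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw  # op(a)
      contents:
        source: https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw
  links:
    - path: /etc/extensions/kubernetes.raw                # op(9)
      target: /opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw
systemd:
  units:
    - name: prepare-helm.service                          # op(c), preset enabled in op(12)
      enabled: true
    - name: coreos-metadata.service                       # op(e), preset disabled in op(10)
      enabled: false
```

The remaining files in the log (install.sh, nginx.yaml, nfs-pod.yaml, nfs-pvc.yaml, update.conf, 00-vmware.network) would be additional `storage.files` entries of the same shape.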
Mar 20 21:31:56.001924 initrd-setup-root-after-ignition[1000]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 21:31:56.001924 initrd-setup-root-after-ignition[1000]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 21:31:56.003182 initrd-setup-root-after-ignition[1004]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 21:31:56.004047 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 20 21:31:56.004514 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 20 21:31:56.005285 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 20 21:31:56.030112 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 20 21:31:56.030184 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 20 21:31:56.030457 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 20 21:31:56.030579 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 20 21:31:56.030771 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 20 21:31:56.031236 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 20 21:31:56.042332 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 20 21:31:56.043251 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 20 21:31:56.053248 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 20 21:31:56.053404 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 21:31:56.053672 systemd[1]: Stopped target timers.target - Timer Units.
Mar 20 21:31:56.053867 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 20 21:31:56.053930 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 20 21:31:56.054317 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 20 21:31:56.054473 systemd[1]: Stopped target basic.target - Basic System.
Mar 20 21:31:56.054649 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 20 21:31:56.054841 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 20 21:31:56.055043 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 20 21:31:56.055252 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 20 21:31:56.055594 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 20 21:31:56.055793 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 20 21:31:56.055996 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 20 21:31:56.056183 systemd[1]: Stopped target swap.target - Swaps.
Mar 20 21:31:56.056335 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 20 21:31:56.056399 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 20 21:31:56.056646 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 20 21:31:56.056918 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 21:31:56.057108 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 20 21:31:56.057148 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 21:31:56.057313 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 20 21:31:56.057372 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 20 21:31:56.057642 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 20 21:31:56.057718 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 20 21:31:56.057949 systemd[1]: Stopped target paths.target - Path Units.
Mar 20 21:31:56.058098 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 20 21:31:56.061890 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 21:31:56.062126 systemd[1]: Stopped target slices.target - Slice Units.
Mar 20 21:31:56.062395 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 20 21:31:56.062637 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 20 21:31:56.062725 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 20 21:31:56.063013 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 20 21:31:56.063072 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 20 21:31:56.063400 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 20 21:31:56.063483 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 20 21:31:56.063818 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 20 21:31:56.063897 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 20 21:31:56.064738 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 20 21:31:56.064875 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 20 21:31:56.064980 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 21:31:56.066906 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 20 21:31:56.067048 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 20 21:31:56.067156 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 21:31:56.067388 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 20 21:31:56.067486 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 20 21:31:56.071902 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 20 21:31:56.071953 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 20 21:31:56.080709 ignition[1024]: INFO : Ignition 2.20.0
Mar 20 21:31:56.081110 ignition[1024]: INFO : Stage: umount
Mar 20 21:31:56.081339 ignition[1024]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 21:31:56.081472 ignition[1024]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Mar 20 21:31:56.081891 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 20 21:31:56.082296 ignition[1024]: INFO : umount: umount passed
Mar 20 21:31:56.082435 ignition[1024]: INFO : Ignition finished successfully
Mar 20 21:31:56.083542 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 20 21:31:56.083762 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 20 21:31:56.084143 systemd[1]: Stopped target network.target - Network.
Mar 20 21:31:56.084241 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 20 21:31:56.084282 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 20 21:31:56.084409 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 20 21:31:56.084485 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 20 21:31:56.084623 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 20 21:31:56.084650 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 20 21:31:56.084752 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 20 21:31:56.084782 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 20 21:31:56.084974 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 20 21:31:56.086428 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 20 21:31:56.091000 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 20 21:31:56.091072 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 20 21:31:56.092694 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 20 21:31:56.093378 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 20 21:31:56.093410 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 21:31:56.094373 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 20 21:31:56.094515 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 20 21:31:56.094577 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 20 21:31:56.095401 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 20 21:31:56.095655 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 20 21:31:56.095684 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 21:31:56.096327 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 20 21:31:56.096432 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 20 21:31:56.096459 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 20 21:31:56.096599 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Mar 20 21:31:56.096624 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Mar 20 21:31:56.096752 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 20 21:31:56.096775 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 20 21:31:56.096956 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 20 21:31:56.096978 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 20 21:31:56.097354 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 21:31:56.098050 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 20 21:31:56.110140 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 20 21:31:56.110355 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 21:31:56.110854 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 20 21:31:56.110893 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 20 21:31:56.111256 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 20 21:31:56.111278 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 21:31:56.111392 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 20 21:31:56.111419 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 20 21:31:56.111605 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 20 21:31:56.111631 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 20 21:31:56.111773 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 20 21:31:56.112828 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 21:31:56.114850 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 20 21:31:56.114964 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 20 21:31:56.114996 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 21:31:56.115243 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 21:31:56.115269 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 21:31:56.115591 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 20 21:31:56.115642 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 20 21:31:56.120944 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 20 21:31:56.121014 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 20 21:31:56.139377 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 20 21:31:56.139453 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 20 21:31:56.139817 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 20 21:31:56.139962 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 20 21:31:56.139998 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 20 21:31:56.140743 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 20 21:31:56.154935 systemd[1]: Switching root.
Mar 20 21:31:56.186374 systemd-journald[218]: Journal stopped
Mar 20 21:31:57.657946 systemd-journald[218]: Received SIGTERM from PID 1 (systemd).
Mar 20 21:31:57.657968 kernel: SELinux: policy capability network_peer_controls=1
Mar 20 21:31:57.657977 kernel: SELinux: policy capability open_perms=1
Mar 20 21:31:57.657983 kernel: SELinux: policy capability extended_socket_class=1
Mar 20 21:31:57.657988 kernel: SELinux: policy capability always_check_network=0
Mar 20 21:31:57.657996 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 20 21:31:57.658009 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 20 21:31:57.658019 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 20 21:31:57.658028 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 20 21:31:57.658038 systemd[1]: Successfully loaded SELinux policy in 35.537ms.
Mar 20 21:31:57.658049 kernel: audit: type=1403 audit(1742506316.884:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 20 21:31:57.658056 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.992ms.
Mar 20 21:31:57.658064 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 20 21:31:57.658072 systemd[1]: Detected virtualization vmware.
Mar 20 21:31:57.658079 systemd[1]: Detected architecture x86-64.
Mar 20 21:31:57.658085 systemd[1]: Detected first boot.
Mar 20 21:31:57.658092 systemd[1]: Initializing machine ID from random generator.
Mar 20 21:31:57.658099 zram_generator::config[1069]: No configuration found.
Mar 20 21:31:57.658183 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Mar 20 21:31:57.658194 kernel: Guest personality initialized and is active
Mar 20 21:31:57.658200 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Mar 20 21:31:57.658206 kernel: Initialized host personality
Mar 20 21:31:57.658212 kernel: NET: Registered PF_VSOCK protocol family
Mar 20 21:31:57.658219 systemd[1]: Populated /etc with preset unit settings.
Mar 20 21:31:57.658229 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 20 21:31:57.658236 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Mar 20 21:31:57.658243 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 20 21:31:57.658249 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 20 21:31:57.658256 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 20 21:31:57.658262 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 20 21:31:57.658270 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 20 21:31:57.658278 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 20 21:31:57.658285 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 20 21:31:57.658292 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 20 21:31:57.662616 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 20 21:31:57.662627 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 20 21:31:57.662634 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 20 21:31:57.662641 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 20 21:31:57.662648 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 21:31:57.662659 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 21:31:57.662668 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 20 21:31:57.662675 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 20 21:31:57.662683 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 20 21:31:57.662690 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 20 21:31:57.662697 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 20 21:31:57.662703 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 21:31:57.662721 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 20 21:31:57.662732 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 20 21:31:57.662739 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 20 21:31:57.662746 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 20 21:31:57.662753 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 21:31:57.662760 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 20 21:31:57.662766 systemd[1]: Reached target slices.target - Slice Units.
Mar 20 21:31:57.662773 systemd[1]: Reached target swap.target - Swaps.
Mar 20 21:31:57.662780 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 20 21:31:57.662827 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 20 21:31:57.662837 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 20 21:31:57.662851 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 21:31:57.662860 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 20 21:31:57.662869 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 21:31:57.662876 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 20 21:31:57.662883 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 20 21:31:57.662891 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 20 21:31:57.662898 systemd[1]: Mounting media.mount - External Media Directory...
Mar 20 21:31:57.662911 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 21:31:57.662920 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 20 21:31:57.662927 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 20 21:31:57.662936 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 20 21:31:57.662943 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 20 21:31:57.662951 systemd[1]: Reached target machines.target - Containers.
Mar 20 21:31:57.662958 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 20 21:31:57.662964 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Mar 20 21:31:57.662971 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 20 21:31:57.662978 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 20 21:31:57.662986 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 21:31:57.662994 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 20 21:31:57.663003 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 21:31:57.663010 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 20 21:31:57.663017 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 21:31:57.663024 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 20 21:31:57.663031 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 20 21:31:57.663038 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 20 21:31:57.663045 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 20 21:31:57.663053 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 20 21:31:57.663061 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 21:31:57.663069 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 20 21:31:57.663076 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 20 21:31:57.663083 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 20 21:31:57.663089 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 20 21:31:57.663096 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 20 21:31:57.663103 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 20 21:31:57.663110 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 20 21:31:57.663119 systemd[1]: Stopped verity-setup.service.
Mar 20 21:31:57.663126 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 21:31:57.663133 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 20 21:31:57.663140 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 20 21:31:57.663147 systemd[1]: Mounted media.mount - External Media Directory.
Mar 20 21:31:57.663155 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 20 21:31:57.663162 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 20 21:31:57.663168 kernel: fuse: init (API version 7.39)
Mar 20 21:31:57.663175 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 20 21:31:57.663184 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 21:31:57.663191 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 20 21:31:57.663197 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 20 21:31:57.663205 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 21:31:57.663212 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 21:31:57.663218 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 21:31:57.663225 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 21:31:57.663232 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 20 21:31:57.663240 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 20 21:31:57.663248 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 20 21:31:57.663255 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 20 21:31:57.663262 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 20 21:31:57.663268 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 20 21:31:57.663276 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 20 21:31:57.663283 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 20 21:31:57.663289 kernel: ACPI: bus type drm_connector registered
Mar 20 21:31:57.663297 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 20 21:31:57.663308 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 20 21:31:57.663316 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 20 21:31:57.663324 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 20 21:31:57.663332 kernel: loop: module loaded
Mar 20 21:31:57.663339 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 20 21:31:57.663346 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 20 21:31:57.663355 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 21:31:57.663382 systemd-journald[1167]: Collecting audit messages is disabled.
Mar 20 21:31:57.663400 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 20 21:31:57.663408 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 20 21:31:57.663417 systemd-journald[1167]: Journal started
Mar 20 21:31:57.663432 systemd-journald[1167]: Runtime Journal (/run/log/journal/c18d25fbe9ac4254a0d7635d6731e5bd) is 4.8M, max 38.6M, 33.7M free.
Mar 20 21:31:57.411466 systemd[1]: Queued start job for default target multi-user.target.
Mar 20 21:31:57.421316 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 20 21:31:57.421585 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 20 21:31:57.675664 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 20 21:31:57.675698 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 20 21:31:57.675709 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 20 21:31:57.675958 jq[1140]: true
Mar 20 21:31:57.676502 jq[1186]: true
Mar 20 21:31:57.688886 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 20 21:31:57.686694 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 20 21:31:57.687269 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 20 21:31:57.687820 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 20 21:31:57.688123 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 21:31:57.688573 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 21:31:57.689972 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 20 21:31:57.690159 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 20 21:31:57.690464 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 20 21:31:57.721893 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 20 21:31:57.722076 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 20 21:31:57.725973 kernel: loop0: detected capacity change from 0 to 151640
Mar 20 21:31:57.725874 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 20 21:31:57.727681 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 20 21:31:57.728275 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 20 21:31:57.730696 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 20 21:31:57.738559 ignition[1187]: Ignition 2.20.0
Mar 20 21:31:57.738746 ignition[1187]: deleting config from guestinfo properties
Mar 20 21:31:57.773087 systemd-journald[1167]: Time spent on flushing to /var/log/journal/c18d25fbe9ac4254a0d7635d6731e5bd is 39.059ms for 1852 entries.
Mar 20 21:31:57.773087 systemd-journald[1167]: System Journal (/var/log/journal/c18d25fbe9ac4254a0d7635d6731e5bd) is 8M, max 584.8M, 576.8M free.
Mar 20 21:31:57.825241 systemd-journald[1167]: Received client request to flush runtime journal.
Mar 20 21:31:57.776221 ignition[1187]: Successfully deleted config
Mar 20 21:31:57.783778 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 20 21:31:57.785109 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Mar 20 21:31:57.826108 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 20 21:31:57.830395 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 21:31:57.831764 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 20 21:31:57.847341 udevadm[1237]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Mar 20 21:31:57.877601 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 20 21:31:57.897655 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 20 21:31:57.902903 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 20 21:31:57.904516 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 20 21:31:57.926868 kernel: loop1: detected capacity change from 0 to 205544
Mar 20 21:31:57.977339 systemd-tmpfiles[1242]: ACLs are not supported, ignoring.
Mar 20 21:31:57.977352 systemd-tmpfiles[1242]: ACLs are not supported, ignoring.
Mar 20 21:31:57.980950 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 21:31:58.017836 kernel: loop2: detected capacity change from 0 to 109808
Mar 20 21:31:58.121874 kernel: loop3: detected capacity change from 0 to 2960
Mar 20 21:31:58.237983 kernel: loop4: detected capacity change from 0 to 151640
Mar 20 21:31:58.301950 kernel: loop5: detected capacity change from 0 to 205544
Mar 20 21:31:58.357964 kernel: loop6: detected capacity change from 0 to 109808
Mar 20 21:31:58.422657 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 20 21:31:58.471905 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 20 21:31:58.474888 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 21:31:58.478818 kernel: loop7: detected capacity change from 0 to 2960
Mar 20 21:31:58.492075 (sd-merge)[1248]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Mar 20 21:31:58.492499 (sd-merge)[1248]: Merged extensions into '/usr'.
Mar 20 21:31:58.498734 systemd[1]: Reload requested from client PID 1195 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 20 21:31:58.498744 systemd[1]: Reloading...
Mar 20 21:31:58.502523 systemd-udevd[1250]: Using default interface naming scheme 'v255'.
Mar 20 21:31:58.558801 zram_generator::config[1281]: No configuration found.
Mar 20 21:31:58.678413 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 20 21:31:58.701974 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Mar 20 21:31:58.712043 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 20 21:31:58.717801 kernel: ACPI: button: Power Button [PWRF]
Mar 20 21:31:58.716260 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 21:31:58.749369 ldconfig[1191]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 20 21:31:58.758814 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1314)
Mar 20 21:31:58.785440 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 20 21:31:58.785536 systemd[1]: Reloading finished in 286 ms.
Mar 20 21:31:58.797236 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 21:31:58.798126 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 20 21:31:58.798713 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 20 21:31:58.823963 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Mar 20 21:31:58.828986 systemd[1]: Starting ensure-sysext.service...
Mar 20 21:31:58.830918 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 20 21:31:58.834866 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 20 21:31:58.836912 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 20 21:31:58.853035 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 20 21:31:58.859637 systemd[1]: Reload requested from client PID 1361 ('systemctl') (unit ensure-sysext.service)...
Mar 20 21:31:58.859738 systemd[1]: Reloading...
Mar 20 21:31:58.871415 systemd-tmpfiles[1364]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 20 21:31:58.871617 systemd-tmpfiles[1364]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 20 21:31:58.872217 systemd-tmpfiles[1364]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 20 21:31:58.872389 systemd-tmpfiles[1364]: ACLs are not supported, ignoring.
Mar 20 21:31:58.872425 systemd-tmpfiles[1364]: ACLs are not supported, ignoring.
Mar 20 21:31:58.880811 systemd-tmpfiles[1364]: Detected autofs mount point /boot during canonicalization of boot.
Mar 20 21:31:58.880826 systemd-tmpfiles[1364]: Skipping /boot
Mar 20 21:31:58.883812 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input4
Mar 20 21:31:58.904246 systemd-tmpfiles[1364]: Detected autofs mount point /boot during canonicalization of boot.
Mar 20 21:31:58.904252 systemd-tmpfiles[1364]: Skipping /boot
Mar 20 21:31:58.913276 (udev-worker)[1315]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Mar 20 21:31:58.940803 zram_generator::config[1403]: No configuration found.
Mar 20 21:31:58.973864 kernel: mousedev: PS/2 mouse device common for all mice
Mar 20 21:31:58.976404 systemd-networkd[1363]: lo: Link UP
Mar 20 21:31:58.976610 systemd-networkd[1363]: lo: Gained carrier
Mar 20 21:31:58.977721 systemd-networkd[1363]: Enumeration completed
Mar 20 21:31:58.978198 systemd-networkd[1363]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Mar 20 21:31:58.980404 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Mar 20 21:31:58.980538 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Mar 20 21:31:58.980995 systemd-networkd[1363]: ens192: Link UP
Mar 20 21:31:58.981135 systemd-networkd[1363]: ens192: Gained carrier
Mar 20 21:31:59.028016 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 20 21:31:59.048452 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 21:31:59.110532 systemd[1]: Reloading finished in 250 ms.
Mar 20 21:31:59.125079 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 20 21:31:59.125334 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 20 21:31:59.131245 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 20 21:31:59.131601 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 21:31:59.142434 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 21:31:59.143590 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 20 21:31:59.149613 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 20 21:31:59.152530 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 21:31:59.153323 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 21:31:59.155490 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 21:31:59.155700 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 21:31:59.155775 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 21:31:59.156929 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 20 21:31:59.158126 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 20 21:31:59.160428 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 20 21:31:59.164620 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 20 21:31:59.169716 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 20 21:31:59.171930 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 21:31:59.172062 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 21:31:59.174824 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 20 21:31:59.175281 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 21:31:59.175403 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 21:31:59.175758 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 21:31:59.176035 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 21:31:59.177078 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 21:31:59.177187 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 21:31:59.186238 systemd[1]: Finished ensure-sysext.service.
Mar 20 21:31:59.189118 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 21:31:59.192773 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 20 21:31:59.202961 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 21:31:59.208385 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 20 21:31:59.217496 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 21:31:59.224559 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 21:31:59.224739 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 21:31:59.224763 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 21:31:59.228913 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 20 21:31:59.229040 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 21:31:59.230353 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 20 21:31:59.234030 lvm[1485]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 20 21:31:59.230618 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 20 21:31:59.234321 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 20 21:31:59.238199 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 20 21:31:59.242793 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 21:31:59.243090 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 21:31:59.243839 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 20 21:31:59.244826 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 20 21:31:59.248329 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 21:31:59.248442 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 21:31:59.248632 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 20 21:31:59.254230 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 21:31:59.254377 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 21:31:59.254604 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 20 21:31:59.260309 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 20 21:31:59.270539 augenrules[1516]: No rules
Mar 20 21:31:59.271221 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 20 21:31:59.271848 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 20 21:31:59.272822 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 20 21:31:59.273302 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 20 21:31:59.277000 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 20 21:31:59.295165 lvm[1523]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 20 21:31:59.294836 systemd-resolved[1472]: Positive Trust Anchors:
Mar 20 21:31:59.294845 systemd-resolved[1472]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 20 21:31:59.294869 systemd-resolved[1472]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 20 21:31:59.295365 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 20 21:31:59.295553 systemd[1]: Reached target time-set.target - System Time Set.
Mar 20 21:31:59.305220 systemd-resolved[1472]: Defaulting to hostname 'linux'.
Mar 20 21:31:59.307170 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 20 21:31:59.307373 systemd[1]: Reached target network.target - Network.
Mar 20 21:31:59.307455 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 20 21:31:59.322712 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 20 21:31:59.341813 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 21:31:59.347074 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 20 21:31:59.347330 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 20 21:31:59.347356 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 20 21:31:59.347524 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 20 21:31:59.347653 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 20 21:31:59.347871 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 20 21:31:59.348018 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 20 21:31:59.348129 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 20 21:31:59.348234 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 20 21:31:59.348257 systemd[1]: Reached target paths.target - Path Units.
Mar 20 21:31:59.348343 systemd[1]: Reached target timers.target - Timer Units.
Mar 20 21:31:59.349305 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 20 21:31:59.350409 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 20 21:31:59.352181 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 20 21:31:59.352409 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 20 21:31:59.352542 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 20 21:31:59.355160 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 20 21:31:59.355509 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 20 21:31:59.356089 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 20 21:31:59.356247 systemd[1]: Reached target sockets.target - Socket Units.
Mar 20 21:31:59.356341 systemd[1]: Reached target basic.target - Basic System.
Mar 20 21:31:59.356461 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 20 21:31:59.356481 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 20 21:31:59.357271 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 20 21:31:59.359855 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 20 21:31:59.360924 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 20 21:31:59.361871 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 20 21:31:59.361989 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 20 21:31:59.363970 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 20 21:31:59.365976 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 20 21:31:59.371556 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 20 21:31:59.372934 jq[1534]: false
Mar 20 21:31:59.373355 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 20 21:31:59.381893 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 20 21:31:59.382489 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 20 21:31:59.383005 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 20 21:31:59.386843 systemd[1]: Starting update-engine.service - Update Engine...
Mar 20 21:31:59.392343 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 20 21:31:59.396759 jq[1550]: true
Mar 20 21:31:59.397876 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Mar 20 21:31:59.399856 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 20 21:31:59.400863 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 20 21:31:59.401058 systemd[1]: motdgen.service: Deactivated successfully.
Mar 20 21:31:59.401176 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 20 21:31:59.407698 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 20 21:31:59.407852 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 20 21:31:59.413780 extend-filesystems[1535]: Found loop4
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found loop5
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found loop6
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found loop7
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found sda
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found sda1
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found sda2
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found sda3
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found usr
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found sda4
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found sda6
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found sda7
Mar 20 21:31:59.419078 extend-filesystems[1535]: Found sda9
Mar 20 21:31:59.419078 extend-filesystems[1535]: Checking size of /dev/sda9
Mar 20 21:31:59.420057 (ntainerd)[1556]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 20 21:31:59.426628 update_engine[1544]: I20250320 21:31:59.426425 1544 main.cc:92] Flatcar Update Engine starting
Mar 20 21:31:59.428815 jq[1555]: true
Mar 20 21:31:59.437740 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Mar 20 21:31:59.444260 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Mar 20 21:31:59.450652 extend-filesystems[1535]: Old size kept for /dev/sda9
Mar 20 21:31:59.450652 extend-filesystems[1535]: Found sr0
Mar 20 21:31:59.455428 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 20 21:31:59.455588 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 20 21:31:59.459698 tar[1554]: linux-amd64/helm
Mar 20 21:31:59.474653 dbus-daemon[1533]: [system] SELinux support is enabled
Mar 20 21:31:59.478234 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 20 21:31:59.481025 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 20 21:31:59.481045 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 20 21:31:59.481658 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 20 21:31:59.481669 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 20 21:31:59.490022 systemd[1]: Started update-engine.service - Update Engine.
Mar 20 21:31:59.490261 update_engine[1544]: I20250320 21:31:59.490085 1544 update_check_scheduler.cc:74] Next update check in 6m4s
Mar 20 21:31:59.492886 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1324)
Mar 20 21:31:59.503937 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 20 21:31:59.508488 systemd-logind[1540]: Watching system buttons on /dev/input/event1 (Power Button)
Mar 20 21:31:59.511835 systemd-logind[1540]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 20 21:31:59.513777 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Mar 20 21:31:59.514690 systemd-logind[1540]: New seat seat0.
Mar 20 21:31:59.522769 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 20 21:31:59.539211 unknown[1573]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Mar 20 21:31:59.549371 unknown[1573]: Core dump limit set to -1
Mar 20 21:31:59.555011 bash[1593]: Updated "/home/core/.ssh/authorized_keys"
Mar 20 21:31:59.555834 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 20 21:31:59.558296 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 20 21:33:16.092282 systemd-resolved[1472]: Clock change detected. Flushing caches.
Mar 20 21:33:16.092371 systemd-timesyncd[1502]: Contacted time server 162.159.200.123:123 (0.flatcar.pool.ntp.org).
Mar 20 21:33:16.092510 systemd-timesyncd[1502]: Initial clock synchronization to Thu 2025-03-20 21:33:16.092067 UTC.
Mar 20 21:33:16.111927 sshd_keygen[1566]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 20 21:33:16.141351 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 20 21:33:16.142951 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 20 21:33:16.160600 systemd[1]: issuegen.service: Deactivated successfully.
Mar 20 21:33:16.160839 locksmithd[1594]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 20 21:33:16.161666 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 20 21:33:16.166967 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 20 21:33:16.190372 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 20 21:33:16.192525 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 20 21:33:16.193486 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 20 21:33:16.193751 systemd[1]: Reached target getty.target - Login Prompts.
Mar 20 21:33:16.273059 containerd[1556]: time="2025-03-20T21:33:16Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 20 21:33:16.273059 containerd[1556]: time="2025-03-20T21:33:16.273028179Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 20 21:33:16.284928 containerd[1556]: time="2025-03-20T21:33:16.284895719Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.854µs"
Mar 20 21:33:16.284928 containerd[1556]: time="2025-03-20T21:33:16.284921831Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 20 21:33:16.285018 containerd[1556]: time="2025-03-20T21:33:16.284936885Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 20 21:33:16.287098 containerd[1556]: time="2025-03-20T21:33:16.287079390Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 20 21:33:16.287132 containerd[1556]: time="2025-03-20T21:33:16.287098196Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 20 21:33:16.287132 containerd[1556]: time="2025-03-20T21:33:16.287121057Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 20 21:33:16.287179 containerd[1556]: time="2025-03-20T21:33:16.287165368Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 20 21:33:16.287205 containerd[1556]: time="2025-03-20T21:33:16.287178031Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 20 21:33:16.287364 containerd[1556]: time="2025-03-20T21:33:16.287340895Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 20 21:33:16.287364 containerd[1556]: time="2025-03-20T21:33:16.287361789Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 20 21:33:16.287401 containerd[1556]: time="2025-03-20T21:33:16.287372020Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 20 21:33:16.287401 containerd[1556]: time="2025-03-20T21:33:16.287379417Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 20 21:33:16.287449 containerd[1556]: time="2025-03-20T21:33:16.287434564Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 20 21:33:16.288194 containerd[1556]: time="2025-03-20T21:33:16.288178356Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 20 21:33:16.288216 containerd[1556]: time="2025-03-20T21:33:16.288202723Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 20 21:33:16.288216 containerd[1556]: time="2025-03-20T21:33:16.288211106Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 20 21:33:16.288254 containerd[1556]: time="2025-03-20T21:33:16.288230323Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 20 21:33:16.288535 containerd[1556]: time="2025-03-20T21:33:16.288386510Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 20 21:33:16.288535 containerd[1556]: time="2025-03-20T21:33:16.288432508Z" level=info msg="metadata content store policy set" policy=shared
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290648291Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290691296Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290704201Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290712738Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290720102Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290726552Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290733523Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290740706Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290747313Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290754206Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290761117Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290768001Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290836671Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 20 21:33:16.291169 containerd[1556]: time="2025-03-20T21:33:16.290849128Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290857718Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290864434Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290871155Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290876895Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290883768Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290890196Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290897462Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290904283Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290910216Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290947433Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290956583Z" level=info msg="Start snapshots syncer"
Mar 20 21:33:16.291455 containerd[1556]: time="2025-03-20T21:33:16.290979551Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 20 21:33:16.291663 containerd[1556]: time="2025-03-20T21:33:16.291133852Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 20 21:33:16.291663 containerd[1556]: time="2025-03-20T21:33:16.291165945Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291205579Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291261148Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291276583Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291284017Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291290164Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291298043Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291304613Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291314218Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291334129Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291345057Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291351965Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291370130Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291378613Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 20 21:33:16.291759 containerd[1556]: time="2025-03-20T21:33:16.291384625Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291390221Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291395411Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291401108Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291407513Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291417272Z" level=info msg="runtime interface created"
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291420468Z" level=info msg="created NRI interface"
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291425002Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291430912Z" level=info msg="Connect containerd service"
Mar 20 21:33:16.291978 containerd[1556]: time="2025-03-20T21:33:16.291445089Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 20 21:33:16.292811 containerd[1556]: time="2025-03-20T21:33:16.292260509Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414068733Z" level=info msg="Start subscribing containerd event"
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414100822Z" level=info msg="Start recovering state"
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414164427Z" level=info msg="Start event monitor"
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414179438Z" level=info msg="Start cni network conf syncer for default"
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414184238Z" level=info msg="Start streaming server"
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414189781Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414193592Z" level=info msg="runtime interface starting up..."
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414196600Z" level=info msg="starting plugins..."
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414205001Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414213934Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 20 21:33:16.414643 containerd[1556]: time="2025-03-20T21:33:16.414241268Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 20 21:33:16.414373 systemd[1]: Started containerd.service - containerd container runtime.
Mar 20 21:33:16.416872 containerd[1556]: time="2025-03-20T21:33:16.416729488Z" level=info msg="containerd successfully booted in 0.144320s"
Mar 20 21:33:16.423871 tar[1554]: linux-amd64/LICENSE
Mar 20 21:33:16.423905 tar[1554]: linux-amd64/README.md
Mar 20 21:33:16.438354 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 20 21:33:16.671297 systemd-networkd[1363]: ens192: Gained IPv6LL
Mar 20 21:33:16.672613 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 20 21:33:16.673294 systemd[1]: Reached target network-online.target - Network is Online.
Mar 20 21:33:16.674419 systemd[1]: Starting coreos-metadata.service - VMware metadata agent...
Mar 20 21:33:16.681146 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:33:16.682142 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 20 21:33:16.705300 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 20 21:33:16.710050 systemd[1]: coreos-metadata.service: Deactivated successfully.
Mar 20 21:33:16.710255 systemd[1]: Finished coreos-metadata.service - VMware metadata agent.
Mar 20 21:33:16.710861 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 20 21:33:17.911611 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:33:17.912460 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 20 21:33:17.912882 systemd[1]: Startup finished in 934ms (kernel) + 5.259s (initrd) + 4.577s (userspace) = 10.770s.
Mar 20 21:33:17.918279 (kubelet)[1722]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 21:33:17.966988 login[1639]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 20 21:33:17.969012 login[1640]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 20 21:33:17.973584 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 20 21:33:17.975595 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 20 21:33:17.981461 systemd-logind[1540]: New session 2 of user core.
Mar 20 21:33:17.984316 systemd-logind[1540]: New session 1 of user core.
Mar 20 21:33:17.991643 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 20 21:33:17.993750 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 20 21:33:18.003933 (systemd)[1729]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 20 21:33:18.005375 systemd-logind[1540]: New session c1 of user core.
Mar 20 21:33:18.092801 systemd[1729]: Queued start job for default target default.target.
Mar 20 21:33:18.103124 systemd[1729]: Created slice app.slice - User Application Slice.
Mar 20 21:33:18.103152 systemd[1729]: Reached target paths.target - Paths.
Mar 20 21:33:18.103189 systemd[1729]: Reached target timers.target - Timers.
Mar 20 21:33:18.106107 systemd[1729]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 20 21:33:18.111514 systemd[1729]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 20 21:33:18.111560 systemd[1729]: Reached target sockets.target - Sockets.
Mar 20 21:33:18.111584 systemd[1729]: Reached target basic.target - Basic System.
Mar 20 21:33:18.111606 systemd[1729]: Reached target default.target - Main User Target.
Mar 20 21:33:18.111623 systemd[1729]: Startup finished in 102ms.
Mar 20 21:33:18.111844 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 20 21:33:18.113927 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 20 21:33:18.114543 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 20 21:33:18.790454 kubelet[1722]: E0320 21:33:18.790416 1722 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 21:33:18.791585 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 21:33:18.791673 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 21:33:18.791866 systemd[1]: kubelet.service: Consumed 666ms CPU time, 236M memory peak.
Mar 20 21:33:29.042367 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 20 21:33:29.043921 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:33:29.125873 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:33:29.128622 (kubelet)[1772]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 21:33:29.176083 kubelet[1772]: E0320 21:33:29.176052 1772 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 21:33:29.178587 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 21:33:29.178737 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 21:33:29.179078 systemd[1]: kubelet.service: Consumed 90ms CPU time, 95.3M memory peak.
Mar 20 21:33:39.429176 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 20 21:33:39.430906 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:33:39.775016 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:33:39.781304 (kubelet)[1787]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 21:33:39.843017 kubelet[1787]: E0320 21:33:39.842992 1787 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 21:33:39.844586 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 21:33:39.844670 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 21:33:39.844926 systemd[1]: kubelet.service: Consumed 101ms CPU time, 100.4M memory peak.
Mar 20 21:33:46.225868 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 20 21:33:46.226563 systemd[1]: Started sshd@0-139.178.70.103:22-147.75.109.163:55796.service - OpenSSH per-connection server daemon (147.75.109.163:55796).
Mar 20 21:33:46.329172 sshd[1795]: Accepted publickey for core from 147.75.109.163 port 55796 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:33:46.329998 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:33:46.333708 systemd-logind[1540]: New session 3 of user core.
Mar 20 21:33:46.340239 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 20 21:33:46.398251 systemd[1]: Started sshd@1-139.178.70.103:22-147.75.109.163:55802.service - OpenSSH per-connection server daemon (147.75.109.163:55802).
Mar 20 21:33:46.435362 sshd[1800]: Accepted publickey for core from 147.75.109.163 port 55802 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:33:46.436323 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:33:46.438908 systemd-logind[1540]: New session 4 of user core.
Mar 20 21:33:46.449125 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 20 21:33:46.499236 sshd[1802]: Connection closed by 147.75.109.163 port 55802
Mar 20 21:33:46.500204 sshd-session[1800]: pam_unix(sshd:session): session closed for user core
Mar 20 21:33:46.516584 systemd[1]: sshd@1-139.178.70.103:22-147.75.109.163:55802.service: Deactivated successfully.
Mar 20 21:33:46.517589 systemd[1]: session-4.scope: Deactivated successfully.
Mar 20 21:33:46.518077 systemd-logind[1540]: Session 4 logged out. Waiting for processes to exit.
Mar 20 21:33:46.519280 systemd[1]: Started sshd@2-139.178.70.103:22-147.75.109.163:55804.service - OpenSSH per-connection server daemon (147.75.109.163:55804).
Mar 20 21:33:46.520200 systemd-logind[1540]: Removed session 4.
Mar 20 21:33:46.566667 sshd[1807]: Accepted publickey for core from 147.75.109.163 port 55804 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:33:46.567615 sshd-session[1807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:33:46.571163 systemd-logind[1540]: New session 5 of user core.
Mar 20 21:33:46.582176 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 20 21:33:46.628946 sshd[1810]: Connection closed by 147.75.109.163 port 55804
Mar 20 21:33:46.629361 sshd-session[1807]: pam_unix(sshd:session): session closed for user core
Mar 20 21:33:46.638788 systemd[1]: sshd@2-139.178.70.103:22-147.75.109.163:55804.service: Deactivated successfully.
Mar 20 21:33:46.639849 systemd[1]: session-5.scope: Deactivated successfully.
Mar 20 21:33:46.640402 systemd-logind[1540]: Session 5 logged out. Waiting for processes to exit.
Mar 20 21:33:46.641654 systemd[1]: Started sshd@3-139.178.70.103:22-147.75.109.163:55818.service - OpenSSH per-connection server daemon (147.75.109.163:55818).
Mar 20 21:33:46.643304 systemd-logind[1540]: Removed session 5.
Mar 20 21:33:46.682401 sshd[1815]: Accepted publickey for core from 147.75.109.163 port 55818 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:33:46.683231 sshd-session[1815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:33:46.686872 systemd-logind[1540]: New session 6 of user core.
Mar 20 21:33:46.695165 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 20 21:33:46.744916 sshd[1818]: Connection closed by 147.75.109.163 port 55818
Mar 20 21:33:46.745866 sshd-session[1815]: pam_unix(sshd:session): session closed for user core
Mar 20 21:33:46.754595 systemd[1]: sshd@3-139.178.70.103:22-147.75.109.163:55818.service: Deactivated successfully.
Mar 20 21:33:46.755824 systemd[1]: session-6.scope: Deactivated successfully.
Mar 20 21:33:46.756469 systemd-logind[1540]: Session 6 logged out. Waiting for processes to exit.
Mar 20 21:33:46.757735 systemd[1]: Started sshd@4-139.178.70.103:22-147.75.109.163:55830.service - OpenSSH per-connection server daemon (147.75.109.163:55830).
Mar 20 21:33:46.759466 systemd-logind[1540]: Removed session 6.
Mar 20 21:33:46.801288 sshd[1823]: Accepted publickey for core from 147.75.109.163 port 55830 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:33:46.802502 sshd-session[1823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:33:46.805430 systemd-logind[1540]: New session 7 of user core.
Mar 20 21:33:46.812193 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 20 21:33:46.925338 sudo[1827]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 20 21:33:46.925591 sudo[1827]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 21:33:46.945694 sudo[1827]: pam_unix(sudo:session): session closed for user root
Mar 20 21:33:46.947311 sshd[1826]: Connection closed by 147.75.109.163 port 55830
Mar 20 21:33:46.946728 sshd-session[1823]: pam_unix(sshd:session): session closed for user core
Mar 20 21:33:46.955738 systemd[1]: sshd@4-139.178.70.103:22-147.75.109.163:55830.service: Deactivated successfully.
Mar 20 21:33:46.956808 systemd[1]: session-7.scope: Deactivated successfully.
Mar 20 21:33:46.957936 systemd-logind[1540]: Session 7 logged out. Waiting for processes to exit.
Mar 20 21:33:46.958944 systemd[1]: Started sshd@5-139.178.70.103:22-147.75.109.163:55836.service - OpenSSH per-connection server daemon (147.75.109.163:55836).
Mar 20 21:33:46.960148 systemd-logind[1540]: Removed session 7.
Mar 20 21:33:47.010606 sshd[1832]: Accepted publickey for core from 147.75.109.163 port 55836 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:33:47.011634 sshd-session[1832]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:33:47.014662 systemd-logind[1540]: New session 8 of user core.
Mar 20 21:33:47.023193 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 20 21:33:47.074416 sudo[1837]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 20 21:33:47.074612 sudo[1837]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 21:33:47.077355 sudo[1837]: pam_unix(sudo:session): session closed for user root
Mar 20 21:33:47.081097 sudo[1836]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 20 21:33:47.081295 sudo[1836]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 21:33:47.089296 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 20 21:33:47.119070 augenrules[1859]: No rules
Mar 20 21:33:47.119964 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 20 21:33:47.120135 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 20 21:33:47.120657 sudo[1836]: pam_unix(sudo:session): session closed for user root
Mar 20 21:33:47.121839 sshd[1835]: Connection closed by 147.75.109.163 port 55836
Mar 20 21:33:47.122083 sshd-session[1832]: pam_unix(sshd:session): session closed for user core
Mar 20 21:33:47.128870 systemd[1]: sshd@5-139.178.70.103:22-147.75.109.163:55836.service: Deactivated successfully.
Mar 20 21:33:47.129892 systemd[1]: session-8.scope: Deactivated successfully.
Mar 20 21:33:47.130377 systemd-logind[1540]: Session 8 logged out. Waiting for processes to exit.
Mar 20 21:33:47.131480 systemd[1]: Started sshd@6-139.178.70.103:22-147.75.109.163:55844.service - OpenSSH per-connection server daemon (147.75.109.163:55844).
Mar 20 21:33:47.132804 systemd-logind[1540]: Removed session 8.
Mar 20 21:33:47.167658 sshd[1867]: Accepted publickey for core from 147.75.109.163 port 55844 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:33:47.168632 sshd-session[1867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:33:47.172277 systemd-logind[1540]: New session 9 of user core.
Mar 20 21:33:47.188214 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 20 21:33:47.238174 sudo[1871]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 20 21:33:47.238365 sudo[1871]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 21:33:47.807962 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 20 21:33:47.824321 (dockerd)[1888]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 20 21:33:48.317726 dockerd[1888]: time="2025-03-20T21:33:48.317692952Z" level=info msg="Starting up"
Mar 20 21:33:48.318733 dockerd[1888]: time="2025-03-20T21:33:48.318713731Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 20 21:33:48.388910 dockerd[1888]: time="2025-03-20T21:33:48.388731818Z" level=info msg="Loading containers: start."
Mar 20 21:33:48.606065 kernel: Initializing XFRM netlink socket
Mar 20 21:33:48.778374 systemd-networkd[1363]: docker0: Link UP
Mar 20 21:33:48.827035 dockerd[1888]: time="2025-03-20T21:33:48.826997522Z" level=info msg="Loading containers: done."
Mar 20 21:33:48.835763 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1957741709-merged.mount: Deactivated successfully.
Mar 20 21:33:48.878452 dockerd[1888]: time="2025-03-20T21:33:48.878410276Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 20 21:33:48.878563 dockerd[1888]: time="2025-03-20T21:33:48.878482133Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 20 21:33:48.878587 dockerd[1888]: time="2025-03-20T21:33:48.878561074Z" level=info msg="Daemon has completed initialization"
Mar 20 21:33:48.967719 dockerd[1888]: time="2025-03-20T21:33:48.967680192Z" level=info msg="API listen on /run/docker.sock"
Mar 20 21:33:48.968193 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 20 21:33:50.095085 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 20 21:33:50.096489 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:33:50.346987 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:33:50.354314 (kubelet)[2093]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 21:33:50.370572 containerd[1556]: time="2025-03-20T21:33:50.370340761Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\""
Mar 20 21:33:50.392248 kubelet[2093]: E0320 21:33:50.392210 2093 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 21:33:50.393818 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 21:33:50.393917 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 21:33:50.394281 systemd[1]: kubelet.service: Consumed 95ms CPU time, 97.6M memory peak.
Mar 20 21:33:51.168772 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4067713825.mount: Deactivated successfully.
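[Editorial note] The kubelet crash above is caused by a missing /var/lib/kubelet/config.yaml. On a kubeadm-managed node that file is written by `kubeadm init` or `kubeadm join`, so repeated failures of this kind are expected until bootstrap completes. For orientation only, a minimal hand-written KubeletConfiguration would look roughly like the sketch below; the exact values are assumptions, not taken from this host, though `cgroupDriver: systemd` and the static pod path do match settings that appear later in this log.

```yaml
# /var/lib/kubelet/config.yaml -- minimal illustrative sketch, normally generated by kubeadm
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd                      # must match the CRI runtime's cgroup driver
staticPodPath: /etc/kubernetes/manifests   # where kubelet looks for static pod manifests
```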
Mar 20 21:33:52.852537 containerd[1556]: time="2025-03-20T21:33:52.852495700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:52.866559 containerd[1556]: time="2025-03-20T21:33:52.866523242Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.7: active requests=0, bytes read=27959268"
Mar 20 21:33:52.878366 containerd[1556]: time="2025-03-20T21:33:52.878328554Z" level=info msg="ImageCreate event name:\"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:52.894370 containerd[1556]: time="2025-03-20T21:33:52.894329260Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:52.895386 containerd[1556]: time="2025-03-20T21:33:52.895018001Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.7\" with image id \"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\", size \"27956068\" in 2.524655421s"
Mar 20 21:33:52.895386 containerd[1556]: time="2025-03-20T21:33:52.895056160Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\" returns image reference \"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\""
Mar 20 21:33:52.908240 containerd[1556]: time="2025-03-20T21:33:52.908218467Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\""
Mar 20 21:33:54.708488 containerd[1556]: time="2025-03-20T21:33:54.708450698Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:54.714618 containerd[1556]: time="2025-03-20T21:33:54.714578376Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.7: active requests=0, bytes read=24713776"
Mar 20 21:33:54.727729 containerd[1556]: time="2025-03-20T21:33:54.727691681Z" level=info msg="ImageCreate event name:\"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:54.743679 containerd[1556]: time="2025-03-20T21:33:54.743547906Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:54.745419 containerd[1556]: time="2025-03-20T21:33:54.745389475Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.7\" with image id \"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\", size \"26201384\" in 1.837146723s"
Mar 20 21:33:54.745464 containerd[1556]: time="2025-03-20T21:33:54.745417699Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\" returns image reference \"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\""
Mar 20 21:33:54.746172 containerd[1556]: time="2025-03-20T21:33:54.746142277Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\""
Mar 20 21:33:56.159848 containerd[1556]: time="2025-03-20T21:33:56.159807086Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:56.164437 containerd[1556]: time="2025-03-20T21:33:56.164399889Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.7: active requests=0, bytes read=18780368"
Mar 20 21:33:56.170989 containerd[1556]: time="2025-03-20T21:33:56.170954698Z" level=info msg="ImageCreate event name:\"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:56.182998 containerd[1556]: time="2025-03-20T21:33:56.182957853Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:56.183933 containerd[1556]: time="2025-03-20T21:33:56.183816625Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.7\" with image id \"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\", size \"20267994\" in 1.437651158s"
Mar 20 21:33:56.183933 containerd[1556]: time="2025-03-20T21:33:56.183839584Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\" returns image reference \"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\""
Mar 20 21:33:56.184663 containerd[1556]: time="2025-03-20T21:33:56.184645290Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\""
Mar 20 21:33:57.706423 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1118318085.mount: Deactivated successfully.
Mar 20 21:33:58.047455 containerd[1556]: time="2025-03-20T21:33:58.047380884Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:58.049928 containerd[1556]: time="2025-03-20T21:33:58.049886595Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.7: active requests=0, bytes read=30354630"
Mar 20 21:33:58.057699 containerd[1556]: time="2025-03-20T21:33:58.057535952Z" level=info msg="ImageCreate event name:\"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:58.059970 containerd[1556]: time="2025-03-20T21:33:58.059938690Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:58.060327 containerd[1556]: time="2025-03-20T21:33:58.060305329Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.7\" with image id \"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\", repo tag \"registry.k8s.io/kube-proxy:v1.31.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\", size \"30353649\" in 1.875582089s"
Mar 20 21:33:58.060368 containerd[1556]: time="2025-03-20T21:33:58.060327449Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\" returns image reference \"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\""
Mar 20 21:33:58.060808 containerd[1556]: time="2025-03-20T21:33:58.060773651Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 20 21:33:58.511485 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2079711108.mount: Deactivated successfully.
Mar 20 21:33:59.203879 containerd[1556]: time="2025-03-20T21:33:59.203848035Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:59.208548 containerd[1556]: time="2025-03-20T21:33:59.208424124Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761"
Mar 20 21:33:59.213085 containerd[1556]: time="2025-03-20T21:33:59.213053445Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:59.219586 containerd[1556]: time="2025-03-20T21:33:59.219564814Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:33:59.220148 containerd[1556]: time="2025-03-20T21:33:59.220033795Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.159194388s"
Mar 20 21:33:59.220148 containerd[1556]: time="2025-03-20T21:33:59.220061172Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Mar 20 21:33:59.220529 containerd[1556]: time="2025-03-20T21:33:59.220315851Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 20 21:33:59.677164 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2858198996.mount: Deactivated successfully.
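[Editorial note] The pulls recorded in this stretch of the log cover the standard kubeadm control-plane image set. A hypothetical sketch of pre-pulling the same images (versions copied from the pull messages in this log; on a real node `kubeadm config images pull` or `crictl pull` is the usual route) could look like:

```shell
# Images observed in this log; pre-pulling them avoids pull latency on first start.
IMAGES="registry.k8s.io/kube-apiserver:v1.31.7
registry.k8s.io/kube-controller-manager:v1.31.7
registry.k8s.io/kube-scheduler:v1.31.7
registry.k8s.io/kube-proxy:v1.31.7
registry.k8s.io/coredns/coredns:v1.11.1
registry.k8s.io/pause:3.10
registry.k8s.io/etcd:3.5.15-0"

for img in $IMAGES; do
  echo "pulling $img"
  # ctr --namespace k8s.io images pull "$img"   # uncomment on a node with containerd
done
```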
Mar 20 21:33:59.679602 containerd[1556]: time="2025-03-20T21:33:59.679581422Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 21:33:59.680230 containerd[1556]: time="2025-03-20T21:33:59.680204802Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Mar 20 21:33:59.680591 containerd[1556]: time="2025-03-20T21:33:59.680577732Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 21:33:59.681441 containerd[1556]: time="2025-03-20T21:33:59.681409438Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 21:33:59.681987 containerd[1556]: time="2025-03-20T21:33:59.681808866Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 461.478749ms"
Mar 20 21:33:59.681987 containerd[1556]: time="2025-03-20T21:33:59.681825237Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Mar 20 21:33:59.682095 containerd[1556]: time="2025-03-20T21:33:59.682082706Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Mar 20 21:34:00.124178 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1061159631.mount: Deactivated successfully.
Mar 20 21:34:00.609843 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 20 21:34:00.611173 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:34:00.972827 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:34:00.975104 (kubelet)[2274]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 21:34:01.013593 kubelet[2274]: E0320 21:34:01.013563 2274 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 21:34:01.015020 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 21:34:01.015207 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 21:34:01.015639 systemd[1]: kubelet.service: Consumed 75ms CPU time, 98M memory peak.
Mar 20 21:34:01.154626 update_engine[1544]: I20250320 21:34:01.154578 1544 update_attempter.cc:509] Updating boot flags...
Mar 20 21:34:01.179183 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2290)
Mar 20 21:34:01.220103 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2286)
Mar 20 21:34:03.306769 containerd[1556]: time="2025-03-20T21:34:03.306734418Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:34:03.307795 containerd[1556]: time="2025-03-20T21:34:03.307771534Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56779973"
Mar 20 21:34:03.308168 containerd[1556]: time="2025-03-20T21:34:03.308155433Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:34:03.310060 containerd[1556]: time="2025-03-20T21:34:03.310000404Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 21:34:03.310608 containerd[1556]: time="2025-03-20T21:34:03.310590120Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 3.628492077s"
Mar 20 21:34:03.310636 containerd[1556]: time="2025-03-20T21:34:03.310612793Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Mar 20 21:34:05.265780 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:34:05.265942 systemd[1]: kubelet.service: Consumed 75ms CPU time, 98M memory peak.
Mar 20 21:34:05.267420 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:34:05.285257 systemd[1]: Reload requested from client PID 2332 ('systemctl') (unit session-9.scope)...
Mar 20 21:34:05.285266 systemd[1]: Reloading...
Mar 20 21:34:05.350072 zram_generator::config[2376]: No configuration found.
Mar 20 21:34:05.403302 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 20 21:34:05.421276 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 21:34:05.484276 systemd[1]: Reloading finished in 198 ms.
Mar 20 21:34:05.501618 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 20 21:34:05.501667 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 20 21:34:05.501819 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:34:05.503321 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:34:05.751117 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:34:05.752595 (kubelet)[2444]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 20 21:34:05.902946 kubelet[2444]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 21:34:05.902946 kubelet[2444]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 21:34:05.902946 kubelet[2444]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 21:34:05.912567 kubelet[2444]: I0320 21:34:05.912536 2444 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 21:34:06.279726 kubelet[2444]: I0320 21:34:06.279696 2444 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 20 21:34:06.279726 kubelet[2444]: I0320 21:34:06.279724 2444 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 21:34:06.279994 kubelet[2444]: I0320 21:34:06.279984 2444 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 20 21:34:06.347763 kubelet[2444]: I0320 21:34:06.347708 2444 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 20 21:34:06.352464 kubelet[2444]: E0320 21:34:06.352409 2444 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.103:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:06.369847 kubelet[2444]: I0320 21:34:06.369825 2444 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 21:34:06.377275 kubelet[2444]: I0320 21:34:06.377249 2444 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 20 21:34:06.381528 kubelet[2444]: I0320 21:34:06.381494 2444 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 21:34:06.381682 kubelet[2444]: I0320 21:34:06.381658 2444 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 21:34:06.381816 kubelet[2444]: I0320 21:34:06.381680 2444 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 21:34:06.381914 kubelet[2444]: I0320 21:34:06.381822 2444 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 21:34:06.381914 kubelet[2444]: I0320 21:34:06.381831 2444 container_manager_linux.go:300] "Creating device plugin manager"
Mar 20 21:34:06.381962 kubelet[2444]: I0320 21:34:06.381951 2444 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 21:34:06.383667 kubelet[2444]: I0320 21:34:06.383644 2444 kubelet.go:408] "Attempting to sync node with API server"
Mar 20 21:34:06.383667 kubelet[2444]: I0320 21:34:06.383666 2444 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 21:34:06.383753 kubelet[2444]: I0320 21:34:06.383688 2444 kubelet.go:314] "Adding apiserver pod source"
Mar 20 21:34:06.383753 kubelet[2444]: I0320 21:34:06.383701 2444 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 21:34:06.390276 kubelet[2444]: W0320 21:34:06.390022 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Mar 20 21:34:06.390276 kubelet[2444]: E0320 21:34:06.390075 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:06.392531 kubelet[2444]: W0320 21:34:06.392352 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Mar 20 21:34:06.392531 kubelet[2444]: E0320 21:34:06.392394 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:06.392531 kubelet[2444]: I0320 21:34:06.392469 2444 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 20 21:34:06.396129 kubelet[2444]: I0320 21:34:06.396103 2444 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 21:34:06.396707 kubelet[2444]: W0320 21:34:06.396689 2444 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 20 21:34:06.398632 kubelet[2444]: I0320 21:34:06.398448 2444 server.go:1269] "Started kubelet"
Mar 20 21:34:06.398717 kubelet[2444]: I0320 21:34:06.398643 2444 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 21:34:06.401706 kubelet[2444]: I0320 21:34:06.401365 2444 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 21:34:06.403868 kubelet[2444]: I0320 21:34:06.403840 2444 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 21:34:06.404706 kubelet[2444]: I0320 21:34:06.404468 2444 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 20 21:34:06.404706 kubelet[2444]: I0320 21:34:06.404501 2444 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 21:34:06.404706 kubelet[2444]: I0320 21:34:06.404664 2444 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 21:34:06.406678 kubelet[2444]: I0320 21:34:06.406662 2444 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 21:34:06.407260 kubelet[2444]: E0320 21:34:06.406735 2444 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 20 21:34:06.412064 kubelet[2444]: I0320 21:34:06.411115 2444 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 21:34:06.412064 kubelet[2444]: I0320 21:34:06.411169 2444 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 21:34:06.412064 kubelet[2444]: W0320 21:34:06.411699 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Mar 20 21:34:06.412064 kubelet[2444]: E0320 21:34:06.411730 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:06.412064 kubelet[2444]: E0320 21:34:06.411766 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="200ms"
Mar 20 21:34:06.412064 kubelet[2444]: I0320 21:34:06.411951 2444 factory.go:221] Registration of the systemd container factory successfully
Mar 20 21:34:06.412064 kubelet[2444]: I0320 21:34:06.411990 2444 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 20 21:34:06.414160 kubelet[2444]: E0320 21:34:06.412295 2444 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.103:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.103:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182ea0617c681af5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-20 21:34:06.398429941 +0000 UTC m=+0.643948156,LastTimestamp:2025-03-20 21:34:06.398429941 +0000 UTC m=+0.643948156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Mar 20 21:34:06.419495 kubelet[2444]: I0320 21:34:06.419469 2444 factory.go:221] Registration of the containerd container factory successfully
Mar 20 21:34:06.422492 kubelet[2444]: I0320 21:34:06.422451 2444 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 20 21:34:06.423351 kubelet[2444]: I0320 21:34:06.423335 2444 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 20 21:34:06.423351 kubelet[2444]: I0320 21:34:06.423351 2444 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 20 21:34:06.423416 kubelet[2444]: I0320 21:34:06.423365 2444 kubelet.go:2321] "Starting kubelet main sync loop"
Mar 20 21:34:06.423416 kubelet[2444]: E0320 21:34:06.423394 2444 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 20 21:34:06.427752 kubelet[2444]: W0320 21:34:06.427716 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Mar 20 21:34:06.427752 kubelet[2444]: E0320 21:34:06.427754 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:06.450527 kubelet[2444]: E0320 21:34:06.450487 2444 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 20 21:34:06.453147 kubelet[2444]: I0320 21:34:06.452965 2444 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 20 21:34:06.453147 kubelet[2444]: I0320 21:34:06.452979 2444 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 20 21:34:06.453147 kubelet[2444]: I0320 21:34:06.453020 2444 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 21:34:06.454449 kubelet[2444]: I0320 21:34:06.454423 2444 policy_none.go:49] "None policy: Start"
Mar 20 21:34:06.455136 kubelet[2444]: I0320 21:34:06.454911 2444 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 20 21:34:06.455136 kubelet[2444]: I0320 21:34:06.454939 2444 state_mem.go:35] "Initializing new in-memory state store"
Mar 20 21:34:06.467249 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 20 21:34:06.477827 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 20 21:34:06.481189 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
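[Editorial note] The Container Manager nodeConfig dumped earlier in this kubelet startup includes the default hard-eviction thresholds. Rendered as a KubeletConfiguration fragment for readability (a sketch; this stanza does not exist on the host at this point in the log), those same values read:

```yaml
# Equivalent of the HardEvictionThresholds in the nodeConfig JSON above
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
evictionHard:
  memory.available: "100Mi"   # Quantity 100Mi
  nodefs.available: "10%"     # Percentage 0.1
  nodefs.inodesFree: "5%"     # Percentage 0.05
  imagefs.available: "15%"    # Percentage 0.15
  imagefs.inodesFree: "5%"    # Percentage 0.05
```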
Mar 20 21:34:06.490069 kubelet[2444]: I0320 21:34:06.489885 2444 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 21:34:06.490069 kubelet[2444]: I0320 21:34:06.490034 2444 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 20 21:34:06.490069 kubelet[2444]: I0320 21:34:06.490042 2444 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 21:34:06.490604 kubelet[2444]: I0320 21:34:06.490592 2444 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 21:34:06.492336 kubelet[2444]: E0320 21:34:06.492314 2444 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Mar 20 21:34:06.529186 systemd[1]: Created slice kubepods-burstable-pod60762308083b5ef6c837b1be48ec53d6.slice - libcontainer container kubepods-burstable-pod60762308083b5ef6c837b1be48ec53d6.slice.
Mar 20 21:34:06.542675 systemd[1]: Created slice kubepods-burstable-pod6f32907a07e55aea05abdc5cd284a8d5.slice - libcontainer container kubepods-burstable-pod6f32907a07e55aea05abdc5cd284a8d5.slice.
Mar 20 21:34:06.549695 systemd[1]: Created slice kubepods-burstable-pod0385bc7ee4efa49cd1d6b0ce3ef31290.slice - libcontainer container kubepods-burstable-pod0385bc7ee4efa49cd1d6b0ce3ef31290.slice.
Mar 20 21:34:06.591759 kubelet[2444]: I0320 21:34:06.591735 2444 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 20 21:34:06.592004 kubelet[2444]: E0320 21:34:06.591984 2444 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost"
Mar 20 21:34:06.611545 kubelet[2444]: I0320 21:34:06.611509 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:06.611545 kubelet[2444]: I0320 21:34:06.611546 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:06.611676 kubelet[2444]: I0320 21:34:06.611567 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:06.611676 kubelet[2444]: I0320 21:34:06.611581 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0385bc7ee4efa49cd1d6b0ce3ef31290-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0385bc7ee4efa49cd1d6b0ce3ef31290\") " pod="kube-system/kube-apiserver-localhost"
Mar 20 21:34:06.611676 kubelet[2444]: I0320 21:34:06.611598 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0385bc7ee4efa49cd1d6b0ce3ef31290-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0385bc7ee4efa49cd1d6b0ce3ef31290\") " pod="kube-system/kube-apiserver-localhost"
Mar 20 21:34:06.611676 kubelet[2444]: I0320 21:34:06.611613 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:06.611676 kubelet[2444]: I0320 21:34:06.611627 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:06.611782 kubelet[2444]: I0320 21:34:06.611640 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f32907a07e55aea05abdc5cd284a8d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"6f32907a07e55aea05abdc5cd284a8d5\") " pod="kube-system/kube-scheduler-localhost"
Mar 20 21:34:06.611782 kubelet[2444]: I0320 21:34:06.611653 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0385bc7ee4efa49cd1d6b0ce3ef31290-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0385bc7ee4efa49cd1d6b0ce3ef31290\") " pod="kube-system/kube-apiserver-localhost"
Mar 20 21:34:06.613076 kubelet[2444]: E0320 21:34:06.613039 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="400ms"
Mar 20 21:34:06.793589 kubelet[2444]: I0320 21:34:06.793362 2444 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 20 21:34:06.793589 kubelet[2444]: E0320 21:34:06.793564 2444 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost"
Mar 20 21:34:06.841661 containerd[1556]: time="2025-03-20T21:34:06.841631511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:60762308083b5ef6c837b1be48ec53d6,Namespace:kube-system,Attempt:0,}"
Mar 20 21:34:06.848147 containerd[1556]: time="2025-03-20T21:34:06.848117147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:6f32907a07e55aea05abdc5cd284a8d5,Namespace:kube-system,Attempt:0,}"
Mar 20 21:34:06.900183 containerd[1556]: time="2025-03-20T21:34:06.900071423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0385bc7ee4efa49cd1d6b0ce3ef31290,Namespace:kube-system,Attempt:0,}"
Mar 20 21:34:07.013354 kubelet[2444]: E0320 21:34:07.013322 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="800ms"
Mar 20 21:34:07.195582 kubelet[2444]: I0320 21:34:07.195564 2444 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 20 21:34:07.195962 kubelet[2444]: E0320 21:34:07.195943 2444 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost"
Mar 20 21:34:07.231065 containerd[1556]: time="2025-03-20T21:34:07.230287427Z" level=info msg="connecting to shim b1a0a7a8e307b694f0a8d28206218af889581e0dd6f041e3a20c0bc932f340b6" address="unix:///run/containerd/s/33567ec99cc00e67be6588d90595a533ed644d9ab6acf2e3e15e53f30850e19d" namespace=k8s.io protocol=ttrpc version=3
Mar 20 21:34:07.233561 containerd[1556]: time="2025-03-20T21:34:07.233425530Z" level=info msg="connecting to shim 8c943ddade3d65b352af7af4486eb0420149d03a67edd48dad20b5d1e87eea89" address="unix:///run/containerd/s/b1c5044b2bf2d7810f5ba82f0f982ad492887c83a32ac1dbec5e72cfec6de8be" namespace=k8s.io protocol=ttrpc version=3
Mar 20 21:34:07.236235 containerd[1556]: time="2025-03-20T21:34:07.236139119Z" level=info msg="connecting to shim 7b6d2540540e1b4587f57ab4a3b1a0f9ea466f4d18bcfee2d6f0a7dcfb8d42c5" address="unix:///run/containerd/s/be40a894b87ac7a911f5530e976f8b77ac9dc17aabb8c35ddb95bf60b5eea312" namespace=k8s.io protocol=ttrpc version=3
Mar 20 21:34:07.306315 systemd[1]: Started cri-containerd-7b6d2540540e1b4587f57ab4a3b1a0f9ea466f4d18bcfee2d6f0a7dcfb8d42c5.scope - libcontainer container 7b6d2540540e1b4587f57ab4a3b1a0f9ea466f4d18bcfee2d6f0a7dcfb8d42c5.
Mar 20 21:34:07.307937 systemd[1]: Started cri-containerd-8c943ddade3d65b352af7af4486eb0420149d03a67edd48dad20b5d1e87eea89.scope - libcontainer container 8c943ddade3d65b352af7af4486eb0420149d03a67edd48dad20b5d1e87eea89.
Mar 20 21:34:07.311590 systemd[1]: Started cri-containerd-b1a0a7a8e307b694f0a8d28206218af889581e0dd6f041e3a20c0bc932f340b6.scope - libcontainer container b1a0a7a8e307b694f0a8d28206218af889581e0dd6f041e3a20c0bc932f340b6.
Mar 20 21:34:07.343187 kubelet[2444]: W0320 21:34:07.343121 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Mar 20 21:34:07.343187 kubelet[2444]: E0320 21:34:07.343168 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:07.359934 containerd[1556]: time="2025-03-20T21:34:07.359907214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0385bc7ee4efa49cd1d6b0ce3ef31290,Namespace:kube-system,Attempt:0,} returns sandbox id \"7b6d2540540e1b4587f57ab4a3b1a0f9ea466f4d18bcfee2d6f0a7dcfb8d42c5\""
Mar 20 21:34:07.363724 containerd[1556]: time="2025-03-20T21:34:07.362993483Z" level=info msg="CreateContainer within sandbox \"7b6d2540540e1b4587f57ab4a3b1a0f9ea466f4d18bcfee2d6f0a7dcfb8d42c5\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 20 21:34:07.364174 kubelet[2444]: W0320 21:34:07.364139 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Mar 20 21:34:07.365123 kubelet[2444]: E0320 21:34:07.365096 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:07.370016 containerd[1556]: time="2025-03-20T21:34:07.369919656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:60762308083b5ef6c837b1be48ec53d6,Namespace:kube-system,Attempt:0,} returns sandbox id \"b1a0a7a8e307b694f0a8d28206218af889581e0dd6f041e3a20c0bc932f340b6\""
Mar 20 21:34:07.372511 containerd[1556]: time="2025-03-20T21:34:07.372449428Z" level=info msg="CreateContainer within sandbox \"b1a0a7a8e307b694f0a8d28206218af889581e0dd6f041e3a20c0bc932f340b6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 20 21:34:07.372990 containerd[1556]: time="2025-03-20T21:34:07.372931409Z" level=info msg="Container 4403b896c7a5a335a4a97c511133a888301e5d57f909687cc856697b020c3fe1: CDI devices from CRI Config.CDIDevices: []"
Mar 20 21:34:07.382269 containerd[1556]: time="2025-03-20T21:34:07.382228782Z" level=info msg="CreateContainer within sandbox \"7b6d2540540e1b4587f57ab4a3b1a0f9ea466f4d18bcfee2d6f0a7dcfb8d42c5\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4403b896c7a5a335a4a97c511133a888301e5d57f909687cc856697b020c3fe1\""
Mar 20 21:34:07.382665 containerd[1556]: time="2025-03-20T21:34:07.382470904Z" level=info msg="Container 66a5a8fe0293e6e2c3e4b071278795f5a2ed7bab1db1d5b9894476bfe726a003: CDI devices from CRI Config.CDIDevices: []"
Mar 20 21:34:07.383178 containerd[1556]: time="2025-03-20T21:34:07.383093732Z" level=info msg="StartContainer for \"4403b896c7a5a335a4a97c511133a888301e5d57f909687cc856697b020c3fe1\""
Mar 20 21:34:07.383826 containerd[1556]: time="2025-03-20T21:34:07.383809563Z" level=info msg="connecting to shim 4403b896c7a5a335a4a97c511133a888301e5d57f909687cc856697b020c3fe1" address="unix:///run/containerd/s/be40a894b87ac7a911f5530e976f8b77ac9dc17aabb8c35ddb95bf60b5eea312" protocol=ttrpc version=3
Mar 20 21:34:07.388777 containerd[1556]: time="2025-03-20T21:34:07.388753537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:6f32907a07e55aea05abdc5cd284a8d5,Namespace:kube-system,Attempt:0,} returns sandbox id \"8c943ddade3d65b352af7af4486eb0420149d03a67edd48dad20b5d1e87eea89\""
Mar 20 21:34:07.392622 containerd[1556]: time="2025-03-20T21:34:07.392587520Z" level=info msg="CreateContainer within sandbox \"8c943ddade3d65b352af7af4486eb0420149d03a67edd48dad20b5d1e87eea89\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 20 21:34:07.395221 containerd[1556]: time="2025-03-20T21:34:07.395082366Z" level=info msg="CreateContainer within sandbox \"b1a0a7a8e307b694f0a8d28206218af889581e0dd6f041e3a20c0bc932f340b6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"66a5a8fe0293e6e2c3e4b071278795f5a2ed7bab1db1d5b9894476bfe726a003\""
Mar 20 21:34:07.397359 containerd[1556]: time="2025-03-20T21:34:07.397259148Z" level=info msg="StartContainer for \"66a5a8fe0293e6e2c3e4b071278795f5a2ed7bab1db1d5b9894476bfe726a003\""
Mar 20 21:34:07.398210 containerd[1556]: time="2025-03-20T21:34:07.398182079Z" level=info msg="connecting to shim 66a5a8fe0293e6e2c3e4b071278795f5a2ed7bab1db1d5b9894476bfe726a003" address="unix:///run/containerd/s/33567ec99cc00e67be6588d90595a533ed644d9ab6acf2e3e15e53f30850e19d" protocol=ttrpc version=3
Mar 20 21:34:07.400208 systemd[1]: Started cri-containerd-4403b896c7a5a335a4a97c511133a888301e5d57f909687cc856697b020c3fe1.scope - libcontainer container 4403b896c7a5a335a4a97c511133a888301e5d57f909687cc856697b020c3fe1.
Mar 20 21:34:07.404947 containerd[1556]: time="2025-03-20T21:34:07.404898207Z" level=info msg="Container 7aae24faee256fefb81063f904becc00f8c509da366d2448d934e054ca136717: CDI devices from CRI Config.CDIDevices: []"
Mar 20 21:34:07.412932 containerd[1556]: time="2025-03-20T21:34:07.412403854Z" level=info msg="CreateContainer within sandbox \"8c943ddade3d65b352af7af4486eb0420149d03a67edd48dad20b5d1e87eea89\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"7aae24faee256fefb81063f904becc00f8c509da366d2448d934e054ca136717\""
Mar 20 21:34:07.413545 containerd[1556]: time="2025-03-20T21:34:07.413522367Z" level=info msg="StartContainer for \"7aae24faee256fefb81063f904becc00f8c509da366d2448d934e054ca136717\""
Mar 20 21:34:07.414842 containerd[1556]: time="2025-03-20T21:34:07.414816337Z" level=info msg="connecting to shim 7aae24faee256fefb81063f904becc00f8c509da366d2448d934e054ca136717" address="unix:///run/containerd/s/b1c5044b2bf2d7810f5ba82f0f982ad492887c83a32ac1dbec5e72cfec6de8be" protocol=ttrpc version=3
Mar 20 21:34:07.422233 systemd[1]: Started cri-containerd-66a5a8fe0293e6e2c3e4b071278795f5a2ed7bab1db1d5b9894476bfe726a003.scope - libcontainer container 66a5a8fe0293e6e2c3e4b071278795f5a2ed7bab1db1d5b9894476bfe726a003.
Mar 20 21:34:07.441274 systemd[1]: Started cri-containerd-7aae24faee256fefb81063f904becc00f8c509da366d2448d934e054ca136717.scope - libcontainer container 7aae24faee256fefb81063f904becc00f8c509da366d2448d934e054ca136717.
Mar 20 21:34:07.468291 containerd[1556]: time="2025-03-20T21:34:07.468201851Z" level=info msg="StartContainer for \"4403b896c7a5a335a4a97c511133a888301e5d57f909687cc856697b020c3fe1\" returns successfully"
Mar 20 21:34:07.495782 containerd[1556]: time="2025-03-20T21:34:07.495672261Z" level=info msg="StartContainer for \"7aae24faee256fefb81063f904becc00f8c509da366d2448d934e054ca136717\" returns successfully"
Mar 20 21:34:07.495782 containerd[1556]: time="2025-03-20T21:34:07.495753111Z" level=info msg="StartContainer for \"66a5a8fe0293e6e2c3e4b071278795f5a2ed7bab1db1d5b9894476bfe726a003\" returns successfully"
Mar 20 21:34:07.506502 kubelet[2444]: W0320 21:34:07.506418 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Mar 20 21:34:07.506502 kubelet[2444]: E0320 21:34:07.506477 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:07.709801 kubelet[2444]: W0320 21:34:07.709729 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.103:6443: connect: connection refused
Mar 20 21:34:07.709801 kubelet[2444]: E0320 21:34:07.709775 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:07.814812 kubelet[2444]: E0320 21:34:07.814569 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="1.6s"
Mar 20 21:34:07.998185 kubelet[2444]: I0320 21:34:07.998074 2444 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 20 21:34:07.998452 kubelet[2444]: E0320 21:34:07.998424 2444 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost"
Mar 20 21:34:08.036387 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2007727521.mount: Deactivated successfully.
Mar 20 21:34:08.421858 kubelet[2444]: E0320 21:34:08.421829 2444 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.103:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError"
Mar 20 21:34:09.599919 kubelet[2444]: I0320 21:34:09.599898 2444 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 20 21:34:10.035327 kubelet[2444]: E0320 21:34:10.035304 2444 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Mar 20 21:34:10.279598 kubelet[2444]: I0320 21:34:10.278961 2444 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Mar 20 21:34:10.279598 kubelet[2444]: E0320 21:34:10.279004 2444 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
Mar 20 21:34:10.333629 kubelet[2444]: E0320 21:34:10.333548 2444 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 20 21:34:10.391681 kubelet[2444]: I0320 21:34:10.391659 2444 apiserver.go:52] "Watching apiserver"
Mar 20 21:34:10.411850 kubelet[2444]: I0320 21:34:10.411810 2444 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 20 21:34:10.461947 kubelet[2444]: E0320 21:34:10.461754 2444 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost"
Mar 20 21:34:11.999218 systemd[1]: Reload requested from client PID 2717 ('systemctl') (unit session-9.scope)...
Mar 20 21:34:11.999229 systemd[1]: Reloading...
Mar 20 21:34:12.062084 zram_generator::config[2761]: No configuration found.
Mar 20 21:34:12.127528 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Mar 20 21:34:12.145889 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 21:34:12.220598 systemd[1]: Reloading finished in 221 ms.
Mar 20 21:34:12.237819 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:34:12.244628 systemd[1]: kubelet.service: Deactivated successfully.
Mar 20 21:34:12.244791 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:34:12.244843 systemd[1]: kubelet.service: Consumed 637ms CPU time, 116.2M memory peak.
Mar 20 21:34:12.246339 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 21:34:12.814798 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 21:34:12.821478 (kubelet)[2829]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 20 21:34:12.866034 kubelet[2829]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 21:34:12.866034 kubelet[2829]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 21:34:12.866034 kubelet[2829]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 21:34:12.867119 kubelet[2829]: I0320 21:34:12.867092 2829 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 21:34:12.871935 kubelet[2829]: I0320 21:34:12.871912 2829 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 20 21:34:12.872135 kubelet[2829]: I0320 21:34:12.872021 2829 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 21:34:12.872254 kubelet[2829]: I0320 21:34:12.872246 2829 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 20 21:34:12.873018 kubelet[2829]: I0320 21:34:12.873009 2829 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 21:34:12.875451 kubelet[2829]: I0320 21:34:12.875418 2829 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 20 21:34:12.878491 kubelet[2829]: I0320 21:34:12.878474 2829 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 21:34:12.880116 kubelet[2829]: I0320 21:34:12.880100 2829 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 20 21:34:12.880191 kubelet[2829]: I0320 21:34:12.880180 2829 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 21:34:12.880315 kubelet[2829]: I0320 21:34:12.880294 2829 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 21:34:12.880426 kubelet[2829]: I0320 21:34:12.880315 2829 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 21:34:12.892787 kubelet[2829]: I0320 21:34:12.892763 2829 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 21:34:12.892787 kubelet[2829]: I0320 21:34:12.892784 2829 container_manager_linux.go:300] "Creating device plugin manager"
Mar 20 21:34:12.892870 kubelet[2829]: I0320 21:34:12.892818 2829 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 21:34:12.893404 kubelet[2829]: I0320 21:34:12.893390 2829 kubelet.go:408] "Attempting to sync node with API server"
Mar 20 21:34:12.893426 kubelet[2829]: I0320 21:34:12.893406 2829 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 21:34:12.893441 kubelet[2829]: I0320 21:34:12.893427 2829 kubelet.go:314] "Adding apiserver pod source"
Mar 20 21:34:12.901096 kubelet[2829]: I0320 21:34:12.901011 2829 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 21:34:12.903918 kubelet[2829]: I0320 21:34:12.903398 2829 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 20 21:34:12.906302 kubelet[2829]: I0320 21:34:12.905359 2829 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 21:34:12.906302 kubelet[2829]: I0320 21:34:12.905621 2829 server.go:1269] "Started kubelet"
Mar 20 21:34:12.906302 kubelet[2829]: I0320 21:34:12.905844 2829 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 21:34:12.906676 kubelet[2829]: I0320 21:34:12.906659 2829 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 21:34:12.907818 kubelet[2829]: I0320 21:34:12.907805 2829 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 21:34:12.916062 kubelet[2829]: I0320 21:34:12.914445 2829 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 21:34:12.916062 kubelet[2829]: I0320 21:34:12.914571 2829 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 21:34:12.916062 kubelet[2829]: I0320 21:34:12.915037 2829 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 20 21:34:12.918885 kubelet[2829]: E0320 21:34:12.918871 2829 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 20 21:34:12.919396 kubelet[2829]: I0320 21:34:12.919386 2829 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 21:34:12.919506 kubelet[2829]: I0320 21:34:12.919500 2829 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 21:34:12.919626 kubelet[2829]: I0320 21:34:12.919620 2829 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 21:34:12.920961 kubelet[2829]: I0320 21:34:12.920950 2829 factory.go:221] Registration of the systemd container factory successfully
Mar 20 21:34:12.921085 kubelet[2829]: I0320 21:34:12.921073 2829 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 20 21:34:12.922696 kubelet[2829]: I0320 21:34:12.922686 2829 factory.go:221] Registration of the containerd container factory successfully
Mar 20 21:34:12.922791 kubelet[2829]: I0320 21:34:12.922766 2829 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 20 21:34:12.923405 kubelet[2829]: I0320 21:34:12.923393 2829 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 20 21:34:12.923405 kubelet[2829]: I0320 21:34:12.923405 2829 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 20 21:34:12.923461 kubelet[2829]: I0320 21:34:12.923414 2829 kubelet.go:2321] "Starting kubelet main sync loop"
Mar 20 21:34:12.923461 kubelet[2829]: E0320 21:34:12.923434 2829 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 20 21:34:12.955494 kubelet[2829]: I0320 21:34:12.955477 2829 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 20 21:34:12.955494 kubelet[2829]: I0320 21:34:12.955487 2829 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 20 21:34:12.955494 kubelet[2829]: I0320 21:34:12.955498 2829 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 21:34:12.955604 kubelet[2829]: I0320 21:34:12.955586 2829 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 20 21:34:12.955604 kubelet[2829]: I0320 21:34:12.955592 2829 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 20 21:34:12.955604 kubelet[2829]: I0320 21:34:12.955604 2829 policy_none.go:49] "None policy: Start"
Mar 20 21:34:12.957730 kubelet[2829]: I0320 21:34:12.957718 2829 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 20 21:34:12.957776 kubelet[2829]: I0320 21:34:12.957736 2829 state_mem.go:35] "Initializing new in-memory state store"
Mar 20 21:34:12.957826 kubelet[2829]: I0320 21:34:12.957816 2829 state_mem.go:75] "Updated machine memory state"
Mar 20 21:34:12.978806 kubelet[2829]: I0320 21:34:12.978786 2829 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 21:34:12.979435 kubelet[2829]: I0320 21:34:12.978891 2829 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 20 21:34:12.979435 kubelet[2829]: I0320 21:34:12.978901 2829 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 21:34:12.979435 kubelet[2829]: I0320 21:34:12.979196 2829 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 21:34:13.029480 kubelet[2829]: E0320 21:34:13.029449 2829 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:13.083246 kubelet[2829]: I0320 21:34:13.083178 2829 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 20 21:34:13.095439 kubelet[2829]: I0320 21:34:13.095339 2829 kubelet_node_status.go:111] "Node was previously registered" node="localhost"
Mar 20 21:34:13.095439 kubelet[2829]: I0320 21:34:13.095394 2829 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Mar 20 21:34:13.121124 kubelet[2829]: I0320 21:34:13.121104 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f32907a07e55aea05abdc5cd284a8d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"6f32907a07e55aea05abdc5cd284a8d5\") " pod="kube-system/kube-scheduler-localhost"
Mar 20 21:34:13.121124 kubelet[2829]: I0320 21:34:13.121124 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0385bc7ee4efa49cd1d6b0ce3ef31290-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0385bc7ee4efa49cd1d6b0ce3ef31290\") " pod="kube-system/kube-apiserver-localhost"
Mar 20 21:34:13.121235 kubelet[2829]: I0320 21:34:13.121137 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:13.121235 kubelet[2829]: I0320 21:34:13.121146 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:13.121235 kubelet[2829]: I0320 21:34:13.121156 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:13.121235 kubelet[2829]: I0320 21:34:13.121164 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:13.121235 kubelet[2829]: I0320 21:34:13.121172 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0385bc7ee4efa49cd1d6b0ce3ef31290-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0385bc7ee4efa49cd1d6b0ce3ef31290\") " pod="kube-system/kube-apiserver-localhost"
Mar 20 21:34:13.121326 kubelet[2829]: I0320 21:34:13.121180 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0385bc7ee4efa49cd1d6b0ce3ef31290-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0385bc7ee4efa49cd1d6b0ce3ef31290\") " pod="kube-system/kube-apiserver-localhost"
Mar 20 21:34:13.121326 kubelet[2829]: I0320 21:34:13.121188 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 20 21:34:13.902055 kubelet[2829]: I0320 21:34:13.901372 2829 apiserver.go:52] "Watching apiserver"
Mar 20 21:34:13.920465 kubelet[2829]: I0320 21:34:13.920401 2829 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 20 21:34:13.984168 kubelet[2829]: E0320 21:34:13.984144 2829 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Mar 20 21:34:14.044057 kubelet[2829]: I0320 21:34:14.044013 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.043997766 podStartE2EDuration="1.043997766s" podCreationTimestamp="2025-03-20 21:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:34:14.002863307 +0000 UTC m=+1.168495178" watchObservedRunningTime="2025-03-20 21:34:14.043997766 +0000 UTC m=+1.209629629"
Mar 20 21:34:14.056038 kubelet[2829]: I0320 21:34:14.055913 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.055831336
podStartE2EDuration="3.055831336s" podCreationTimestamp="2025-03-20 21:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:34:14.045909317 +0000 UTC m=+1.211541186" watchObservedRunningTime="2025-03-20 21:34:14.055831336 +0000 UTC m=+1.221463212" Mar 20 21:34:14.065012 kubelet[2829]: I0320 21:34:14.063822 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.063798304 podStartE2EDuration="1.063798304s" podCreationTimestamp="2025-03-20 21:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:34:14.057547107 +0000 UTC m=+1.223178979" watchObservedRunningTime="2025-03-20 21:34:14.063798304 +0000 UTC m=+1.229430168" Mar 20 21:34:16.986691 kubelet[2829]: I0320 21:34:16.986644 2829 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 20 21:34:16.987310 kubelet[2829]: I0320 21:34:16.987275 2829 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 20 21:34:16.987349 containerd[1556]: time="2025-03-20T21:34:16.987169474Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 20 21:34:17.441397 systemd[1]: Created slice kubepods-besteffort-podd715a33d_e6d5_4288_b6ce_20cfc20ab695.slice - libcontainer container kubepods-besteffort-podd715a33d_e6d5_4288_b6ce_20cfc20ab695.slice. 
Mar 20 21:34:17.447966 kubelet[2829]: I0320 21:34:17.447886 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d715a33d-e6d5-4288-b6ce-20cfc20ab695-lib-modules\") pod \"kube-proxy-jzxcz\" (UID: \"d715a33d-e6d5-4288-b6ce-20cfc20ab695\") " pod="kube-system/kube-proxy-jzxcz" Mar 20 21:34:17.447966 kubelet[2829]: I0320 21:34:17.447906 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d715a33d-e6d5-4288-b6ce-20cfc20ab695-kube-proxy\") pod \"kube-proxy-jzxcz\" (UID: \"d715a33d-e6d5-4288-b6ce-20cfc20ab695\") " pod="kube-system/kube-proxy-jzxcz" Mar 20 21:34:17.447966 kubelet[2829]: I0320 21:34:17.447918 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d715a33d-e6d5-4288-b6ce-20cfc20ab695-xtables-lock\") pod \"kube-proxy-jzxcz\" (UID: \"d715a33d-e6d5-4288-b6ce-20cfc20ab695\") " pod="kube-system/kube-proxy-jzxcz" Mar 20 21:34:17.447966 kubelet[2829]: I0320 21:34:17.447928 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r4z5\" (UniqueName: \"kubernetes.io/projected/d715a33d-e6d5-4288-b6ce-20cfc20ab695-kube-api-access-4r4z5\") pod \"kube-proxy-jzxcz\" (UID: \"d715a33d-e6d5-4288-b6ce-20cfc20ab695\") " pod="kube-system/kube-proxy-jzxcz" Mar 20 21:34:17.578743 sudo[1871]: pam_unix(sudo:session): session closed for user root Mar 20 21:34:17.583291 kubelet[2829]: E0320 21:34:17.583267 2829 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 20 21:34:17.583291 kubelet[2829]: E0320 21:34:17.583289 2829 projected.go:194] Error preparing data for projected volume kube-api-access-4r4z5 for pod kube-system/kube-proxy-jzxcz: configmap 
"kube-root-ca.crt" not found Mar 20 21:34:17.583395 kubelet[2829]: E0320 21:34:17.583338 2829 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d715a33d-e6d5-4288-b6ce-20cfc20ab695-kube-api-access-4r4z5 podName:d715a33d-e6d5-4288-b6ce-20cfc20ab695 nodeName:}" failed. No retries permitted until 2025-03-20 21:34:18.083320831 +0000 UTC m=+5.248952693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4r4z5" (UniqueName: "kubernetes.io/projected/d715a33d-e6d5-4288-b6ce-20cfc20ab695-kube-api-access-4r4z5") pod "kube-proxy-jzxcz" (UID: "d715a33d-e6d5-4288-b6ce-20cfc20ab695") : configmap "kube-root-ca.crt" not found Mar 20 21:34:17.584633 sshd[1870]: Connection closed by 147.75.109.163 port 55844 Mar 20 21:34:17.602218 sshd-session[1867]: pam_unix(sshd:session): session closed for user core Mar 20 21:34:17.604883 systemd[1]: sshd@6-139.178.70.103:22-147.75.109.163:55844.service: Deactivated successfully. Mar 20 21:34:17.606528 systemd[1]: session-9.scope: Deactivated successfully. Mar 20 21:34:17.606739 systemd[1]: session-9.scope: Consumed 2.918s CPU time, 145.8M memory peak. Mar 20 21:34:17.607938 systemd-logind[1540]: Session 9 logged out. Waiting for processes to exit. Mar 20 21:34:17.608645 systemd-logind[1540]: Removed session 9. Mar 20 21:34:18.043774 systemd[1]: Created slice kubepods-besteffort-pod99cf8544_48df_40e8_a0d8_1254130b99a1.slice - libcontainer container kubepods-besteffort-pod99cf8544_48df_40e8_a0d8_1254130b99a1.slice. 
Mar 20 21:34:18.050906 kubelet[2829]: I0320 21:34:18.050879 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqgzt\" (UniqueName: \"kubernetes.io/projected/99cf8544-48df-40e8-a0d8-1254130b99a1-kube-api-access-nqgzt\") pod \"tigera-operator-64ff5465b7-jmbc6\" (UID: \"99cf8544-48df-40e8-a0d8-1254130b99a1\") " pod="tigera-operator/tigera-operator-64ff5465b7-jmbc6" Mar 20 21:34:18.051170 kubelet[2829]: I0320 21:34:18.050918 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/99cf8544-48df-40e8-a0d8-1254130b99a1-var-lib-calico\") pod \"tigera-operator-64ff5465b7-jmbc6\" (UID: \"99cf8544-48df-40e8-a0d8-1254130b99a1\") " pod="tigera-operator/tigera-operator-64ff5465b7-jmbc6" Mar 20 21:34:18.347607 containerd[1556]: time="2025-03-20T21:34:18.347335322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jzxcz,Uid:d715a33d-e6d5-4288-b6ce-20cfc20ab695,Namespace:kube-system,Attempt:0,}" Mar 20 21:34:18.347607 containerd[1556]: time="2025-03-20T21:34:18.347337387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-jmbc6,Uid:99cf8544-48df-40e8-a0d8-1254130b99a1,Namespace:tigera-operator,Attempt:0,}" Mar 20 21:34:18.426710 containerd[1556]: time="2025-03-20T21:34:18.426326482Z" level=info msg="connecting to shim 93dcbf0aa28c03c6a5865c8d0356ac9d1093adba1e68682d3a7c99724c0cf905" address="unix:///run/containerd/s/4dcd37817d9597f15373798bc718b66455ccaa4f0e49b9c1aaea790710ce9774" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:18.438060 containerd[1556]: time="2025-03-20T21:34:18.437800428Z" level=info msg="connecting to shim 3324f3bcf50c4ff6ac72b7e3e3a1abd45c45e7a5c73407a58d70a93bd2679703" address="unix:///run/containerd/s/0588f1bec4c9368ad96c05593dcc57fe78d4355621aa05eae627277336d4eeb4" namespace=k8s.io protocol=ttrpc version=3 Mar 20 
21:34:18.458191 systemd[1]: Started cri-containerd-93dcbf0aa28c03c6a5865c8d0356ac9d1093adba1e68682d3a7c99724c0cf905.scope - libcontainer container 93dcbf0aa28c03c6a5865c8d0356ac9d1093adba1e68682d3a7c99724c0cf905. Mar 20 21:34:18.461112 systemd[1]: Started cri-containerd-3324f3bcf50c4ff6ac72b7e3e3a1abd45c45e7a5c73407a58d70a93bd2679703.scope - libcontainer container 3324f3bcf50c4ff6ac72b7e3e3a1abd45c45e7a5c73407a58d70a93bd2679703. Mar 20 21:34:18.488781 containerd[1556]: time="2025-03-20T21:34:18.488755855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jzxcz,Uid:d715a33d-e6d5-4288-b6ce-20cfc20ab695,Namespace:kube-system,Attempt:0,} returns sandbox id \"93dcbf0aa28c03c6a5865c8d0356ac9d1093adba1e68682d3a7c99724c0cf905\"" Mar 20 21:34:18.490314 containerd[1556]: time="2025-03-20T21:34:18.490295970Z" level=info msg="CreateContainer within sandbox \"93dcbf0aa28c03c6a5865c8d0356ac9d1093adba1e68682d3a7c99724c0cf905\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 20 21:34:18.533532 containerd[1556]: time="2025-03-20T21:34:18.533470812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-jmbc6,Uid:99cf8544-48df-40e8-a0d8-1254130b99a1,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3324f3bcf50c4ff6ac72b7e3e3a1abd45c45e7a5c73407a58d70a93bd2679703\"" Mar 20 21:34:18.534475 containerd[1556]: time="2025-03-20T21:34:18.534375430Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 20 21:34:18.567979 containerd[1556]: time="2025-03-20T21:34:18.567940918Z" level=info msg="Container 08caf0a465ed6a072075871d5093fd94162767fc1011c9e7d546ace9a71a3f9e: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:18.572884 containerd[1556]: time="2025-03-20T21:34:18.572810424Z" level=info msg="CreateContainer within sandbox \"93dcbf0aa28c03c6a5865c8d0356ac9d1093adba1e68682d3a7c99724c0cf905\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id 
\"08caf0a465ed6a072075871d5093fd94162767fc1011c9e7d546ace9a71a3f9e\"" Mar 20 21:34:18.575439 containerd[1556]: time="2025-03-20T21:34:18.575409252Z" level=info msg="StartContainer for \"08caf0a465ed6a072075871d5093fd94162767fc1011c9e7d546ace9a71a3f9e\"" Mar 20 21:34:18.577901 containerd[1556]: time="2025-03-20T21:34:18.577868278Z" level=info msg="connecting to shim 08caf0a465ed6a072075871d5093fd94162767fc1011c9e7d546ace9a71a3f9e" address="unix:///run/containerd/s/4dcd37817d9597f15373798bc718b66455ccaa4f0e49b9c1aaea790710ce9774" protocol=ttrpc version=3 Mar 20 21:34:18.596257 systemd[1]: Started cri-containerd-08caf0a465ed6a072075871d5093fd94162767fc1011c9e7d546ace9a71a3f9e.scope - libcontainer container 08caf0a465ed6a072075871d5093fd94162767fc1011c9e7d546ace9a71a3f9e. Mar 20 21:34:18.621993 containerd[1556]: time="2025-03-20T21:34:18.621955482Z" level=info msg="StartContainer for \"08caf0a465ed6a072075871d5093fd94162767fc1011c9e7d546ace9a71a3f9e\" returns successfully" Mar 20 21:34:21.788304 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1144457057.mount: Deactivated successfully. 
Mar 20 21:34:22.336330 containerd[1556]: time="2025-03-20T21:34:22.336243506Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:22.349913 containerd[1556]: time="2025-03-20T21:34:22.349872320Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 20 21:34:22.364709 containerd[1556]: time="2025-03-20T21:34:22.364674108Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:22.380859 containerd[1556]: time="2025-03-20T21:34:22.380825213Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:22.381510 containerd[1556]: time="2025-03-20T21:34:22.381207960Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 3.846781855s" Mar 20 21:34:22.381510 containerd[1556]: time="2025-03-20T21:34:22.381230003Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 20 21:34:22.382547 containerd[1556]: time="2025-03-20T21:34:22.382520940Z" level=info msg="CreateContainer within sandbox \"3324f3bcf50c4ff6ac72b7e3e3a1abd45c45e7a5c73407a58d70a93bd2679703\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 20 21:34:22.449984 containerd[1556]: time="2025-03-20T21:34:22.449952595Z" level=info msg="Container 
e8aa5c69b9dc33cefc57f10569efe0a0279661a6c6774992ee5082adb095242e: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:22.499354 containerd[1556]: time="2025-03-20T21:34:22.499322357Z" level=info msg="CreateContainer within sandbox \"3324f3bcf50c4ff6ac72b7e3e3a1abd45c45e7a5c73407a58d70a93bd2679703\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e8aa5c69b9dc33cefc57f10569efe0a0279661a6c6774992ee5082adb095242e\"" Mar 20 21:34:22.500618 containerd[1556]: time="2025-03-20T21:34:22.499837080Z" level=info msg="StartContainer for \"e8aa5c69b9dc33cefc57f10569efe0a0279661a6c6774992ee5082adb095242e\"" Mar 20 21:34:22.500618 containerd[1556]: time="2025-03-20T21:34:22.500404884Z" level=info msg="connecting to shim e8aa5c69b9dc33cefc57f10569efe0a0279661a6c6774992ee5082adb095242e" address="unix:///run/containerd/s/0588f1bec4c9368ad96c05593dcc57fe78d4355621aa05eae627277336d4eeb4" protocol=ttrpc version=3 Mar 20 21:34:22.516167 systemd[1]: Started cri-containerd-e8aa5c69b9dc33cefc57f10569efe0a0279661a6c6774992ee5082adb095242e.scope - libcontainer container e8aa5c69b9dc33cefc57f10569efe0a0279661a6c6774992ee5082adb095242e. 
Mar 20 21:34:22.545325 containerd[1556]: time="2025-03-20T21:34:22.545193182Z" level=info msg="StartContainer for \"e8aa5c69b9dc33cefc57f10569efe0a0279661a6c6774992ee5082adb095242e\" returns successfully" Mar 20 21:34:22.977013 kubelet[2829]: I0320 21:34:22.976615 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jzxcz" podStartSLOduration=5.976602354 podStartE2EDuration="5.976602354s" podCreationTimestamp="2025-03-20 21:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:34:18.965458644 +0000 UTC m=+6.131090520" watchObservedRunningTime="2025-03-20 21:34:22.976602354 +0000 UTC m=+10.142234223" Mar 20 21:34:22.977013 kubelet[2829]: I0320 21:34:22.976724 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-64ff5465b7-jmbc6" podStartSLOduration=2.129069075 podStartE2EDuration="5.97671769s" podCreationTimestamp="2025-03-20 21:34:17 +0000 UTC" firstStartedPulling="2025-03-20 21:34:18.534151639 +0000 UTC m=+5.699783499" lastFinishedPulling="2025-03-20 21:34:22.381800249 +0000 UTC m=+9.547432114" observedRunningTime="2025-03-20 21:34:22.976268769 +0000 UTC m=+10.141900634" watchObservedRunningTime="2025-03-20 21:34:22.97671769 +0000 UTC m=+10.142349559" Mar 20 21:34:25.612932 systemd[1]: Created slice kubepods-besteffort-pod5817264a_2fc1_4344_970b_f88ce7a77bd6.slice - libcontainer container kubepods-besteffort-pod5817264a_2fc1_4344_970b_f88ce7a77bd6.slice. Mar 20 21:34:25.650071 systemd[1]: Created slice kubepods-besteffort-pod392fe464_00a3_419d_9a63_19dc8518fdfc.slice - libcontainer container kubepods-besteffort-pod392fe464_00a3_419d_9a63_19dc8518fdfc.slice. 
Mar 20 21:34:25.696510 kubelet[2829]: I0320 21:34:25.696217 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv8fz\" (UniqueName: \"kubernetes.io/projected/5817264a-2fc1-4344-970b-f88ce7a77bd6-kube-api-access-sv8fz\") pod \"calico-typha-6457cc644c-vmjxp\" (UID: \"5817264a-2fc1-4344-970b-f88ce7a77bd6\") " pod="calico-system/calico-typha-6457cc644c-vmjxp" Mar 20 21:34:25.696510 kubelet[2829]: I0320 21:34:25.696245 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-var-lib-calico\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696510 kubelet[2829]: I0320 21:34:25.696263 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-var-run-calico\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696510 kubelet[2829]: I0320 21:34:25.696276 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-log-dir\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696510 kubelet[2829]: I0320 21:34:25.696286 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-flexvol-driver-host\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696840 
kubelet[2829]: I0320 21:34:25.696295 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/392fe464-00a3-419d-9a63-19dc8518fdfc-node-certs\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696840 kubelet[2829]: I0320 21:34:25.696304 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h75pd\" (UniqueName: \"kubernetes.io/projected/392fe464-00a3-419d-9a63-19dc8518fdfc-kube-api-access-h75pd\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696840 kubelet[2829]: I0320 21:34:25.696331 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5817264a-2fc1-4344-970b-f88ce7a77bd6-typha-certs\") pod \"calico-typha-6457cc644c-vmjxp\" (UID: \"5817264a-2fc1-4344-970b-f88ce7a77bd6\") " pod="calico-system/calico-typha-6457cc644c-vmjxp" Mar 20 21:34:25.696840 kubelet[2829]: I0320 21:34:25.696349 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-lib-modules\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696840 kubelet[2829]: I0320 21:34:25.696359 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-xtables-lock\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696945 kubelet[2829]: I0320 21:34:25.696369 2829 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-policysync\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696945 kubelet[2829]: I0320 21:34:25.696383 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5817264a-2fc1-4344-970b-f88ce7a77bd6-tigera-ca-bundle\") pod \"calico-typha-6457cc644c-vmjxp\" (UID: \"5817264a-2fc1-4344-970b-f88ce7a77bd6\") " pod="calico-system/calico-typha-6457cc644c-vmjxp" Mar 20 21:34:25.696945 kubelet[2829]: I0320 21:34:25.696400 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/392fe464-00a3-419d-9a63-19dc8518fdfc-tigera-ca-bundle\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696945 kubelet[2829]: I0320 21:34:25.696417 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-bin-dir\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.696945 kubelet[2829]: I0320 21:34:25.696429 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-net-dir\") pod \"calico-node-hqg5z\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " pod="calico-system/calico-node-hqg5z" Mar 20 21:34:25.742943 kubelet[2829]: E0320 21:34:25.742889 2829 pod_workers.go:1301] "Error syncing pod, skipping" 
err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7zx9v" podUID="0bea6db8-8efa-4d44-ac84-6c553760f208" Mar 20 21:34:25.797099 kubelet[2829]: I0320 21:34:25.796601 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0bea6db8-8efa-4d44-ac84-6c553760f208-registration-dir\") pod \"csi-node-driver-7zx9v\" (UID: \"0bea6db8-8efa-4d44-ac84-6c553760f208\") " pod="calico-system/csi-node-driver-7zx9v" Mar 20 21:34:25.797099 kubelet[2829]: I0320 21:34:25.796697 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0bea6db8-8efa-4d44-ac84-6c553760f208-varrun\") pod \"csi-node-driver-7zx9v\" (UID: \"0bea6db8-8efa-4d44-ac84-6c553760f208\") " pod="calico-system/csi-node-driver-7zx9v" Mar 20 21:34:25.797099 kubelet[2829]: I0320 21:34:25.796751 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0bea6db8-8efa-4d44-ac84-6c553760f208-socket-dir\") pod \"csi-node-driver-7zx9v\" (UID: \"0bea6db8-8efa-4d44-ac84-6c553760f208\") " pod="calico-system/csi-node-driver-7zx9v" Mar 20 21:34:25.797099 kubelet[2829]: I0320 21:34:25.796784 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bea6db8-8efa-4d44-ac84-6c553760f208-kubelet-dir\") pod \"csi-node-driver-7zx9v\" (UID: \"0bea6db8-8efa-4d44-ac84-6c553760f208\") " pod="calico-system/csi-node-driver-7zx9v" Mar 20 21:34:25.797099 kubelet[2829]: I0320 21:34:25.796800 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vgdmk\" (UniqueName: \"kubernetes.io/projected/0bea6db8-8efa-4d44-ac84-6c553760f208-kube-api-access-vgdmk\") pod \"csi-node-driver-7zx9v\" (UID: \"0bea6db8-8efa-4d44-ac84-6c553760f208\") " pod="calico-system/csi-node-driver-7zx9v" Mar 20 21:34:25.821249 kubelet[2829]: E0320 21:34:25.818895 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:25.821249 kubelet[2829]: W0320 21:34:25.818909 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:25.821249 kubelet[2829]: E0320 21:34:25.818927 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:25.824335 kubelet[2829]: E0320 21:34:25.824320 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:25.824432 kubelet[2829]: W0320 21:34:25.824422 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:25.824763 kubelet[2829]: E0320 21:34:25.824579 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:25.829128 kubelet[2829]: E0320 21:34:25.829113 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:25.829294 kubelet[2829]: W0320 21:34:25.829210 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:25.829294 kubelet[2829]: E0320 21:34:25.829237 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:25.834955 kubelet[2829]: E0320 21:34:25.829576 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:25.834955 kubelet[2829]: W0320 21:34:25.829582 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:25.834955 kubelet[2829]: E0320 21:34:25.829588 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:25.898432 kubelet[2829]: E0320 21:34:25.897678 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:25.898432 kubelet[2829]: W0320 21:34:25.897751 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:25.898432 kubelet[2829]: E0320 21:34:25.897767 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:25.898625 kubelet[2829]: E0320 21:34:25.898614 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:25.898625 kubelet[2829]: W0320 21:34:25.898623 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:25.898711 kubelet[2829]: E0320 21:34:25.898632 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:25.908340 kubelet[2829]: E0320 21:34:25.908302 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:25.908340 kubelet[2829]: W0320 21:34:25.908313 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:25.908340 kubelet[2829]: E0320 21:34:25.908321 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:25.937146 containerd[1556]: time="2025-03-20T21:34:25.937065015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6457cc644c-vmjxp,Uid:5817264a-2fc1-4344-970b-f88ce7a77bd6,Namespace:calico-system,Attempt:0,}" Mar 20 21:34:25.964183 containerd[1556]: time="2025-03-20T21:34:25.963833032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hqg5z,Uid:392fe464-00a3-419d-9a63-19dc8518fdfc,Namespace:calico-system,Attempt:0,}" Mar 20 21:34:26.001855 kubelet[2829]: E0320 21:34:26.001797 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.001855 kubelet[2829]: W0320 21:34:26.001811 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.001855 kubelet[2829]: E0320 21:34:26.001824 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:26.102455 kubelet[2829]: E0320 21:34:26.102394 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.102455 kubelet[2829]: W0320 21:34:26.102408 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.102455 kubelet[2829]: E0320 21:34:26.102423 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:26.203609 kubelet[2829]: E0320 21:34:26.203544 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.203756 kubelet[2829]: W0320 21:34:26.203707 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.203756 kubelet[2829]: E0320 21:34:26.203725 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:26.219383 containerd[1556]: time="2025-03-20T21:34:26.219355844Z" level=info msg="connecting to shim f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d" address="unix:///run/containerd/s/6b186b9dd9929f6119bafe73135cd1cdc8247abd95341f2eff1796a4c1eefc5b" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:26.266196 systemd[1]: Started cri-containerd-f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d.scope - libcontainer container f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d. 
Mar 20 21:34:26.304498 kubelet[2829]: E0320 21:34:26.304451 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.304498 kubelet[2829]: W0320 21:34:26.304464 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.304498 kubelet[2829]: E0320 21:34:26.304476 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:26.365305 kubelet[2829]: E0320 21:34:26.365228 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.365305 kubelet[2829]: W0320 21:34:26.365243 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.365305 kubelet[2829]: E0320 21:34:26.365255 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:26.365566 containerd[1556]: time="2025-03-20T21:34:26.365506932Z" level=info msg="connecting to shim e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653" address="unix:///run/containerd/s/e42d18309c0a4c8a8a389851ce272df50af64fee6a089727af344d009b8850fd" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:26.370282 containerd[1556]: time="2025-03-20T21:34:26.370256128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6457cc644c-vmjxp,Uid:5817264a-2fc1-4344-970b-f88ce7a77bd6,Namespace:calico-system,Attempt:0,} returns sandbox id \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\"" Mar 20 21:34:26.371238 containerd[1556]: time="2025-03-20T21:34:26.371121670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 20 21:34:26.386242 systemd[1]: Started cri-containerd-e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653.scope - libcontainer container e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653. 
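The directory name nodeagent~uds in the failing path follows the documented FlexVolume layout: plugin directories are named vendor~driver, and the kubelet expects an executable inside carrying the driver half of the name. A small sketch of that mapping (the helper name execPath is ours, not a kubelet function):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// execPath derives the executable a FlexVolume probe will invoke from a
// plugin directory name, following the "vendor~driver" convention:
// the binary inside the directory must be named after the driver part.
func execPath(pluginDir, dirName string) string {
	parts := strings.SplitN(dirName, "~", 2)
	driver := parts[len(parts)-1]
	return filepath.Join(pluginDir, dirName, driver)
}

func main() {
	fmt.Println(execPath(
		"/opt/libexec/kubernetes/kubelet-plugins/volume/exec",
		"nodeagent~uds"))
	// /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds
}
```

That derived path is exactly the executable the log reports as "not found in $PATH"; installing a binary named uds at that location (or removing the stale plugin directory) would stop the probe loop.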
Mar 20 21:34:26.574031 containerd[1556]: time="2025-03-20T21:34:26.573749630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hqg5z,Uid:392fe464-00a3-419d-9a63-19dc8518fdfc,Namespace:calico-system,Attempt:0,} returns sandbox id \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\"" Mar 20 21:34:26.898467 kubelet[2829]: E0320 21:34:26.898442 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.898467 kubelet[2829]: W0320 21:34:26.898459 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.898763 kubelet[2829]: E0320 21:34:26.898486 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:26.898763 kubelet[2829]: E0320 21:34:26.898647 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.898763 kubelet[2829]: W0320 21:34:26.898652 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.898763 kubelet[2829]: E0320 21:34:26.898658 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Error: unexpected end of JSON input" Mar 20 21:34:26.902008 kubelet[2829]: E0320 21:34:26.902001 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.902027 kubelet[2829]: W0320 21:34:26.902007 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.902027 kubelet[2829]: E0320 21:34:26.902014 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:26.902187 kubelet[2829]: E0320 21:34:26.902175 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.902187 kubelet[2829]: W0320 21:34:26.902184 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.902227 kubelet[2829]: E0320 21:34:26.902191 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:26.902309 kubelet[2829]: E0320 21:34:26.902297 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:26.902333 kubelet[2829]: W0320 21:34:26.902308 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:26.902333 kubelet[2829]: E0320 21:34:26.902317 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:26.924820 kubelet[2829]: E0320 21:34:26.924779 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7zx9v" podUID="0bea6db8-8efa-4d44-ac84-6c553760f208" Mar 20 21:34:28.781895 containerd[1556]: time="2025-03-20T21:34:28.781848592Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:28.801587 containerd[1556]: time="2025-03-20T21:34:28.801540734Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 20 21:34:28.827115 containerd[1556]: time="2025-03-20T21:34:28.827040691Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:28.842264 containerd[1556]: time="2025-03-20T21:34:28.842212126Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:28.842814 containerd[1556]: time="2025-03-20T21:34:28.842483831Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 2.471337654s" Mar 20 21:34:28.842814 containerd[1556]: time="2025-03-20T21:34:28.842508742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 20 21:34:28.843642 containerd[1556]: time="2025-03-20T21:34:28.843621846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 20 21:34:28.852686 containerd[1556]: time="2025-03-20T21:34:28.852654019Z" level=info msg="CreateContainer within sandbox \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 20 21:34:28.904525 containerd[1556]: time="2025-03-20T21:34:28.904498750Z" level=info msg="Container 45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:28.924243 kubelet[2829]: E0320 21:34:28.924212 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7zx9v" podUID="0bea6db8-8efa-4d44-ac84-6c553760f208" Mar 20 21:34:28.943560 containerd[1556]: time="2025-03-20T21:34:28.943527589Z" level=info msg="CreateContainer within sandbox \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" for 
&ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\"" Mar 20 21:34:28.944164 containerd[1556]: time="2025-03-20T21:34:28.944125579Z" level=info msg="StartContainer for \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\"" Mar 20 21:34:28.945031 containerd[1556]: time="2025-03-20T21:34:28.945009177Z" level=info msg="connecting to shim 45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c" address="unix:///run/containerd/s/6b186b9dd9929f6119bafe73135cd1cdc8247abd95341f2eff1796a4c1eefc5b" protocol=ttrpc version=3 Mar 20 21:34:28.968256 systemd[1]: Started cri-containerd-45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c.scope - libcontainer container 45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c. Mar 20 21:34:29.012604 containerd[1556]: time="2025-03-20T21:34:29.012574019Z" level=info msg="StartContainer for \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" returns successfully" Mar 20 21:34:30.031121 kubelet[2829]: E0320 21:34:30.030890 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.031121 kubelet[2829]: W0320 21:34:30.030926 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.031121 kubelet[2829]: E0320 21:34:30.030942 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.031121 kubelet[2829]: E0320 21:34:30.031126 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.031494 kubelet[2829]: W0320 21:34:30.031132 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.031494 kubelet[2829]: E0320 21:34:30.031141 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.031924 kubelet[2829]: I0320 21:34:30.031628 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6457cc644c-vmjxp" podStartSLOduration=2.534867691 podStartE2EDuration="5.006988875s" podCreationTimestamp="2025-03-20 21:34:25 +0000 UTC" firstStartedPulling="2025-03-20 21:34:26.370989684 +0000 UTC m=+13.536621544" lastFinishedPulling="2025-03-20 21:34:28.843110863 +0000 UTC m=+16.008742728" observedRunningTime="2025-03-20 21:34:30.00608949 +0000 UTC m=+17.171721359" watchObservedRunningTime="2025-03-20 21:34:30.006988875 +0000 UTC m=+17.172620743" Mar 20 21:34:30.031924 kubelet[2829]: E0320 21:34:30.031742 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.031924 kubelet[2829]: W0320 21:34:30.031748 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.031924 kubelet[2829]: E0320 21:34:30.031756 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.032114 kubelet[2829]: E0320 21:34:30.031996 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.032114 kubelet[2829]: W0320 21:34:30.032002 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.032114 kubelet[2829]: E0320 21:34:30.032007 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.032344 kubelet[2829]: E0320 21:34:30.032332 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.032344 kubelet[2829]: W0320 21:34:30.032340 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.032396 kubelet[2829]: E0320 21:34:30.032347 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.032468 kubelet[2829]: E0320 21:34:30.032459 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.032468 kubelet[2829]: W0320 21:34:30.032466 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.032546 kubelet[2829]: E0320 21:34:30.032471 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.032958 kubelet[2829]: E0320 21:34:30.032940 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.032958 kubelet[2829]: W0320 21:34:30.032951 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.032958 kubelet[2829]: E0320 21:34:30.032958 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.033147 kubelet[2829]: E0320 21:34:30.033089 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.033147 kubelet[2829]: W0320 21:34:30.033100 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.033147 kubelet[2829]: E0320 21:34:30.033105 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.033247 kubelet[2829]: E0320 21:34:30.033236 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.033247 kubelet[2829]: W0320 21:34:30.033243 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.033301 kubelet[2829]: E0320 21:34:30.033249 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.033916 kubelet[2829]: E0320 21:34:30.033887 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.033916 kubelet[2829]: W0320 21:34:30.033895 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.033916 kubelet[2829]: E0320 21:34:30.033902 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.034279 kubelet[2829]: E0320 21:34:30.034260 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.034279 kubelet[2829]: W0320 21:34:30.034266 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.034279 kubelet[2829]: E0320 21:34:30.034272 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.034400 kubelet[2829]: E0320 21:34:30.034381 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.034400 kubelet[2829]: W0320 21:34:30.034388 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.034448 kubelet[2829]: E0320 21:34:30.034435 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.034631 kubelet[2829]: E0320 21:34:30.034612 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.034631 kubelet[2829]: W0320 21:34:30.034618 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.034631 kubelet[2829]: E0320 21:34:30.034623 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.034801 kubelet[2829]: E0320 21:34:30.034724 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.034801 kubelet[2829]: W0320 21:34:30.034731 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.034801 kubelet[2829]: E0320 21:34:30.034738 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.034888 kubelet[2829]: E0320 21:34:30.034834 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.034888 kubelet[2829]: W0320 21:34:30.034838 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.034888 kubelet[2829]: E0320 21:34:30.034843 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.043430 kubelet[2829]: E0320 21:34:30.043412 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.043430 kubelet[2829]: W0320 21:34:30.043426 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.043516 kubelet[2829]: E0320 21:34:30.043439 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.043615 kubelet[2829]: E0320 21:34:30.043605 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.043644 kubelet[2829]: W0320 21:34:30.043614 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.043884 kubelet[2829]: E0320 21:34:30.043625 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.044039 kubelet[2829]: E0320 21:34:30.043970 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.044039 kubelet[2829]: W0320 21:34:30.043979 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.044039 kubelet[2829]: E0320 21:34:30.043986 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.044152 kubelet[2829]: E0320 21:34:30.044141 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.044226 kubelet[2829]: W0320 21:34:30.044178 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.044226 kubelet[2829]: E0320 21:34:30.044190 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.044387 kubelet[2829]: E0320 21:34:30.044319 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.044387 kubelet[2829]: W0320 21:34:30.044329 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.044387 kubelet[2829]: E0320 21:34:30.044334 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.044510 kubelet[2829]: E0320 21:34:30.044503 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.044677 kubelet[2829]: W0320 21:34:30.044540 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.044677 kubelet[2829]: E0320 21:34:30.044548 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.044750 kubelet[2829]: E0320 21:34:30.044744 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.044787 kubelet[2829]: W0320 21:34:30.044782 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.044825 kubelet[2829]: E0320 21:34:30.044819 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.051355 kubelet[2829]: E0320 21:34:30.051343 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.051502 kubelet[2829]: W0320 21:34:30.051415 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.051502 kubelet[2829]: E0320 21:34:30.051428 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.051615 kubelet[2829]: E0320 21:34:30.051609 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.051705 kubelet[2829]: W0320 21:34:30.051649 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.051705 kubelet[2829]: E0320 21:34:30.051656 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.051883 kubelet[2829]: E0320 21:34:30.051821 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.051883 kubelet[2829]: W0320 21:34:30.051829 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.051883 kubelet[2829]: E0320 21:34:30.051835 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.052110 kubelet[2829]: E0320 21:34:30.052050 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.052110 kubelet[2829]: W0320 21:34:30.052060 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.052110 kubelet[2829]: E0320 21:34:30.052072 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.054430 kubelet[2829]: E0320 21:34:30.054357 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.054430 kubelet[2829]: W0320 21:34:30.054367 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.054430 kubelet[2829]: E0320 21:34:30.054373 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.054534 kubelet[2829]: E0320 21:34:30.054528 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.054580 kubelet[2829]: W0320 21:34:30.054574 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.054755 kubelet[2829]: E0320 21:34:30.054614 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.054824 kubelet[2829]: E0320 21:34:30.054817 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.054909 kubelet[2829]: W0320 21:34:30.054854 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.054909 kubelet[2829]: E0320 21:34:30.054862 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.054994 kubelet[2829]: E0320 21:34:30.054988 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.058944 kubelet[2829]: W0320 21:34:30.055084 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.058944 kubelet[2829]: E0320 21:34:30.055160 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.058944 kubelet[2829]: E0320 21:34:30.055277 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.058944 kubelet[2829]: W0320 21:34:30.055282 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.058944 kubelet[2829]: E0320 21:34:30.055287 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.058944 kubelet[2829]: E0320 21:34:30.055451 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.058944 kubelet[2829]: W0320 21:34:30.055457 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.058944 kubelet[2829]: E0320 21:34:30.055462 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 21:34:30.058944 kubelet[2829]: E0320 21:34:30.055765 2829 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 21:34:30.058944 kubelet[2829]: W0320 21:34:30.055770 2829 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 21:34:30.059174 kubelet[2829]: E0320 21:34:30.055776 2829 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 21:34:30.348813 containerd[1556]: time="2025-03-20T21:34:30.348731748Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:30.350793 containerd[1556]: time="2025-03-20T21:34:30.350759321Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 20 21:34:30.355462 containerd[1556]: time="2025-03-20T21:34:30.355441507Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:30.360461 containerd[1556]: time="2025-03-20T21:34:30.360436846Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:30.361195 containerd[1556]: time="2025-03-20T21:34:30.360916625Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 1.517275026s" Mar 20 21:34:30.361195 containerd[1556]: time="2025-03-20T21:34:30.360938245Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 20 21:34:30.363021 containerd[1556]: time="2025-03-20T21:34:30.362952033Z" level=info msg="CreateContainer within sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 20 21:34:30.379149 containerd[1556]: time="2025-03-20T21:34:30.379127272Z" level=info msg="Container d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:30.382129 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount372680830.mount: Deactivated successfully. Mar 20 21:34:30.385864 containerd[1556]: time="2025-03-20T21:34:30.384960826Z" level=info msg="CreateContainer within sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\"" Mar 20 21:34:30.386224 containerd[1556]: time="2025-03-20T21:34:30.386097555Z" level=info msg="StartContainer for \"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\"" Mar 20 21:34:30.387286 containerd[1556]: time="2025-03-20T21:34:30.387244308Z" level=info msg="connecting to shim d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460" address="unix:///run/containerd/s/e42d18309c0a4c8a8a389851ce272df50af64fee6a089727af344d009b8850fd" protocol=ttrpc version=3 Mar 20 21:34:30.411211 systemd[1]: Started cri-containerd-d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460.scope - libcontainer container d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460. Mar 20 21:34:30.467588 containerd[1556]: time="2025-03-20T21:34:30.467517987Z" level=info msg="StartContainer for \"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\" returns successfully" Mar 20 21:34:30.471471 systemd[1]: cri-containerd-d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460.scope: Deactivated successfully. 
Mar 20 21:34:30.494537 containerd[1556]: time="2025-03-20T21:34:30.494506482Z" level=info msg="received exit event container_id:\"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\" id:\"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\" pid:3467 exited_at:{seconds:1742506470 nanos:473857880}" Mar 20 21:34:30.523950 containerd[1556]: time="2025-03-20T21:34:30.523913878Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\" id:\"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\" pid:3467 exited_at:{seconds:1742506470 nanos:473857880}" Mar 20 21:34:30.532610 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460-rootfs.mount: Deactivated successfully. Mar 20 21:34:30.925325 kubelet[2829]: E0320 21:34:30.925292 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7zx9v" podUID="0bea6db8-8efa-4d44-ac84-6c553760f208" Mar 20 21:34:30.988365 kubelet[2829]: I0320 21:34:30.988343 2829 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 21:34:31.991373 containerd[1556]: time="2025-03-20T21:34:31.991345897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 20 21:34:32.925897 kubelet[2829]: E0320 21:34:32.925867 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7zx9v" podUID="0bea6db8-8efa-4d44-ac84-6c553760f208" Mar 20 21:34:34.924427 kubelet[2829]: E0320 21:34:34.924399 2829 pod_workers.go:1301] 
"Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7zx9v" podUID="0bea6db8-8efa-4d44-ac84-6c553760f208" Mar 20 21:34:35.851241 containerd[1556]: time="2025-03-20T21:34:35.851207326Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:35.854409 containerd[1556]: time="2025-03-20T21:34:35.854295746Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 20 21:34:35.856933 containerd[1556]: time="2025-03-20T21:34:35.854836005Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:35.856933 containerd[1556]: time="2025-03-20T21:34:35.856249126Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:35.856933 containerd[1556]: time="2025-03-20T21:34:35.856816480Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 3.865432141s" Mar 20 21:34:35.856933 containerd[1556]: time="2025-03-20T21:34:35.856834359Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 20 21:34:35.860630 containerd[1556]: time="2025-03-20T21:34:35.859872556Z" 
level=info msg="CreateContainer within sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 20 21:34:35.866402 containerd[1556]: time="2025-03-20T21:34:35.863907621Z" level=info msg="Container d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:35.881110 containerd[1556]: time="2025-03-20T21:34:35.881013556Z" level=info msg="CreateContainer within sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\"" Mar 20 21:34:35.881658 containerd[1556]: time="2025-03-20T21:34:35.881527957Z" level=info msg="StartContainer for \"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\"" Mar 20 21:34:35.882982 containerd[1556]: time="2025-03-20T21:34:35.882858671Z" level=info msg="connecting to shim d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c" address="unix:///run/containerd/s/e42d18309c0a4c8a8a389851ce272df50af64fee6a089727af344d009b8850fd" protocol=ttrpc version=3 Mar 20 21:34:35.924249 systemd[1]: Started cri-containerd-d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c.scope - libcontainer container d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c. 
Mar 20 21:34:36.003807 containerd[1556]: time="2025-03-20T21:34:36.003782659Z" level=info msg="StartContainer for \"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\" returns successfully" Mar 20 21:34:36.924179 kubelet[2829]: E0320 21:34:36.924152 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7zx9v" podUID="0bea6db8-8efa-4d44-ac84-6c553760f208" Mar 20 21:34:37.622039 systemd[1]: cri-containerd-d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c.scope: Deactivated successfully. Mar 20 21:34:37.622264 systemd[1]: cri-containerd-d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c.scope: Consumed 289ms CPU time, 147.8M memory peak, 172K read from disk, 154M written to disk. Mar 20 21:34:37.628166 containerd[1556]: time="2025-03-20T21:34:37.628142444Z" level=info msg="received exit event container_id:\"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\" id:\"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\" pid:3525 exited_at:{seconds:1742506477 nanos:627928834}" Mar 20 21:34:37.631221 containerd[1556]: time="2025-03-20T21:34:37.628328019Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\" id:\"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\" pid:3525 exited_at:{seconds:1742506477 nanos:627928834}" Mar 20 21:34:37.650632 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c-rootfs.mount: Deactivated successfully. 
Mar 20 21:34:37.659654 kubelet[2829]: I0320 21:34:37.659629 2829 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Mar 20 21:34:37.772810 systemd[1]: Created slice kubepods-besteffort-pod1af09251_b417_4ce1_a535_368f260e296f.slice - libcontainer container kubepods-besteffort-pod1af09251_b417_4ce1_a535_368f260e296f.slice. Mar 20 21:34:37.781312 systemd[1]: Created slice kubepods-burstable-pod07b69786_72b8_464b_95b1_4e7ec6dec86e.slice - libcontainer container kubepods-burstable-pod07b69786_72b8_464b_95b1_4e7ec6dec86e.slice. Mar 20 21:34:37.787721 systemd[1]: Created slice kubepods-besteffort-pod3db47d74_c752_4fcf_8454_2fdc29ed4c0f.slice - libcontainer container kubepods-besteffort-pod3db47d74_c752_4fcf_8454_2fdc29ed4c0f.slice. Mar 20 21:34:37.799104 systemd[1]: Created slice kubepods-burstable-pod1d166fb6_cb1b_421b_946b_f26d99e50e62.slice - libcontainer container kubepods-burstable-pod1d166fb6_cb1b_421b_946b_f26d99e50e62.slice. Mar 20 21:34:37.802623 systemd[1]: Created slice kubepods-besteffort-pod48a21f5e_3708_4b74_ac02_926c8d0690e2.slice - libcontainer container kubepods-besteffort-pod48a21f5e_3708_4b74_ac02_926c8d0690e2.slice. 
Mar 20 21:34:37.833649 kubelet[2829]: I0320 21:34:37.833608 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr9nx\" (UniqueName: \"kubernetes.io/projected/1d166fb6-cb1b-421b-946b-f26d99e50e62-kube-api-access-mr9nx\") pod \"coredns-6f6b679f8f-bfxj9\" (UID: \"1d166fb6-cb1b-421b-946b-f26d99e50e62\") " pod="kube-system/coredns-6f6b679f8f-bfxj9" Mar 20 21:34:37.833757 kubelet[2829]: I0320 21:34:37.833663 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/48a21f5e-3708-4b74-ac02-926c8d0690e2-calico-apiserver-certs\") pod \"calico-apiserver-7fbc6f44f-78sgc\" (UID: \"48a21f5e-3708-4b74-ac02-926c8d0690e2\") " pod="calico-apiserver/calico-apiserver-7fbc6f44f-78sgc" Mar 20 21:34:37.833757 kubelet[2829]: I0320 21:34:37.833683 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3db47d74-c752-4fcf-8454-2fdc29ed4c0f-tigera-ca-bundle\") pod \"calico-kube-controllers-57c7f44ccb-nf8xh\" (UID: \"3db47d74-c752-4fcf-8454-2fdc29ed4c0f\") " pod="calico-system/calico-kube-controllers-57c7f44ccb-nf8xh" Mar 20 21:34:37.833757 kubelet[2829]: I0320 21:34:37.833703 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07b69786-72b8-464b-95b1-4e7ec6dec86e-config-volume\") pod \"coredns-6f6b679f8f-pf9dt\" (UID: \"07b69786-72b8-464b-95b1-4e7ec6dec86e\") " pod="kube-system/coredns-6f6b679f8f-pf9dt" Mar 20 21:34:37.833757 kubelet[2829]: I0320 21:34:37.833716 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwlx2\" (UniqueName: \"kubernetes.io/projected/1af09251-b417-4ce1-a535-368f260e296f-kube-api-access-pwlx2\") pod 
\"calico-apiserver-7fbc6f44f-9h4bk\" (UID: \"1af09251-b417-4ce1-a535-368f260e296f\") " pod="calico-apiserver/calico-apiserver-7fbc6f44f-9h4bk" Mar 20 21:34:37.833757 kubelet[2829]: I0320 21:34:37.833727 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbnnj\" (UniqueName: \"kubernetes.io/projected/07b69786-72b8-464b-95b1-4e7ec6dec86e-kube-api-access-bbnnj\") pod \"coredns-6f6b679f8f-pf9dt\" (UID: \"07b69786-72b8-464b-95b1-4e7ec6dec86e\") " pod="kube-system/coredns-6f6b679f8f-pf9dt" Mar 20 21:34:37.837221 kubelet[2829]: I0320 21:34:37.833735 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1af09251-b417-4ce1-a535-368f260e296f-calico-apiserver-certs\") pod \"calico-apiserver-7fbc6f44f-9h4bk\" (UID: \"1af09251-b417-4ce1-a535-368f260e296f\") " pod="calico-apiserver/calico-apiserver-7fbc6f44f-9h4bk" Mar 20 21:34:37.837221 kubelet[2829]: I0320 21:34:37.833743 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d166fb6-cb1b-421b-946b-f26d99e50e62-config-volume\") pod \"coredns-6f6b679f8f-bfxj9\" (UID: \"1d166fb6-cb1b-421b-946b-f26d99e50e62\") " pod="kube-system/coredns-6f6b679f8f-bfxj9" Mar 20 21:34:37.837221 kubelet[2829]: I0320 21:34:37.833754 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4h9f\" (UniqueName: \"kubernetes.io/projected/3db47d74-c752-4fcf-8454-2fdc29ed4c0f-kube-api-access-p4h9f\") pod \"calico-kube-controllers-57c7f44ccb-nf8xh\" (UID: \"3db47d74-c752-4fcf-8454-2fdc29ed4c0f\") " pod="calico-system/calico-kube-controllers-57c7f44ccb-nf8xh" Mar 20 21:34:37.837221 kubelet[2829]: I0320 21:34:37.833765 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xwqs7\" (UniqueName: \"kubernetes.io/projected/48a21f5e-3708-4b74-ac02-926c8d0690e2-kube-api-access-xwqs7\") pod \"calico-apiserver-7fbc6f44f-78sgc\" (UID: \"48a21f5e-3708-4b74-ac02-926c8d0690e2\") " pod="calico-apiserver/calico-apiserver-7fbc6f44f-78sgc" Mar 20 21:34:38.081174 containerd[1556]: time="2025-03-20T21:34:38.080576354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fbc6f44f-9h4bk,Uid:1af09251-b417-4ce1-a535-368f260e296f,Namespace:calico-apiserver,Attempt:0,}" Mar 20 21:34:38.092545 containerd[1556]: time="2025-03-20T21:34:38.092011680Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-pf9dt,Uid:07b69786-72b8-464b-95b1-4e7ec6dec86e,Namespace:kube-system,Attempt:0,}" Mar 20 21:34:38.113331 containerd[1556]: time="2025-03-20T21:34:38.113309142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57c7f44ccb-nf8xh,Uid:3db47d74-c752-4fcf-8454-2fdc29ed4c0f,Namespace:calico-system,Attempt:0,}" Mar 20 21:34:38.113722 containerd[1556]: time="2025-03-20T21:34:38.113703234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-bfxj9,Uid:1d166fb6-cb1b-421b-946b-f26d99e50e62,Namespace:kube-system,Attempt:0,}" Mar 20 21:34:38.114024 containerd[1556]: time="2025-03-20T21:34:38.113797243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fbc6f44f-78sgc,Uid:48a21f5e-3708-4b74-ac02-926c8d0690e2,Namespace:calico-apiserver,Attempt:0,}" Mar 20 21:34:38.176581 containerd[1556]: time="2025-03-20T21:34:38.176358796Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 20 21:34:38.386313 containerd[1556]: time="2025-03-20T21:34:38.386254551Z" level=error msg="Failed to destroy network for sandbox \"1a02b11be9e434eabb8ae176f2b0dc9f01a42e49e26d77894c1a6efa2b166f04\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.391350 containerd[1556]: time="2025-03-20T21:34:38.391243117Z" level=error msg="Failed to destroy network for sandbox \"d1bcebb93f330cefa811a2c7f2850647b52fc8b3c54892867849c06c10d98519\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.392107 containerd[1556]: time="2025-03-20T21:34:38.392090805Z" level=error msg="Failed to destroy network for sandbox \"b992c61ca1afc2525824f41188a7b0ede65e91d4d9f94082f60e271a17546ba1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.392869 containerd[1556]: time="2025-03-20T21:34:38.392789189Z" level=error msg="Failed to destroy network for sandbox \"8cf75ed8081682a51ddd419bfdce1dfed1a457db6d892417cb5869c1e51181c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.392930 containerd[1556]: time="2025-03-20T21:34:38.392790233Z" level=error msg="Failed to destroy network for sandbox \"af43aea7d3e8e5af9da173b7210ad76487ebd94de160a0506ee89f27cf6fb873\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.405675 containerd[1556]: time="2025-03-20T21:34:38.396141976Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fbc6f44f-78sgc,Uid:48a21f5e-3708-4b74-ac02-926c8d0690e2,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"1a02b11be9e434eabb8ae176f2b0dc9f01a42e49e26d77894c1a6efa2b166f04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.405864 containerd[1556]: time="2025-03-20T21:34:38.397217640Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57c7f44ccb-nf8xh,Uid:3db47d74-c752-4fcf-8454-2fdc29ed4c0f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1bcebb93f330cefa811a2c7f2850647b52fc8b3c54892867849c06c10d98519\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.411441 containerd[1556]: time="2025-03-20T21:34:38.397413849Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-bfxj9,Uid:1d166fb6-cb1b-421b-946b-f26d99e50e62,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b992c61ca1afc2525824f41188a7b0ede65e91d4d9f94082f60e271a17546ba1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.411441 containerd[1556]: time="2025-03-20T21:34:38.398368068Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fbc6f44f-9h4bk,Uid:1af09251-b417-4ce1-a535-368f260e296f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cf75ed8081682a51ddd419bfdce1dfed1a457db6d892417cb5869c1e51181c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 20 21:34:38.411441 containerd[1556]: time="2025-03-20T21:34:38.398563871Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-pf9dt,Uid:07b69786-72b8-464b-95b1-4e7ec6dec86e,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"af43aea7d3e8e5af9da173b7210ad76487ebd94de160a0506ee89f27cf6fb873\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.417350 kubelet[2829]: E0320 21:34:38.411871 2829 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1bcebb93f330cefa811a2c7f2850647b52fc8b3c54892867849c06c10d98519\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.417705 kubelet[2829]: E0320 21:34:38.411939 2829 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af43aea7d3e8e5af9da173b7210ad76487ebd94de160a0506ee89f27cf6fb873\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.417751 kubelet[2829]: E0320 21:34:38.417726 2829 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b992c61ca1afc2525824f41188a7b0ede65e91d4d9f94082f60e271a17546ba1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.417751 kubelet[2829]: E0320 21:34:38.417743 2829 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cf75ed8081682a51ddd419bfdce1dfed1a457db6d892417cb5869c1e51181c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.417808 kubelet[2829]: E0320 21:34:38.417765 2829 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a02b11be9e434eabb8ae176f2b0dc9f01a42e49e26d77894c1a6efa2b166f04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.438453 kubelet[2829]: E0320 21:34:38.434483 2829 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b992c61ca1afc2525824f41188a7b0ede65e91d4d9f94082f60e271a17546ba1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-bfxj9" Mar 20 21:34:38.438453 kubelet[2829]: E0320 21:34:38.438203 2829 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b992c61ca1afc2525824f41188a7b0ede65e91d4d9f94082f60e271a17546ba1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-bfxj9" Mar 20 21:34:38.438453 kubelet[2829]: E0320 21:34:38.438250 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-6f6b679f8f-bfxj9_kube-system(1d166fb6-cb1b-421b-946b-f26d99e50e62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-bfxj9_kube-system(1d166fb6-cb1b-421b-946b-f26d99e50e62)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b992c61ca1afc2525824f41188a7b0ede65e91d4d9f94082f60e271a17546ba1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-bfxj9" podUID="1d166fb6-cb1b-421b-946b-f26d99e50e62" Mar 20 21:34:38.438626 kubelet[2829]: E0320 21:34:38.438341 2829 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a02b11be9e434eabb8ae176f2b0dc9f01a42e49e26d77894c1a6efa2b166f04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fbc6f44f-78sgc" Mar 20 21:34:38.438626 kubelet[2829]: E0320 21:34:38.438354 2829 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a02b11be9e434eabb8ae176f2b0dc9f01a42e49e26d77894c1a6efa2b166f04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fbc6f44f-78sgc" Mar 20 21:34:38.438626 kubelet[2829]: E0320 21:34:38.438367 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7fbc6f44f-78sgc_calico-apiserver(48a21f5e-3708-4b74-ac02-926c8d0690e2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-7fbc6f44f-78sgc_calico-apiserver(48a21f5e-3708-4b74-ac02-926c8d0690e2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1a02b11be9e434eabb8ae176f2b0dc9f01a42e49e26d77894c1a6efa2b166f04\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7fbc6f44f-78sgc" podUID="48a21f5e-3708-4b74-ac02-926c8d0690e2" Mar 20 21:34:38.438716 kubelet[2829]: E0320 21:34:38.434441 2829 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1bcebb93f330cefa811a2c7f2850647b52fc8b3c54892867849c06c10d98519\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-57c7f44ccb-nf8xh" Mar 20 21:34:38.438716 kubelet[2829]: E0320 21:34:38.438630 2829 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1bcebb93f330cefa811a2c7f2850647b52fc8b3c54892867849c06c10d98519\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-57c7f44ccb-nf8xh" Mar 20 21:34:38.438716 kubelet[2829]: E0320 21:34:38.438652 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-57c7f44ccb-nf8xh_calico-system(3db47d74-c752-4fcf-8454-2fdc29ed4c0f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-57c7f44ccb-nf8xh_calico-system(3db47d74-c752-4fcf-8454-2fdc29ed4c0f)\\\": rpc error: code = Unknown desc = 
failed to setup network for sandbox \\\"d1bcebb93f330cefa811a2c7f2850647b52fc8b3c54892867849c06c10d98519\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-57c7f44ccb-nf8xh" podUID="3db47d74-c752-4fcf-8454-2fdc29ed4c0f" Mar 20 21:34:38.438814 kubelet[2829]: E0320 21:34:38.438673 2829 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af43aea7d3e8e5af9da173b7210ad76487ebd94de160a0506ee89f27cf6fb873\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-pf9dt" Mar 20 21:34:38.438814 kubelet[2829]: E0320 21:34:38.438689 2829 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af43aea7d3e8e5af9da173b7210ad76487ebd94de160a0506ee89f27cf6fb873\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-pf9dt" Mar 20 21:34:38.438814 kubelet[2829]: E0320 21:34:38.438703 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-pf9dt_kube-system(07b69786-72b8-464b-95b1-4e7ec6dec86e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-pf9dt_kube-system(07b69786-72b8-464b-95b1-4e7ec6dec86e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"af43aea7d3e8e5af9da173b7210ad76487ebd94de160a0506ee89f27cf6fb873\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-pf9dt" podUID="07b69786-72b8-464b-95b1-4e7ec6dec86e" Mar 20 21:34:38.438894 kubelet[2829]: E0320 21:34:38.438718 2829 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cf75ed8081682a51ddd419bfdce1dfed1a457db6d892417cb5869c1e51181c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fbc6f44f-9h4bk" Mar 20 21:34:38.438894 kubelet[2829]: E0320 21:34:38.438726 2829 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cf75ed8081682a51ddd419bfdce1dfed1a457db6d892417cb5869c1e51181c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fbc6f44f-9h4bk" Mar 20 21:34:38.438894 kubelet[2829]: E0320 21:34:38.438737 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7fbc6f44f-9h4bk_calico-apiserver(1af09251-b417-4ce1-a535-368f260e296f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7fbc6f44f-9h4bk_calico-apiserver(1af09251-b417-4ce1-a535-368f260e296f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8cf75ed8081682a51ddd419bfdce1dfed1a457db6d892417cb5869c1e51181c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7fbc6f44f-9h4bk" 
podUID="1af09251-b417-4ce1-a535-368f260e296f" Mar 20 21:34:38.649749 systemd[1]: run-netns-cni\x2d93a4b61f\x2d336c\x2d9d95\x2ddcc0\x2d6507bd0ea088.mount: Deactivated successfully. Mar 20 21:34:38.649901 systemd[1]: run-netns-cni\x2d246af411\x2db042\x2db18c\x2da3d7\x2db9ea13b81c1a.mount: Deactivated successfully. Mar 20 21:34:38.927345 systemd[1]: Created slice kubepods-besteffort-pod0bea6db8_8efa_4d44_ac84_6c553760f208.slice - libcontainer container kubepods-besteffort-pod0bea6db8_8efa_4d44_ac84_6c553760f208.slice. Mar 20 21:34:38.928916 containerd[1556]: time="2025-03-20T21:34:38.928886861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7zx9v,Uid:0bea6db8-8efa-4d44-ac84-6c553760f208,Namespace:calico-system,Attempt:0,}" Mar 20 21:34:38.982831 containerd[1556]: time="2025-03-20T21:34:38.982781269Z" level=error msg="Failed to destroy network for sandbox \"cca1784c5dbea68d798525373ca5bbe205469650658dd69faa8e3e420ff21e4d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.984117 systemd[1]: run-netns-cni\x2d9072da5b\x2d0ef1\x2d08a8\x2d3005\x2d041a5f092d21.mount: Deactivated successfully. 
Mar 20 21:34:38.986063 containerd[1556]: time="2025-03-20T21:34:38.984509325Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7zx9v,Uid:0bea6db8-8efa-4d44-ac84-6c553760f208,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cca1784c5dbea68d798525373ca5bbe205469650658dd69faa8e3e420ff21e4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.986145 kubelet[2829]: E0320 21:34:38.984666 2829 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cca1784c5dbea68d798525373ca5bbe205469650658dd69faa8e3e420ff21e4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 21:34:38.986145 kubelet[2829]: E0320 21:34:38.984708 2829 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cca1784c5dbea68d798525373ca5bbe205469650658dd69faa8e3e420ff21e4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7zx9v" Mar 20 21:34:38.986145 kubelet[2829]: E0320 21:34:38.984726 2829 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cca1784c5dbea68d798525373ca5bbe205469650658dd69faa8e3e420ff21e4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7zx9v" 
Mar 20 21:34:38.986260 kubelet[2829]: E0320 21:34:38.984751 2829 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7zx9v_calico-system(0bea6db8-8efa-4d44-ac84-6c553760f208)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7zx9v_calico-system(0bea6db8-8efa-4d44-ac84-6c553760f208)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cca1784c5dbea68d798525373ca5bbe205469650658dd69faa8e3e420ff21e4d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7zx9v" podUID="0bea6db8-8efa-4d44-ac84-6c553760f208" Mar 20 21:34:42.518842 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4279733004.mount: Deactivated successfully. Mar 20 21:34:42.682695 containerd[1556]: time="2025-03-20T21:34:42.682216058Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:42.683580 containerd[1556]: time="2025-03-20T21:34:42.683495441Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 20 21:34:42.685707 containerd[1556]: time="2025-03-20T21:34:42.685676154Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 4.508132946s" Mar 20 21:34:42.687587 containerd[1556]: time="2025-03-20T21:34:42.687011309Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:42.687587 containerd[1556]: time="2025-03-20T21:34:42.687318265Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:42.689694 containerd[1556]: time="2025-03-20T21:34:42.689653538Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 20 21:34:42.700442 containerd[1556]: time="2025-03-20T21:34:42.700412159Z" level=info msg="CreateContainer within sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 20 21:34:42.731966 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1629892340.mount: Deactivated successfully. Mar 20 21:34:42.732202 containerd[1556]: time="2025-03-20T21:34:42.732038645Z" level=info msg="Container c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:42.808857 containerd[1556]: time="2025-03-20T21:34:42.808313417Z" level=info msg="CreateContainer within sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\"" Mar 20 21:34:42.849476 containerd[1556]: time="2025-03-20T21:34:42.849443131Z" level=info msg="StartContainer for \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\"" Mar 20 21:34:42.857765 containerd[1556]: time="2025-03-20T21:34:42.857730992Z" level=info msg="connecting to shim c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d" address="unix:///run/containerd/s/e42d18309c0a4c8a8a389851ce272df50af64fee6a089727af344d009b8850fd" 
protocol=ttrpc version=3 Mar 20 21:34:42.936248 systemd[1]: Started cri-containerd-c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d.scope - libcontainer container c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d. Mar 20 21:34:42.977624 containerd[1556]: time="2025-03-20T21:34:42.977599403Z" level=info msg="StartContainer for \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" returns successfully" Mar 20 21:34:43.129061 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 20 21:34:43.134198 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 20 21:34:44.840076 kernel: bpftool[3936]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 20 21:34:45.025855 systemd-networkd[1363]: vxlan.calico: Link UP Mar 20 21:34:45.025860 systemd-networkd[1363]: vxlan.calico: Gained carrier Mar 20 21:34:46.655215 systemd-networkd[1363]: vxlan.calico: Gained IPv6LL Mar 20 21:34:50.925645 containerd[1556]: time="2025-03-20T21:34:50.925488928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fbc6f44f-9h4bk,Uid:1af09251-b417-4ce1-a535-368f260e296f,Namespace:calico-apiserver,Attempt:0,}" Mar 20 21:34:50.926229 containerd[1556]: time="2025-03-20T21:34:50.925863388Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-pf9dt,Uid:07b69786-72b8-464b-95b1-4e7ec6dec86e,Namespace:kube-system,Attempt:0,}" Mar 20 21:34:50.926229 containerd[1556]: time="2025-03-20T21:34:50.926154480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57c7f44ccb-nf8xh,Uid:3db47d74-c752-4fcf-8454-2fdc29ed4c0f,Namespace:calico-system,Attempt:0,}" Mar 20 21:34:51.557662 systemd-networkd[1363]: cali7e2e7309af0: Link UP Mar 20 21:34:51.558395 systemd-networkd[1363]: cali7e2e7309af0: Gained carrier Mar 20 21:34:51.582791 containerd[1556]: 2025-03-20 21:34:51.092 [INFO][4034] cni-plugin/plugin.go 340: 
Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0 calico-apiserver-7fbc6f44f- calico-apiserver 1af09251-b417-4ce1-a535-368f260e296f 710 0 2025-03-20 21:34:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7fbc6f44f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7fbc6f44f-9h4bk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7e2e7309af0 [] []}} ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-9h4bk" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-" Mar 20 21:34:51.582791 containerd[1556]: 2025-03-20 21:34:51.101 [INFO][4034] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-9h4bk" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" Mar 20 21:34:51.582791 containerd[1556]: 2025-03-20 21:34:51.470 [INFO][4068] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" HandleID="k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Workload="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.487 [INFO][4068] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" HandleID="k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Workload="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050b70), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7fbc6f44f-9h4bk", "timestamp":"2025-03-20 21:34:51.470711502 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.487 [INFO][4068] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.487 [INFO][4068] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.487 [INFO][4068] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.490 [INFO][4068] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" host="localhost" Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.497 [INFO][4068] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.503 [INFO][4068] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.515 [INFO][4068] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.527 [INFO][4068] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:51.582974 containerd[1556]: 2025-03-20 21:34:51.527 [INFO][4068] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" host="localhost" Mar 20 21:34:51.587253 containerd[1556]: 2025-03-20 21:34:51.528 [INFO][4068] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd Mar 20 21:34:51.587253 containerd[1556]: 2025-03-20 21:34:51.536 [INFO][4068] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" host="localhost" Mar 20 21:34:51.587253 containerd[1556]: 2025-03-20 21:34:51.547 [INFO][4068] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" host="localhost" Mar 20 21:34:51.587253 containerd[1556]: 2025-03-20 21:34:51.548 [INFO][4068] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" host="localhost" Mar 20 21:34:51.587253 containerd[1556]: 2025-03-20 21:34:51.548 [INFO][4068] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 21:34:51.587253 containerd[1556]: 2025-03-20 21:34:51.548 [INFO][4068] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" HandleID="k8s-pod-network.e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Workload="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" Mar 20 21:34:51.594609 containerd[1556]: 2025-03-20 21:34:51.551 [INFO][4034] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-9h4bk" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0", GenerateName:"calico-apiserver-7fbc6f44f-", Namespace:"calico-apiserver", SelfLink:"", UID:"1af09251-b417-4ce1-a535-368f260e296f", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fbc6f44f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7fbc6f44f-9h4bk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7e2e7309af0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:51.594685 containerd[1556]: 2025-03-20 21:34:51.551 [INFO][4034] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-9h4bk" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" Mar 20 21:34:51.594685 containerd[1556]: 2025-03-20 21:34:51.551 [INFO][4034] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7e2e7309af0 ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-9h4bk" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" Mar 20 21:34:51.594685 containerd[1556]: 2025-03-20 21:34:51.559 [INFO][4034] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-9h4bk" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" Mar 20 21:34:51.594744 containerd[1556]: 2025-03-20 21:34:51.559 [INFO][4034] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-9h4bk" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0", GenerateName:"calico-apiserver-7fbc6f44f-", Namespace:"calico-apiserver", SelfLink:"", 
UID:"1af09251-b417-4ce1-a535-368f260e296f", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fbc6f44f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd", Pod:"calico-apiserver-7fbc6f44f-9h4bk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7e2e7309af0", MAC:"2e:46:a3:1d:a7:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:51.594786 containerd[1556]: 2025-03-20 21:34:51.576 [INFO][4034] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-9h4bk" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--9h4bk-eth0" Mar 20 21:34:51.603454 kubelet[2829]: I0320 21:34:51.593391 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hqg5z" podStartSLOduration=10.463527707 podStartE2EDuration="26.575938935s" podCreationTimestamp="2025-03-20 21:34:25 +0000 UTC" firstStartedPulling="2025-03-20 21:34:26.578757513 +0000 UTC m=+13.744389377" 
lastFinishedPulling="2025-03-20 21:34:42.691168745 +0000 UTC m=+29.856800605" observedRunningTime="2025-03-20 21:34:43.239361763 +0000 UTC m=+30.404993631" watchObservedRunningTime="2025-03-20 21:34:51.575938935 +0000 UTC m=+38.741570799" Mar 20 21:34:51.631533 systemd-networkd[1363]: calid04d9a3478f: Link UP Mar 20 21:34:51.631991 systemd-networkd[1363]: calid04d9a3478f: Gained carrier Mar 20 21:34:51.660262 containerd[1556]: 2025-03-20 21:34:51.085 [INFO][4029] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0 coredns-6f6b679f8f- kube-system 07b69786-72b8-464b-95b1-4e7ec6dec86e 709 0 2025-03-20 21:34:17 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-pf9dt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid04d9a3478f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Namespace="kube-system" Pod="coredns-6f6b679f8f-pf9dt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--pf9dt-" Mar 20 21:34:51.660262 containerd[1556]: 2025-03-20 21:34:51.101 [INFO][4029] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Namespace="kube-system" Pod="coredns-6f6b679f8f-pf9dt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" Mar 20 21:34:51.660262 containerd[1556]: 2025-03-20 21:34:51.470 [INFO][4070] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" HandleID="k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Workload="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" Mar 
20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.487 [INFO][4070] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" HandleID="k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Workload="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e1080), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-pf9dt", "timestamp":"2025-03-20 21:34:51.47070254 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.487 [INFO][4070] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.549 [INFO][4070] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.549 [INFO][4070] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.600 [INFO][4070] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" host="localhost" Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.607 [INFO][4070] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.610 [INFO][4070] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.611 [INFO][4070] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.613 [INFO][4070] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:51.660426 containerd[1556]: 2025-03-20 21:34:51.613 [INFO][4070] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" host="localhost" Mar 20 21:34:51.668916 containerd[1556]: 2025-03-20 21:34:51.614 [INFO][4070] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488 Mar 20 21:34:51.668916 containerd[1556]: 2025-03-20 21:34:51.618 [INFO][4070] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" host="localhost" Mar 20 21:34:51.668916 containerd[1556]: 2025-03-20 21:34:51.625 [INFO][4070] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" host="localhost" Mar 20 21:34:51.668916 containerd[1556]: 2025-03-20 21:34:51.625 [INFO][4070] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" host="localhost" Mar 20 21:34:51.668916 containerd[1556]: 2025-03-20 21:34:51.625 [INFO][4070] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 21:34:51.668916 containerd[1556]: 2025-03-20 21:34:51.625 [INFO][4070] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" HandleID="k8s-pod-network.d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Workload="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" Mar 20 21:34:51.683319 containerd[1556]: 2025-03-20 21:34:51.629 [INFO][4029] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Namespace="kube-system" Pod="coredns-6f6b679f8f-pf9dt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"07b69786-72b8-464b-95b1-4e7ec6dec86e", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-pf9dt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid04d9a3478f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:51.683403 containerd[1556]: 2025-03-20 21:34:51.629 [INFO][4029] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Namespace="kube-system" Pod="coredns-6f6b679f8f-pf9dt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" Mar 20 21:34:51.683403 containerd[1556]: 2025-03-20 21:34:51.629 [INFO][4029] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid04d9a3478f ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Namespace="kube-system" Pod="coredns-6f6b679f8f-pf9dt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" Mar 20 21:34:51.683403 containerd[1556]: 2025-03-20 21:34:51.632 [INFO][4029] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Namespace="kube-system" Pod="coredns-6f6b679f8f-pf9dt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" Mar 20 
21:34:51.683474 containerd[1556]: 2025-03-20 21:34:51.632 [INFO][4029] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Namespace="kube-system" Pod="coredns-6f6b679f8f-pf9dt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"07b69786-72b8-464b-95b1-4e7ec6dec86e", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488", Pod:"coredns-6f6b679f8f-pf9dt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid04d9a3478f", MAC:"12:a4:e4:55:19:f3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:51.683474 containerd[1556]: 2025-03-20 21:34:51.656 [INFO][4029] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" Namespace="kube-system" Pod="coredns-6f6b679f8f-pf9dt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--pf9dt-eth0" Mar 20 21:34:51.719871 containerd[1556]: time="2025-03-20T21:34:51.719813883Z" level=info msg="connecting to shim e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd" address="unix:///run/containerd/s/2e7e5c4a7ad55eec3b46330b530a73d909cd164633c2db078e4c65060666a259" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:51.761189 systemd[1]: Started cri-containerd-e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd.scope - libcontainer container e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd. 
Mar 20 21:34:51.764129 systemd-networkd[1363]: cali75795a60728: Link UP Mar 20 21:34:51.765843 systemd-networkd[1363]: cali75795a60728: Gained carrier Mar 20 21:34:51.772026 containerd[1556]: time="2025-03-20T21:34:51.772000539Z" level=info msg="connecting to shim d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488" address="unix:///run/containerd/s/57192417b2f5ddac25d0d20133b7a1d7aeca47de357101ec38c0aaf7fb41a02b" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.092 [INFO][4049] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0 calico-kube-controllers-57c7f44ccb- calico-system 3db47d74-c752-4fcf-8454-2fdc29ed4c0f 712 0 2025-03-20 21:34:25 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:57c7f44ccb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-57c7f44ccb-nf8xh eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali75795a60728 [] []}} ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Namespace="calico-system" Pod="calico-kube-controllers-57c7f44ccb-nf8xh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.101 [INFO][4049] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Namespace="calico-system" Pod="calico-kube-controllers-57c7f44ccb-nf8xh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.470 [INFO][4066] ipam/ipam_plugin.go 225: Calico CNI 
IPAM request count IPv4=1 IPv6=0 ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.487 [INFO][4066] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000102440), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-57c7f44ccb-nf8xh", "timestamp":"2025-03-20 21:34:51.470777147 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.487 [INFO][4066] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.625 [INFO][4066] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.626 [INFO][4066] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.699 [INFO][4066] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.715 [INFO][4066] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.719 [INFO][4066] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.721 [INFO][4066] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.723 [INFO][4066] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.723 [INFO][4066] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.725 [INFO][4066] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840 Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.743 [INFO][4066] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.754 [INFO][4066] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.754 [INFO][4066] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" host="localhost" Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.754 [INFO][4066] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 21:34:51.789136 containerd[1556]: 2025-03-20 21:34:51.755 [INFO][4066] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:51.792963 containerd[1556]: 2025-03-20 21:34:51.759 [INFO][4049] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Namespace="calico-system" Pod="calico-kube-controllers-57c7f44ccb-nf8xh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0", GenerateName:"calico-kube-controllers-57c7f44ccb-", Namespace:"calico-system", SelfLink:"", UID:"3db47d74-c752-4fcf-8454-2fdc29ed4c0f", ResourceVersion:"712", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"57c7f44ccb", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-57c7f44ccb-nf8xh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali75795a60728", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:51.792963 containerd[1556]: 2025-03-20 21:34:51.759 [INFO][4049] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Namespace="calico-system" Pod="calico-kube-controllers-57c7f44ccb-nf8xh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:51.792963 containerd[1556]: 2025-03-20 21:34:51.759 [INFO][4049] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali75795a60728 ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Namespace="calico-system" Pod="calico-kube-controllers-57c7f44ccb-nf8xh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:51.792963 containerd[1556]: 2025-03-20 21:34:51.766 [INFO][4049] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Namespace="calico-system" Pod="calico-kube-controllers-57c7f44ccb-nf8xh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:51.792963 containerd[1556]: 2025-03-20 21:34:51.766 [INFO][4049] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Namespace="calico-system" Pod="calico-kube-controllers-57c7f44ccb-nf8xh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0", GenerateName:"calico-kube-controllers-57c7f44ccb-", Namespace:"calico-system", SelfLink:"", UID:"3db47d74-c752-4fcf-8454-2fdc29ed4c0f", ResourceVersion:"712", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"57c7f44ccb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840", Pod:"calico-kube-controllers-57c7f44ccb-nf8xh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali75795a60728", MAC:"86:26:e3:2c:10:ee", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:51.792963 containerd[1556]: 2025-03-20 21:34:51.784 [INFO][4049] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Namespace="calico-system" Pod="calico-kube-controllers-57c7f44ccb-nf8xh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:51.797088 systemd-resolved[1472]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 21:34:51.808154 systemd[1]: Started cri-containerd-d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488.scope - libcontainer container d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488. Mar 20 21:34:51.818658 systemd-resolved[1472]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 21:34:51.873976 containerd[1556]: time="2025-03-20T21:34:51.873870035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fbc6f44f-9h4bk,Uid:1af09251-b417-4ce1-a535-368f260e296f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd\"" Mar 20 21:34:51.884386 containerd[1556]: time="2025-03-20T21:34:51.884211126Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 20 21:34:51.886277 containerd[1556]: time="2025-03-20T21:34:51.886171324Z" level=info msg="connecting to shim 4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" address="unix:///run/containerd/s/774ad575e29f821bcdbaad8bcc12c18b68fe61ddf6234688e7167ee208c31632" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:51.887037 containerd[1556]: time="2025-03-20T21:34:51.886795324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-pf9dt,Uid:07b69786-72b8-464b-95b1-4e7ec6dec86e,Namespace:kube-system,Attempt:0,} returns sandbox id \"d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488\"" Mar 20 21:34:51.892153 containerd[1556]: time="2025-03-20T21:34:51.891546174Z" level=info 
msg="CreateContainer within sandbox \"d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 20 21:34:51.906867 containerd[1556]: time="2025-03-20T21:34:51.906664304Z" level=info msg="Container e76c31cb12cbdbfb4822433d05f3074c600eceb15b6fff9e4bef3c2e9c7d8088: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:51.914229 systemd[1]: Started cri-containerd-4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840.scope - libcontainer container 4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840. Mar 20 21:34:51.920465 containerd[1556]: time="2025-03-20T21:34:51.920359943Z" level=info msg="CreateContainer within sandbox \"d2f539b5dbe86eaaef5dfc8d3ced78738971d25da16fd6e431bf55f2f105e488\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e76c31cb12cbdbfb4822433d05f3074c600eceb15b6fff9e4bef3c2e9c7d8088\"" Mar 20 21:34:51.926071 containerd[1556]: time="2025-03-20T21:34:51.925920999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7zx9v,Uid:0bea6db8-8efa-4d44-ac84-6c553760f208,Namespace:calico-system,Attempt:0,}" Mar 20 21:34:51.926736 systemd-resolved[1472]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 21:34:51.928108 containerd[1556]: time="2025-03-20T21:34:51.928084272Z" level=info msg="StartContainer for \"e76c31cb12cbdbfb4822433d05f3074c600eceb15b6fff9e4bef3c2e9c7d8088\"" Mar 20 21:34:51.928601 containerd[1556]: time="2025-03-20T21:34:51.928578032Z" level=info msg="connecting to shim e76c31cb12cbdbfb4822433d05f3074c600eceb15b6fff9e4bef3c2e9c7d8088" address="unix:///run/containerd/s/57192417b2f5ddac25d0d20133b7a1d7aeca47de357101ec38c0aaf7fb41a02b" protocol=ttrpc version=3 Mar 20 21:34:51.962229 systemd[1]: Started cri-containerd-e76c31cb12cbdbfb4822433d05f3074c600eceb15b6fff9e4bef3c2e9c7d8088.scope - libcontainer container 
e76c31cb12cbdbfb4822433d05f3074c600eceb15b6fff9e4bef3c2e9c7d8088. Mar 20 21:34:51.984178 containerd[1556]: time="2025-03-20T21:34:51.984148583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57c7f44ccb-nf8xh,Uid:3db47d74-c752-4fcf-8454-2fdc29ed4c0f,Namespace:calico-system,Attempt:0,} returns sandbox id \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\"" Mar 20 21:34:52.008527 containerd[1556]: time="2025-03-20T21:34:52.008505350Z" level=info msg="StartContainer for \"e76c31cb12cbdbfb4822433d05f3074c600eceb15b6fff9e4bef3c2e9c7d8088\" returns successfully" Mar 20 21:34:52.055019 systemd-networkd[1363]: calidec7a3f597d: Link UP Mar 20 21:34:52.055834 systemd-networkd[1363]: calidec7a3f597d: Gained carrier Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:51.996 [INFO][4261] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--7zx9v-eth0 csi-node-driver- calico-system 0bea6db8-8efa-4d44-ac84-6c553760f208 581 0 2025-03-20 21:34:25 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:568c96974f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-7zx9v eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calidec7a3f597d [] []}} ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Namespace="calico-system" Pod="csi-node-driver-7zx9v" WorkloadEndpoint="localhost-k8s-csi--node--driver--7zx9v-" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:51.996 [INFO][4261] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Namespace="calico-system" Pod="csi-node-driver-7zx9v" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--7zx9v-eth0" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.026 [INFO][4298] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" HandleID="k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Workload="localhost-k8s-csi--node--driver--7zx9v-eth0" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.033 [INFO][4298] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" HandleID="k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Workload="localhost-k8s-csi--node--driver--7zx9v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030e4f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-7zx9v", "timestamp":"2025-03-20 21:34:52.026782857 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.033 [INFO][4298] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.033 [INFO][4298] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.033 [INFO][4298] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.035 [INFO][4298] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.038 [INFO][4298] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.041 [INFO][4298] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.042 [INFO][4298] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.043 [INFO][4298] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.043 [INFO][4298] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.044 [INFO][4298] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.047 [INFO][4298] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.051 [INFO][4298] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.051 [INFO][4298] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" host="localhost" Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.051 [INFO][4298] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 21:34:52.072687 containerd[1556]: 2025-03-20 21:34:52.051 [INFO][4298] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" HandleID="k8s-pod-network.7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Workload="localhost-k8s-csi--node--driver--7zx9v-eth0" Mar 20 21:34:52.074797 containerd[1556]: 2025-03-20 21:34:52.053 [INFO][4261] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Namespace="calico-system" Pod="csi-node-driver-7zx9v" WorkloadEndpoint="localhost-k8s-csi--node--driver--7zx9v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7zx9v-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0bea6db8-8efa-4d44-ac84-6c553760f208", ResourceVersion:"581", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-7zx9v", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidec7a3f597d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:52.074797 containerd[1556]: 2025-03-20 21:34:52.053 [INFO][4261] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Namespace="calico-system" Pod="csi-node-driver-7zx9v" WorkloadEndpoint="localhost-k8s-csi--node--driver--7zx9v-eth0" Mar 20 21:34:52.074797 containerd[1556]: 2025-03-20 21:34:52.053 [INFO][4261] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidec7a3f597d ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Namespace="calico-system" Pod="csi-node-driver-7zx9v" WorkloadEndpoint="localhost-k8s-csi--node--driver--7zx9v-eth0" Mar 20 21:34:52.074797 containerd[1556]: 2025-03-20 21:34:52.056 [INFO][4261] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Namespace="calico-system" Pod="csi-node-driver-7zx9v" WorkloadEndpoint="localhost-k8s-csi--node--driver--7zx9v-eth0" Mar 20 21:34:52.074797 containerd[1556]: 2025-03-20 21:34:52.056 [INFO][4261] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Namespace="calico-system" 
Pod="csi-node-driver-7zx9v" WorkloadEndpoint="localhost-k8s-csi--node--driver--7zx9v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7zx9v-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0bea6db8-8efa-4d44-ac84-6c553760f208", ResourceVersion:"581", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f", Pod:"csi-node-driver-7zx9v", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidec7a3f597d", MAC:"92:6a:30:de:fe:b8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:52.074797 containerd[1556]: 2025-03-20 21:34:52.069 [INFO][4261] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" Namespace="calico-system" Pod="csi-node-driver-7zx9v" WorkloadEndpoint="localhost-k8s-csi--node--driver--7zx9v-eth0" Mar 20 21:34:52.125544 containerd[1556]: 
time="2025-03-20T21:34:52.125326638Z" level=info msg="connecting to shim 7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f" address="unix:///run/containerd/s/4637be4c6d1b869a891b7c875d401397de533e0992186a16a279faa6bba0b276" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:52.147223 systemd[1]: Started cri-containerd-7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f.scope - libcontainer container 7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f. Mar 20 21:34:52.157625 systemd-resolved[1472]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 21:34:52.170793 containerd[1556]: time="2025-03-20T21:34:52.170747283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7zx9v,Uid:0bea6db8-8efa-4d44-ac84-6c553760f208,Namespace:calico-system,Attempt:0,} returns sandbox id \"7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f\"" Mar 20 21:34:52.925884 containerd[1556]: time="2025-03-20T21:34:52.925507920Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-bfxj9,Uid:1d166fb6-cb1b-421b-946b-f26d99e50e62,Namespace:kube-system,Attempt:0,}" Mar 20 21:34:53.119219 systemd-networkd[1363]: calidec7a3f597d: Gained IPv6LL Mar 20 21:34:53.223541 systemd-networkd[1363]: calia6c37b1dabb: Link UP Mar 20 21:34:53.224157 systemd-networkd[1363]: calia6c37b1dabb: Gained carrier Mar 20 21:34:53.245450 kubelet[2829]: I0320 21:34:53.244949 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-pf9dt" podStartSLOduration=36.244914818 podStartE2EDuration="36.244914818s" podCreationTimestamp="2025-03-20 21:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:34:52.270940387 +0000 UTC m=+39.436572255" watchObservedRunningTime="2025-03-20 21:34:53.244914818 +0000 UTC 
m=+40.410546678" Mar 20 21:34:53.247168 systemd-networkd[1363]: calid04d9a3478f: Gained IPv6LL Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.153 [INFO][4372] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0 coredns-6f6b679f8f- kube-system 1d166fb6-cb1b-421b-946b-f26d99e50e62 713 0 2025-03-20 21:34:17 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-bfxj9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia6c37b1dabb [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Namespace="kube-system" Pod="coredns-6f6b679f8f-bfxj9" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--bfxj9-" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.153 [INFO][4372] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Namespace="kube-system" Pod="coredns-6f6b679f8f-bfxj9" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.187 [INFO][4386] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" HandleID="k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Workload="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.194 [INFO][4386] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" HandleID="k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" 
Workload="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000333440), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-bfxj9", "timestamp":"2025-03-20 21:34:53.187274099 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.194 [INFO][4386] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.194 [INFO][4386] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.194 [INFO][4386] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.196 [INFO][4386] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.199 [INFO][4386] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.203 [INFO][4386] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.205 [INFO][4386] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.207 [INFO][4386] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.207 [INFO][4386] ipam/ipam.go 1180: Attempting to assign 1 addresses from block 
block=192.168.88.128/26 handle="k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.209 [INFO][4386] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3 Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.212 [INFO][4386] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.218 [INFO][4386] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.218 [INFO][4386] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" host="localhost" Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.218 [INFO][4386] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 21:34:53.250199 containerd[1556]: 2025-03-20 21:34:53.218 [INFO][4386] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" HandleID="k8s-pod-network.fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Workload="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" Mar 20 21:34:53.259755 containerd[1556]: 2025-03-20 21:34:53.220 [INFO][4372] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Namespace="kube-system" Pod="coredns-6f6b679f8f-bfxj9" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"1d166fb6-cb1b-421b-946b-f26d99e50e62", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-bfxj9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia6c37b1dabb", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:53.259755 containerd[1556]: 2025-03-20 21:34:53.220 [INFO][4372] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Namespace="kube-system" Pod="coredns-6f6b679f8f-bfxj9" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" Mar 20 21:34:53.259755 containerd[1556]: 2025-03-20 21:34:53.220 [INFO][4372] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia6c37b1dabb ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Namespace="kube-system" Pod="coredns-6f6b679f8f-bfxj9" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" Mar 20 21:34:53.259755 containerd[1556]: 2025-03-20 21:34:53.223 [INFO][4372] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Namespace="kube-system" Pod="coredns-6f6b679f8f-bfxj9" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" Mar 20 21:34:53.259755 containerd[1556]: 2025-03-20 21:34:53.224 [INFO][4372] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Namespace="kube-system" Pod="coredns-6f6b679f8f-bfxj9" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"1d166fb6-cb1b-421b-946b-f26d99e50e62", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3", Pod:"coredns-6f6b679f8f-bfxj9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia6c37b1dabb", MAC:"56:d7:8b:16:1e:c5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:53.259755 containerd[1556]: 2025-03-20 21:34:53.244 [INFO][4372] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" Namespace="kube-system" 
Pod="coredns-6f6b679f8f-bfxj9" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--bfxj9-eth0" Mar 20 21:34:53.293922 containerd[1556]: time="2025-03-20T21:34:53.293515141Z" level=info msg="connecting to shim fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3" address="unix:///run/containerd/s/95573e9633dd34b69b315d839cd76f7f00091f3361e27df9d07a54bab80c58b2" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:53.330252 systemd[1]: Started cri-containerd-fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3.scope - libcontainer container fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3. Mar 20 21:34:53.342921 systemd-resolved[1472]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 21:34:53.372397 containerd[1556]: time="2025-03-20T21:34:53.372364325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-bfxj9,Uid:1d166fb6-cb1b-421b-946b-f26d99e50e62,Namespace:kube-system,Attempt:0,} returns sandbox id \"fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3\"" Mar 20 21:34:53.375552 systemd-networkd[1363]: cali75795a60728: Gained IPv6LL Mar 20 21:34:53.376006 containerd[1556]: time="2025-03-20T21:34:53.375600947Z" level=info msg="CreateContainer within sandbox \"fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 20 21:34:53.386387 containerd[1556]: time="2025-03-20T21:34:53.386325563Z" level=info msg="Container fc4d60536543d354cc19ac51cc610355bda692e79ee67e3ca09ffe112d14febd: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:53.390873 containerd[1556]: time="2025-03-20T21:34:53.390844992Z" level=info msg="CreateContainer within sandbox \"fbaca025d136718b4aef4fefb9b247c58696ae92a01e3170a8453df5a12947b3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fc4d60536543d354cc19ac51cc610355bda692e79ee67e3ca09ffe112d14febd\"" Mar 20 
21:34:53.392543 containerd[1556]: time="2025-03-20T21:34:53.392504003Z" level=info msg="StartContainer for \"fc4d60536543d354cc19ac51cc610355bda692e79ee67e3ca09ffe112d14febd\"" Mar 20 21:34:53.396387 containerd[1556]: time="2025-03-20T21:34:53.396275779Z" level=info msg="connecting to shim fc4d60536543d354cc19ac51cc610355bda692e79ee67e3ca09ffe112d14febd" address="unix:///run/containerd/s/95573e9633dd34b69b315d839cd76f7f00091f3361e27df9d07a54bab80c58b2" protocol=ttrpc version=3 Mar 20 21:34:53.412216 systemd[1]: Started cri-containerd-fc4d60536543d354cc19ac51cc610355bda692e79ee67e3ca09ffe112d14febd.scope - libcontainer container fc4d60536543d354cc19ac51cc610355bda692e79ee67e3ca09ffe112d14febd. Mar 20 21:34:53.435161 containerd[1556]: time="2025-03-20T21:34:53.434978314Z" level=info msg="StartContainer for \"fc4d60536543d354cc19ac51cc610355bda692e79ee67e3ca09ffe112d14febd\" returns successfully" Mar 20 21:34:53.439217 systemd-networkd[1363]: cali7e2e7309af0: Gained IPv6LL Mar 20 21:34:53.925035 containerd[1556]: time="2025-03-20T21:34:53.924974895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fbc6f44f-78sgc,Uid:48a21f5e-3708-4b74-ac02-926c8d0690e2,Namespace:calico-apiserver,Attempt:0,}" Mar 20 21:34:54.054473 systemd-networkd[1363]: calic0bc0a94c32: Link UP Mar 20 21:34:54.055041 systemd-networkd[1363]: calic0bc0a94c32: Gained carrier Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:53.985 [INFO][4489] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0 calico-apiserver-7fbc6f44f- calico-apiserver 48a21f5e-3708-4b74-ac02-926c8d0690e2 711 0 2025-03-20 21:34:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7fbc6f44f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] 
[]} {k8s localhost calico-apiserver-7fbc6f44f-78sgc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic0bc0a94c32 [] []}} ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-78sgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:53.985 [INFO][4489] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-78sgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.003 [INFO][4501] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" HandleID="k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Workload="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.009 [INFO][4501] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" HandleID="k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Workload="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000304b40), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7fbc6f44f-78sgc", "timestamp":"2025-03-20 21:34:54.003258133 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 21:34:54.078624 containerd[1556]: 
2025-03-20 21:34:54.009 [INFO][4501] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.009 [INFO][4501] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.010 [INFO][4501] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.015 [INFO][4501] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.021 [INFO][4501] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.028 [INFO][4501] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.031 [INFO][4501] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.034 [INFO][4501] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.034 [INFO][4501] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.037 [INFO][4501] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.042 [INFO][4501] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" 
host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.048 [INFO][4501] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.048 [INFO][4501] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" host="localhost" Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.048 [INFO][4501] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 21:34:54.078624 containerd[1556]: 2025-03-20 21:34:54.048 [INFO][4501] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" HandleID="k8s-pod-network.020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Workload="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" Mar 20 21:34:54.085424 containerd[1556]: 2025-03-20 21:34:54.051 [INFO][4489] cni-plugin/k8s.go 386: Populated endpoint ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-78sgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0", GenerateName:"calico-apiserver-7fbc6f44f-", Namespace:"calico-apiserver", SelfLink:"", UID:"48a21f5e-3708-4b74-ac02-926c8d0690e2", ResourceVersion:"711", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fbc6f44f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7fbc6f44f-78sgc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic0bc0a94c32", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:34:54.085424 containerd[1556]: 2025-03-20 21:34:54.051 [INFO][4489] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-78sgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" Mar 20 21:34:54.085424 containerd[1556]: 2025-03-20 21:34:54.051 [INFO][4489] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic0bc0a94c32 ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-78sgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" Mar 20 21:34:54.085424 containerd[1556]: 2025-03-20 21:34:54.054 [INFO][4489] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-78sgc" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" Mar 20 21:34:54.085424 containerd[1556]: 2025-03-20 21:34:54.056 [INFO][4489] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-78sgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0", GenerateName:"calico-apiserver-7fbc6f44f-", Namespace:"calico-apiserver", SelfLink:"", UID:"48a21f5e-3708-4b74-ac02-926c8d0690e2", ResourceVersion:"711", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fbc6f44f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d", Pod:"calico-apiserver-7fbc6f44f-78sgc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic0bc0a94c32", MAC:"22:93:e0:01:5c:04", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 
21:34:54.085424 containerd[1556]: 2025-03-20 21:34:54.066 [INFO][4489] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" Namespace="calico-apiserver" Pod="calico-apiserver-7fbc6f44f-78sgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--7fbc6f44f--78sgc-eth0" Mar 20 21:34:54.126085 containerd[1556]: time="2025-03-20T21:34:54.125176145Z" level=info msg="connecting to shim 020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d" address="unix:///run/containerd/s/8fd139ae5e37b3503f64e2b19890fa357f697a9000935a0288a72341c692a1e0" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:54.162704 systemd[1]: Started cri-containerd-020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d.scope - libcontainer container 020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d. Mar 20 21:34:54.174591 systemd-resolved[1472]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 21:34:54.206511 containerd[1556]: time="2025-03-20T21:34:54.206380293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fbc6f44f-78sgc,Uid:48a21f5e-3708-4b74-ac02-926c8d0690e2,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d\"" Mar 20 21:34:54.313530 kubelet[2829]: I0320 21:34:54.313345 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-bfxj9" podStartSLOduration=37.313326783 podStartE2EDuration="37.313326783s" podCreationTimestamp="2025-03-20 21:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:34:54.289574953 +0000 UTC m=+41.455206817" watchObservedRunningTime="2025-03-20 21:34:54.313326783 +0000 UTC m=+41.478958648" Mar 20 21:34:54.747577 containerd[1556]: 
time="2025-03-20T21:34:54.747464655Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:54.748725 containerd[1556]: time="2025-03-20T21:34:54.748686294Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 20 21:34:54.749711 containerd[1556]: time="2025-03-20T21:34:54.749212231Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:54.750454 containerd[1556]: time="2025-03-20T21:34:54.750433112Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:54.751395 containerd[1556]: time="2025-03-20T21:34:54.751373489Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 2.867130037s" Mar 20 21:34:54.751474 containerd[1556]: time="2025-03-20T21:34:54.751464768Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 20 21:34:54.752421 containerd[1556]: time="2025-03-20T21:34:54.752394334Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 20 21:34:54.753990 containerd[1556]: time="2025-03-20T21:34:54.753954884Z" level=info msg="CreateContainer within sandbox \"e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd\" for container 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 20 21:34:54.783317 containerd[1556]: time="2025-03-20T21:34:54.783228569Z" level=info msg="Container 8d8d57a19f549ac0fe084fdf3814d2b569caf8de2d3318b719c7af0b65c96f1f: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:54.799121 containerd[1556]: time="2025-03-20T21:34:54.799094376Z" level=info msg="CreateContainer within sandbox \"e6dfded3fb051380011d72de9bf2e1d0adfba4442b3d64e036a7515c43cba8cd\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8d8d57a19f549ac0fe084fdf3814d2b569caf8de2d3318b719c7af0b65c96f1f\"" Mar 20 21:34:54.799748 containerd[1556]: time="2025-03-20T21:34:54.799452161Z" level=info msg="StartContainer for \"8d8d57a19f549ac0fe084fdf3814d2b569caf8de2d3318b719c7af0b65c96f1f\"" Mar 20 21:34:54.800264 containerd[1556]: time="2025-03-20T21:34:54.800245405Z" level=info msg="connecting to shim 8d8d57a19f549ac0fe084fdf3814d2b569caf8de2d3318b719c7af0b65c96f1f" address="unix:///run/containerd/s/2e7e5c4a7ad55eec3b46330b530a73d909cd164633c2db078e4c65060666a259" protocol=ttrpc version=3 Mar 20 21:34:54.816134 systemd[1]: Started cri-containerd-8d8d57a19f549ac0fe084fdf3814d2b569caf8de2d3318b719c7af0b65c96f1f.scope - libcontainer container 8d8d57a19f549ac0fe084fdf3814d2b569caf8de2d3318b719c7af0b65c96f1f. 
Mar 20 21:34:54.847136 systemd-networkd[1363]: calia6c37b1dabb: Gained IPv6LL Mar 20 21:34:54.867132 containerd[1556]: time="2025-03-20T21:34:54.867110804Z" level=info msg="StartContainer for \"8d8d57a19f549ac0fe084fdf3814d2b569caf8de2d3318b719c7af0b65c96f1f\" returns successfully" Mar 20 21:34:55.308498 kubelet[2829]: I0320 21:34:55.308225 2829 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 21:34:55.417972 kubelet[2829]: I0320 21:34:55.417649 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7fbc6f44f-9h4bk" podStartSLOduration=26.546731991 podStartE2EDuration="29.417637216s" podCreationTimestamp="2025-03-20 21:34:26 +0000 UTC" firstStartedPulling="2025-03-20 21:34:51.881326387 +0000 UTC m=+39.046958247" lastFinishedPulling="2025-03-20 21:34:54.752231606 +0000 UTC m=+41.917863472" observedRunningTime="2025-03-20 21:34:55.280668221 +0000 UTC m=+42.446300089" watchObservedRunningTime="2025-03-20 21:34:55.417637216 +0000 UTC m=+42.583269079" Mar 20 21:34:55.543292 containerd[1556]: time="2025-03-20T21:34:55.542840323Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" id:\"d34792d54822230233c0232b910f29389644d5507044f069e8b86abc9b795c1f\" pid:4632 exited_at:{seconds:1742506495 nanos:541517697}" Mar 20 21:34:55.601101 containerd[1556]: time="2025-03-20T21:34:55.600952939Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" id:\"7d057a194224a7742616500d840bd8d12797aaeea8b9f01be8afc6b071cbfc19\" pid:4657 exited_at:{seconds:1742506495 nanos:600745724}" Mar 20 21:34:56.063637 systemd-networkd[1363]: calic0bc0a94c32: Gained IPv6LL Mar 20 21:34:57.199185 containerd[1556]: time="2025-03-20T21:34:57.199155933Z" level=info msg="StopContainer for \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" 
with timeout 300 (s)" Mar 20 21:34:57.200386 containerd[1556]: time="2025-03-20T21:34:57.200372712Z" level=info msg="Stop container \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" with signal terminated" Mar 20 21:34:57.493261 containerd[1556]: time="2025-03-20T21:34:57.493131080Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" id:\"fba00ac3e92d265535c5909582b2e92e76e0517900927ec8ab675aac5ea5cddf\" pid:4699 exited_at:{seconds:1742506497 nanos:491767454}" Mar 20 21:34:57.532256 containerd[1556]: time="2025-03-20T21:34:57.532136072Z" level=info msg="StopContainer for \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" with timeout 5 (s)" Mar 20 21:34:57.532716 containerd[1556]: time="2025-03-20T21:34:57.532563792Z" level=info msg="Stop container \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" with signal terminated" Mar 20 21:34:57.563780 systemd[1]: cri-containerd-c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d.scope: Deactivated successfully. Mar 20 21:34:57.563953 systemd[1]: cri-containerd-c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d.scope: Consumed 1.009s CPU time, 168.1M memory peak, 25.2M read from disk, 636K written to disk. 
Mar 20 21:34:57.574341 containerd[1556]: time="2025-03-20T21:34:57.567060451Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" id:\"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" pid:3761 exited_at:{seconds:1742506497 nanos:566480456}" Mar 20 21:34:57.574341 containerd[1556]: time="2025-03-20T21:34:57.567084731Z" level=info msg="received exit event container_id:\"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" id:\"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" pid:3761 exited_at:{seconds:1742506497 nanos:566480456}" Mar 20 21:34:57.587911 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d-rootfs.mount: Deactivated successfully. Mar 20 21:34:57.930588 containerd[1556]: time="2025-03-20T21:34:57.930555653Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:57.942268 containerd[1556]: time="2025-03-20T21:34:57.942215137Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 20 21:34:57.963656 containerd[1556]: time="2025-03-20T21:34:57.963612316Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:57.978221 containerd[1556]: time="2025-03-20T21:34:57.978172544Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:57.978637 containerd[1556]: time="2025-03-20T21:34:57.978396584Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 3.22598306s" Mar 20 21:34:57.978637 containerd[1556]: time="2025-03-20T21:34:57.978417268Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 20 21:34:57.979367 containerd[1556]: time="2025-03-20T21:34:57.979128973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 20 21:34:58.033035 containerd[1556]: time="2025-03-20T21:34:58.032993687Z" level=info msg="CreateContainer within sandbox \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 20 21:34:58.135848 containerd[1556]: time="2025-03-20T21:34:58.135636317Z" level=info msg="Container f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:58.184476 containerd[1556]: time="2025-03-20T21:34:58.183371296Z" level=info msg="CreateContainer within sandbox \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\"" Mar 20 21:34:58.184476 containerd[1556]: time="2025-03-20T21:34:58.183840941Z" level=info msg="StartContainer for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\"" Mar 20 21:34:58.191747 containerd[1556]: time="2025-03-20T21:34:58.184508834Z" level=info msg="connecting to shim f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca" 
address="unix:///run/containerd/s/774ad575e29f821bcdbaad8bcc12c18b68fe61ddf6234688e7167ee208c31632" protocol=ttrpc version=3 Mar 20 21:34:58.204287 systemd[1]: Started cri-containerd-f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca.scope - libcontainer container f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca. Mar 20 21:34:58.252595 containerd[1556]: time="2025-03-20T21:34:58.252524058Z" level=info msg="StartContainer for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" returns successfully" Mar 20 21:34:58.261929 containerd[1556]: time="2025-03-20T21:34:58.261895456Z" level=info msg="StopContainer for \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" returns successfully" Mar 20 21:34:58.264082 containerd[1556]: time="2025-03-20T21:34:58.264025211Z" level=info msg="StopPodSandbox for \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\"" Mar 20 21:34:58.269238 containerd[1556]: time="2025-03-20T21:34:58.269204574Z" level=info msg="Container to stop \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 21:34:58.269238 containerd[1556]: time="2025-03-20T21:34:58.269233116Z" level=info msg="Container to stop \"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 21:34:58.269238 containerd[1556]: time="2025-03-20T21:34:58.269239804Z" level=info msg="Container to stop \"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 21:34:58.277059 containerd[1556]: time="2025-03-20T21:34:58.276176422Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" id:\"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" pid:3323 
exit_status:137 exited_at:{seconds:1742506498 nanos:276010888}" Mar 20 21:34:58.276311 systemd[1]: cri-containerd-e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653.scope: Deactivated successfully. Mar 20 21:34:58.282366 containerd[1556]: time="2025-03-20T21:34:58.282039079Z" level=info msg="StopContainer for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" with timeout 30 (s)" Mar 20 21:34:58.283186 containerd[1556]: time="2025-03-20T21:34:58.283162672Z" level=info msg="Stop container \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" with signal terminated" Mar 20 21:34:58.298054 kubelet[2829]: I0320 21:34:58.297792 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-57c7f44ccb-nf8xh" podStartSLOduration=27.305330453 podStartE2EDuration="33.297779017s" podCreationTimestamp="2025-03-20 21:34:25 +0000 UTC" firstStartedPulling="2025-03-20 21:34:51.986474238 +0000 UTC m=+39.152106100" lastFinishedPulling="2025-03-20 21:34:57.978922805 +0000 UTC m=+45.144554664" observedRunningTime="2025-03-20 21:34:58.297258475 +0000 UTC m=+45.462890343" watchObservedRunningTime="2025-03-20 21:34:58.297779017 +0000 UTC m=+45.463410879" Mar 20 21:34:58.304448 systemd[1]: cri-containerd-f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca.scope: Deactivated successfully. 
Mar 20 21:34:58.314066 containerd[1556]: time="2025-03-20T21:34:58.313911356Z" level=info msg="received exit event container_id:\"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" id:\"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" pid:4745 exit_status:2 exited_at:{seconds:1742506498 nanos:312553214}" Mar 20 21:34:58.318572 containerd[1556]: time="2025-03-20T21:34:58.315503465Z" level=error msg="ExecSync for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" failed" error="rpc error: code = Unknown desc = failed to exec in container: failed to start exec \"b9c34d373927fd6f22aeeb4ef12d8f54b4b127ea34687c51475c876b71aa0aed\": OCI runtime exec failed: exec failed: cannot exec in a stopped container" Mar 20 21:34:58.324457 kubelet[2829]: E0320 21:34:58.323915 2829 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to start exec \"b9c34d373927fd6f22aeeb4ef12d8f54b4b127ea34687c51475c876b71aa0aed\": OCI runtime exec failed: exec failed: cannot exec in a stopped container" containerID="f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca" cmd=["/usr/bin/check-status","-r"] Mar 20 21:34:58.330113 containerd[1556]: time="2025-03-20T21:34:58.330087695Z" level=info msg="shim disconnected" id=e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653 namespace=k8s.io Mar 20 21:34:58.330113 containerd[1556]: time="2025-03-20T21:34:58.330105797Z" level=warning msg="cleaning up after shim disconnected" id=e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653 namespace=k8s.io Mar 20 21:34:58.330347 containerd[1556]: time="2025-03-20T21:34:58.330112943Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 20 21:34:58.354811 containerd[1556]: time="2025-03-20T21:34:58.354215951Z" level=error msg="ExecSync for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" failed" error="rpc error: code = Unknown 
desc = failed to exec in container: failed to create exec \"f3ee22f17fa3f255955a99b2f967c889e813ad1e9130f5c399d96698e7033d56\": cannot exec in a deleted state" Mar 20 21:34:58.354993 kubelet[2829]: E0320 21:34:58.354381 2829 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"f3ee22f17fa3f255955a99b2f967c889e813ad1e9130f5c399d96698e7033d56\": cannot exec in a deleted state" containerID="f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca" cmd=["/usr/bin/check-status","-r"] Mar 20 21:34:58.356297 containerd[1556]: time="2025-03-20T21:34:58.356265819Z" level=error msg="ExecSync for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" Mar 20 21:34:58.356528 kubelet[2829]: E0320 21:34:58.356500 2829 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca" cmd=["/usr/bin/check-status","-r"] Mar 20 21:34:58.358437 containerd[1556]: time="2025-03-20T21:34:58.358398871Z" level=info msg="StopContainer for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" returns successfully" Mar 20 21:34:58.361555 containerd[1556]: time="2025-03-20T21:34:58.358659963Z" level=info msg="StopPodSandbox for \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\"" Mar 20 21:34:58.361555 containerd[1556]: time="2025-03-20T21:34:58.358777734Z" level=info msg="Container to stop \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 21:34:58.364864 systemd[1]: cri-containerd-4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840.scope: Deactivated 
successfully. Mar 20 21:34:58.381387 containerd[1556]: time="2025-03-20T21:34:58.381106104Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" id:\"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" pid:4745 exit_status:2 exited_at:{seconds:1742506498 nanos:312553214}" Mar 20 21:34:58.381387 containerd[1556]: time="2025-03-20T21:34:58.381136158Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" id:\"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" pid:4249 exit_status:137 exited_at:{seconds:1742506498 nanos:367040297}" Mar 20 21:34:58.381387 containerd[1556]: time="2025-03-20T21:34:58.381297803Z" level=info msg="TearDown network for sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" successfully" Mar 20 21:34:58.381387 containerd[1556]: time="2025-03-20T21:34:58.381306704Z" level=info msg="StopPodSandbox for \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" returns successfully" Mar 20 21:34:58.382410 containerd[1556]: time="2025-03-20T21:34:58.382209395Z" level=info msg="received exit event sandbox_id:\"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" exit_status:137 exited_at:{seconds:1742506498 nanos:276010888}" Mar 20 21:34:58.397652 containerd[1556]: time="2025-03-20T21:34:58.397625898Z" level=info msg="shim disconnected" id=4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840 namespace=k8s.io Mar 20 21:34:58.397652 containerd[1556]: time="2025-03-20T21:34:58.397648855Z" level=warning msg="cleaning up after shim disconnected" id=4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840 namespace=k8s.io Mar 20 21:34:58.398312 containerd[1556]: time="2025-03-20T21:34:58.397656720Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 20 21:34:58.398312 containerd[1556]: 
time="2025-03-20T21:34:58.398068047Z" level=info msg="received exit event sandbox_id:\"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" exit_status:137 exited_at:{seconds:1742506498 nanos:367040297}" Mar 20 21:34:58.461835 kubelet[2829]: E0320 21:34:58.461749 2829 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="392fe464-00a3-419d-9a63-19dc8518fdfc" containerName="install-cni" Mar 20 21:34:58.461835 kubelet[2829]: E0320 21:34:58.461787 2829 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="392fe464-00a3-419d-9a63-19dc8518fdfc" containerName="flexvol-driver" Mar 20 21:34:58.461835 kubelet[2829]: E0320 21:34:58.461794 2829 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="392fe464-00a3-419d-9a63-19dc8518fdfc" containerName="calico-node" Mar 20 21:34:58.473900 kubelet[2829]: I0320 21:34:58.461842 2829 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fe464-00a3-419d-9a63-19dc8518fdfc" containerName="calico-node" Mar 20 21:34:58.502893 systemd[1]: Created slice kubepods-besteffort-pod3b311483_d0df_470e_b323_70b7da6a4177.slice - libcontainer container kubepods-besteffort-pod3b311483_d0df_470e_b323_70b7da6a4177.slice. Mar 20 21:34:58.512904 systemd[1]: cri-containerd-45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c.scope: Deactivated successfully. Mar 20 21:34:58.513171 systemd[1]: cri-containerd-45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c.scope: Consumed 149ms CPU time, 30.5M memory peak, 16.6M read from disk. 
Mar 20 21:34:58.514041 containerd[1556]: time="2025-03-20T21:34:58.513945763Z" level=info msg="received exit event container_id:\"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" id:\"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" pid:3393 exit_status:1 exited_at:{seconds:1742506498 nanos:513754138}" Mar 20 21:34:58.514781 containerd[1556]: time="2025-03-20T21:34:58.514769575Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" id:\"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" pid:3393 exit_status:1 exited_at:{seconds:1742506498 nanos:513754138}" Mar 20 21:34:58.526988 kubelet[2829]: I0320 21:34:58.525986 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/392fe464-00a3-419d-9a63-19dc8518fdfc-node-certs\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.527356 kubelet[2829]: I0320 21:34:58.527343 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-flexvol-driver-host\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.527432 kubelet[2829]: I0320 21:34:58.527424 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-xtables-lock\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.527486 kubelet[2829]: I0320 21:34:58.527480 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-net-dir\") pod 
\"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.527613 kubelet[2829]: I0320 21:34:58.527606 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/392fe464-00a3-419d-9a63-19dc8518fdfc-tigera-ca-bundle\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.527677 kubelet[2829]: I0320 21:34:58.527663 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h75pd\" (UniqueName: \"kubernetes.io/projected/392fe464-00a3-419d-9a63-19dc8518fdfc-kube-api-access-h75pd\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.527938 kubelet[2829]: I0320 21:34:58.527713 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-lib-modules\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.528455 kubelet[2829]: I0320 21:34:58.527742 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-policysync\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.528455 kubelet[2829]: I0320 21:34:58.527994 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-var-lib-calico\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.528455 kubelet[2829]: I0320 21:34:58.528004 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-bin-dir\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.528455 kubelet[2829]: I0320 21:34:58.528014 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-var-run-calico\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.528455 kubelet[2829]: I0320 21:34:58.528022 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-log-dir\") pod \"392fe464-00a3-419d-9a63-19dc8518fdfc\" (UID: \"392fe464-00a3-419d-9a63-19dc8518fdfc\") " Mar 20 21:34:58.528455 kubelet[2829]: I0320 21:34:58.528080 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-policysync\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.529945 kubelet[2829]: I0320 21:34:58.528106 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-xtables-lock\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.529945 kubelet[2829]: I0320 21:34:58.528118 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b311483-d0df-470e-b323-70b7da6a4177-tigera-ca-bundle\") pod \"calico-node-9kl56\" (UID: 
\"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.529945 kubelet[2829]: I0320 21:34:58.528131 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-cni-log-dir\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.529945 kubelet[2829]: I0320 21:34:58.528141 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-var-run-calico\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.529945 kubelet[2829]: I0320 21:34:58.528152 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-var-lib-calico\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.530097 kubelet[2829]: I0320 21:34:58.528166 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s47xs\" (UniqueName: \"kubernetes.io/projected/3b311483-d0df-470e-b323-70b7da6a4177-kube-api-access-s47xs\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.530097 kubelet[2829]: I0320 21:34:58.528179 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-lib-modules\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " 
pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.530097 kubelet[2829]: I0320 21:34:58.528188 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-cni-bin-dir\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.530097 kubelet[2829]: I0320 21:34:58.528197 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-cni-net-dir\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.530097 kubelet[2829]: I0320 21:34:58.528206 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3b311483-d0df-470e-b323-70b7da6a4177-node-certs\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.530204 kubelet[2829]: I0320 21:34:58.528219 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3b311483-d0df-470e-b323-70b7da6a4177-flexvol-driver-host\") pod \"calico-node-9kl56\" (UID: \"3b311483-d0df-470e-b323-70b7da6a4177\") " pod="calico-system/calico-node-9kl56" Mar 20 21:34:58.556264 systemd-networkd[1363]: cali75795a60728: Link DOWN Mar 20 21:34:58.556270 systemd-networkd[1363]: cali75795a60728: Lost carrier Mar 20 21:34:58.560322 kubelet[2829]: I0320 21:34:58.555956 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod 
"392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.570076 kubelet[2829]: I0320 21:34:58.560518 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fe464-00a3-419d-9a63-19dc8518fdfc-node-certs" (OuterVolumeSpecName: "node-certs") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 21:34:58.570076 kubelet[2829]: I0320 21:34:58.561028 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-policysync" (OuterVolumeSpecName: "policysync") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.570076 kubelet[2829]: I0320 21:34:58.561054 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.572404 kubelet[2829]: I0320 21:34:58.572362 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "var-lib-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.572404 kubelet[2829]: I0320 21:34:58.572402 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.576394 kubelet[2829]: I0320 21:34:58.572414 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.576394 kubelet[2829]: I0320 21:34:58.572427 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.576394 kubelet[2829]: I0320 21:34:58.554524 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "xtables-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.585573 containerd[1556]: time="2025-03-20T21:34:58.583658309Z" level=info msg="StopContainer for \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" returns successfully" Mar 20 21:34:58.588453 kubelet[2829]: I0320 21:34:58.586616 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 21:34:58.587960 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca-rootfs.mount: Deactivated successfully. Mar 20 21:34:58.588670 containerd[1556]: time="2025-03-20T21:34:58.586833512Z" level=info msg="StopPodSandbox for \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\"" Mar 20 21:34:58.588670 containerd[1556]: time="2025-03-20T21:34:58.586896056Z" level=info msg="Container to stop \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 21:34:58.588029 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840-rootfs.mount: Deactivated successfully. Mar 20 21:34:58.588084 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840-shm.mount: Deactivated successfully. Mar 20 21:34:58.588130 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c-rootfs.mount: Deactivated successfully. 
Mar 20 21:34:58.588167 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653-rootfs.mount: Deactivated successfully. Mar 20 21:34:58.588203 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653-shm.mount: Deactivated successfully. Mar 20 21:34:58.588240 systemd[1]: var-lib-kubelet-pods-392fe464\x2d00a3\x2d419d\x2d9a63\x2d19dc8518fdfc-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Mar 20 21:34:58.596784 kubelet[2829]: I0320 21:34:58.596147 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392fe464-00a3-419d-9a63-19dc8518fdfc-kube-api-access-h75pd" (OuterVolumeSpecName: "kube-api-access-h75pd") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "kube-api-access-h75pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 21:34:58.598455 systemd[1]: var-lib-kubelet-pods-392fe464\x2d00a3\x2d419d\x2d9a63\x2d19dc8518fdfc-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dh75pd.mount: Deactivated successfully. Mar 20 21:34:58.604184 systemd[1]: cri-containerd-f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d.scope: Deactivated successfully. Mar 20 21:34:58.610161 systemd[1]: var-lib-kubelet-pods-392fe464\x2d00a3\x2d419d\x2d9a63\x2d19dc8518fdfc-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. Mar 20 21:34:58.612540 kubelet[2829]: I0320 21:34:58.612364 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/392fe464-00a3-419d-9a63-19dc8518fdfc-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "392fe464-00a3-419d-9a63-19dc8518fdfc" (UID: "392fe464-00a3-419d-9a63-19dc8518fdfc"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 21:34:58.612747 containerd[1556]: time="2025-03-20T21:34:58.612379152Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" id:\"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" pid:3274 exit_status:137 exited_at:{seconds:1742506498 nanos:610824131}" Mar 20 21:34:58.629505 kubelet[2829]: I0320 21:34:58.629455 2829 reconciler_common.go:288] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/392fe464-00a3-419d-9a63-19dc8518fdfc-node-certs\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.629505 kubelet[2829]: I0320 21:34:58.629472 2829 reconciler_common.go:288] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-flexvol-driver-host\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.629505 kubelet[2829]: I0320 21:34:58.629478 2829 reconciler_common.go:288] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-xtables-lock\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.629505 kubelet[2829]: I0320 21:34:58.629485 2829 reconciler_common.go:288] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-lib-modules\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.629505 kubelet[2829]: I0320 21:34:58.629490 2829 reconciler_common.go:288] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-policysync\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.629505 kubelet[2829]: I0320 21:34:58.629496 2829 reconciler_common.go:288] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-log-dir\") on node \"localhost\" 
DevicePath \"\"" Mar 20 21:34:58.629505 kubelet[2829]: I0320 21:34:58.629501 2829 reconciler_common.go:288] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-net-dir\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.629505 kubelet[2829]: I0320 21:34:58.629506 2829 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-h75pd\" (UniqueName: \"kubernetes.io/projected/392fe464-00a3-419d-9a63-19dc8518fdfc-kube-api-access-h75pd\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.630228 kubelet[2829]: I0320 21:34:58.629511 2829 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/392fe464-00a3-419d-9a63-19dc8518fdfc-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.630228 kubelet[2829]: I0320 21:34:58.629516 2829 reconciler_common.go:288] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-var-lib-calico\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.630228 kubelet[2829]: I0320 21:34:58.629520 2829 reconciler_common.go:288] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-cni-bin-dir\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.630228 kubelet[2829]: I0320 21:34:58.629525 2829 reconciler_common.go:288] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/392fe464-00a3-419d-9a63-19dc8518fdfc-var-run-calico\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.654851 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d-rootfs.mount: Deactivated successfully. 
Mar 20 21:34:58.663784 containerd[1556]: time="2025-03-20T21:34:58.663465143Z" level=info msg="shim disconnected" id=f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d namespace=k8s.io Mar 20 21:34:58.663784 containerd[1556]: time="2025-03-20T21:34:58.663496986Z" level=warning msg="cleaning up after shim disconnected" id=f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d namespace=k8s.io Mar 20 21:34:58.663784 containerd[1556]: time="2025-03-20T21:34:58.663504401Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 20 21:34:58.679702 containerd[1556]: time="2025-03-20T21:34:58.679557610Z" level=info msg="received exit event sandbox_id:\"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" exit_status:137 exited_at:{seconds:1742506498 nanos:610824131}" Mar 20 21:34:58.679996 containerd[1556]: time="2025-03-20T21:34:58.679869877Z" level=info msg="TearDown network for sandbox \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" successfully" Mar 20 21:34:58.679996 containerd[1556]: time="2025-03-20T21:34:58.679910819Z" level=info msg="StopPodSandbox for \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" returns successfully" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.550 [INFO][4873] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.555 [INFO][4873] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" iface="eth0" netns="/var/run/netns/cni-937f72c8-f399-13b9-5151-d068aa5a34bf" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.555 [INFO][4873] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" iface="eth0" netns="/var/run/netns/cni-937f72c8-f399-13b9-5151-d068aa5a34bf" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.564 [INFO][4873] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" after=8.604154ms iface="eth0" netns="/var/run/netns/cni-937f72c8-f399-13b9-5151-d068aa5a34bf" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.564 [INFO][4873] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.564 [INFO][4873] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.642 [INFO][4896] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.647 [INFO][4896] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.647 [INFO][4896] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.678 [INFO][4896] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.678 [INFO][4896] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0" Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.679 [INFO][4896] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 21:34:58.684569 containerd[1556]: 2025-03-20 21:34:58.682 [INFO][4873] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Mar 20 21:34:58.685207 containerd[1556]: time="2025-03-20T21:34:58.684962062Z" level=info msg="TearDown network for sandbox \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" successfully" Mar 20 21:34:58.685207 containerd[1556]: time="2025-03-20T21:34:58.684976209Z" level=info msg="StopPodSandbox for \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" returns successfully" Mar 20 21:34:58.730234 kubelet[2829]: I0320 21:34:58.730160 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5817264a-2fc1-4344-970b-f88ce7a77bd6-tigera-ca-bundle\") pod \"5817264a-2fc1-4344-970b-f88ce7a77bd6\" (UID: \"5817264a-2fc1-4344-970b-f88ce7a77bd6\") " Mar 20 21:34:58.730612 kubelet[2829]: I0320 21:34:58.730381 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3db47d74-c752-4fcf-8454-2fdc29ed4c0f-tigera-ca-bundle\") pod \"3db47d74-c752-4fcf-8454-2fdc29ed4c0f\" (UID: \"3db47d74-c752-4fcf-8454-2fdc29ed4c0f\") " Mar 20 21:34:58.730612 kubelet[2829]: I0320 21:34:58.730401 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5817264a-2fc1-4344-970b-f88ce7a77bd6-typha-certs\") pod \"5817264a-2fc1-4344-970b-f88ce7a77bd6\" (UID: \"5817264a-2fc1-4344-970b-f88ce7a77bd6\") " Mar 20 21:34:58.730612 kubelet[2829]: I0320 21:34:58.730415 2829 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4h9f\" (UniqueName: \"kubernetes.io/projected/3db47d74-c752-4fcf-8454-2fdc29ed4c0f-kube-api-access-p4h9f\") pod \"3db47d74-c752-4fcf-8454-2fdc29ed4c0f\" (UID: \"3db47d74-c752-4fcf-8454-2fdc29ed4c0f\") " Mar 20 21:34:58.730612 kubelet[2829]: I0320 21:34:58.730554 2829 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sv8fz\" (UniqueName: \"kubernetes.io/projected/5817264a-2fc1-4344-970b-f88ce7a77bd6-kube-api-access-sv8fz\") pod \"5817264a-2fc1-4344-970b-f88ce7a77bd6\" (UID: \"5817264a-2fc1-4344-970b-f88ce7a77bd6\") " Mar 20 21:34:58.750775 kubelet[2829]: I0320 21:34:58.750745 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db47d74-c752-4fcf-8454-2fdc29ed4c0f-kube-api-access-p4h9f" (OuterVolumeSpecName: "kube-api-access-p4h9f") pod "3db47d74-c752-4fcf-8454-2fdc29ed4c0f" (UID: "3db47d74-c752-4fcf-8454-2fdc29ed4c0f"). InnerVolumeSpecName "kube-api-access-p4h9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 21:34:58.751613 kubelet[2829]: I0320 21:34:58.751596 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5817264a-2fc1-4344-970b-f88ce7a77bd6-kube-api-access-sv8fz" (OuterVolumeSpecName: "kube-api-access-sv8fz") pod "5817264a-2fc1-4344-970b-f88ce7a77bd6" (UID: "5817264a-2fc1-4344-970b-f88ce7a77bd6"). InnerVolumeSpecName "kube-api-access-sv8fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 21:34:58.751742 kubelet[2829]: I0320 21:34:58.751727 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5817264a-2fc1-4344-970b-f88ce7a77bd6-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "5817264a-2fc1-4344-970b-f88ce7a77bd6" (UID: "5817264a-2fc1-4344-970b-f88ce7a77bd6"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 21:34:58.753499 kubelet[2829]: I0320 21:34:58.753486 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5817264a-2fc1-4344-970b-f88ce7a77bd6-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "5817264a-2fc1-4344-970b-f88ce7a77bd6" (UID: "5817264a-2fc1-4344-970b-f88ce7a77bd6"). 
InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 21:34:58.766524 kubelet[2829]: I0320 21:34:58.766486 2829 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db47d74-c752-4fcf-8454-2fdc29ed4c0f-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "3db47d74-c752-4fcf-8454-2fdc29ed4c0f" (UID: "3db47d74-c752-4fcf-8454-2fdc29ed4c0f"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 21:34:58.807264 containerd[1556]: time="2025-03-20T21:34:58.807231120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9kl56,Uid:3b311483-d0df-470e-b323-70b7da6a4177,Namespace:calico-system,Attempt:0,}" Mar 20 21:34:58.821105 containerd[1556]: time="2025-03-20T21:34:58.820715448Z" level=info msg="connecting to shim 7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1" address="unix:///run/containerd/s/47dec1499c8834f2ef316644e4a9bfb9654195c1af19cd3150bd92f51acaae2e" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:34:58.831822 kubelet[2829]: I0320 21:34:58.831801 2829 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-p4h9f\" (UniqueName: \"kubernetes.io/projected/3db47d74-c752-4fcf-8454-2fdc29ed4c0f-kube-api-access-p4h9f\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.831981 kubelet[2829]: I0320 21:34:58.831947 2829 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-sv8fz\" (UniqueName: \"kubernetes.io/projected/5817264a-2fc1-4344-970b-f88ce7a77bd6-kube-api-access-sv8fz\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.831981 kubelet[2829]: I0320 21:34:58.831957 2829 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5817264a-2fc1-4344-970b-f88ce7a77bd6-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.831981 kubelet[2829]: I0320 
21:34:58.831963 2829 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3db47d74-c752-4fcf-8454-2fdc29ed4c0f-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.831981 kubelet[2829]: I0320 21:34:58.831969 2829 reconciler_common.go:288] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5817264a-2fc1-4344-970b-f88ce7a77bd6-typha-certs\") on node \"localhost\" DevicePath \"\"" Mar 20 21:34:58.842227 systemd[1]: Started cri-containerd-7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1.scope - libcontainer container 7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1. Mar 20 21:34:58.865945 containerd[1556]: time="2025-03-20T21:34:58.865777837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9kl56,Uid:3b311483-d0df-470e-b323-70b7da6a4177,Namespace:calico-system,Attempt:0,} returns sandbox id \"7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1\"" Mar 20 21:34:58.868066 containerd[1556]: time="2025-03-20T21:34:58.867199034Z" level=info msg="CreateContainer within sandbox \"7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 20 21:34:58.893917 containerd[1556]: time="2025-03-20T21:34:58.893884922Z" level=info msg="Container f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:58.897325 containerd[1556]: time="2025-03-20T21:34:58.897299428Z" level=info msg="CreateContainer within sandbox \"7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373\"" Mar 20 21:34:58.897994 containerd[1556]: time="2025-03-20T21:34:58.897670469Z" level=info msg="StartContainer for 
\"f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373\"" Mar 20 21:34:58.898441 containerd[1556]: time="2025-03-20T21:34:58.898426050Z" level=info msg="connecting to shim f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373" address="unix:///run/containerd/s/47dec1499c8834f2ef316644e4a9bfb9654195c1af19cd3150bd92f51acaae2e" protocol=ttrpc version=3 Mar 20 21:34:58.911167 systemd[1]: Started cri-containerd-f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373.scope - libcontainer container f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373. Mar 20 21:34:58.941863 containerd[1556]: time="2025-03-20T21:34:58.941581898Z" level=info msg="StartContainer for \"f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373\" returns successfully" Mar 20 21:34:58.946486 systemd[1]: Removed slice kubepods-besteffort-pod392fe464_00a3_419d_9a63_19dc8518fdfc.slice - libcontainer container kubepods-besteffort-pod392fe464_00a3_419d_9a63_19dc8518fdfc.slice. Mar 20 21:34:58.946597 systemd[1]: kubepods-besteffort-pod392fe464_00a3_419d_9a63_19dc8518fdfc.slice: Consumed 1.327s CPU time, 199M memory peak, 25.4M read from disk, 161M written to disk. Mar 20 21:34:58.947511 systemd[1]: Removed slice kubepods-besteffort-pod5817264a_2fc1_4344_970b_f88ce7a77bd6.slice - libcontainer container kubepods-besteffort-pod5817264a_2fc1_4344_970b_f88ce7a77bd6.slice. Mar 20 21:34:58.947585 systemd[1]: kubepods-besteffort-pod5817264a_2fc1_4344_970b_f88ce7a77bd6.slice: Consumed 173ms CPU time, 31.2M memory peak, 16.6M read from disk. Mar 20 21:34:58.948808 systemd[1]: Removed slice kubepods-besteffort-pod3db47d74_c752_4fcf_8454_2fdc29ed4c0f.slice - libcontainer container kubepods-besteffort-pod3db47d74_c752_4fcf_8454_2fdc29ed4c0f.slice. Mar 20 21:34:59.097595 systemd[1]: cri-containerd-f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373.scope: Deactivated successfully. 
Mar 20 21:34:59.098346 systemd[1]: cri-containerd-f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373.scope: Consumed 23ms CPU time, 19.4M memory peak, 11.2M read from disk, 6.3M written to disk. Mar 20 21:34:59.099905 containerd[1556]: time="2025-03-20T21:34:59.099876263Z" level=info msg="received exit event container_id:\"f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373\" id:\"f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373\" pid:5009 exited_at:{seconds:1742506499 nanos:99588667}" Mar 20 21:34:59.100172 containerd[1556]: time="2025-03-20T21:34:59.100032163Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373\" id:\"f775a43afc59eac1952c0405244e2b62d0eeb58afac51acc4ea47ab2bf385373\" pid:5009 exited_at:{seconds:1742506499 nanos:99588667}" Mar 20 21:34:59.306583 kubelet[2829]: I0320 21:34:59.306560 2829 scope.go:117] "RemoveContainer" containerID="45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c" Mar 20 21:34:59.312691 containerd[1556]: time="2025-03-20T21:34:59.312653402Z" level=info msg="RemoveContainer for \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\"" Mar 20 21:34:59.351558 containerd[1556]: time="2025-03-20T21:34:59.350736971Z" level=info msg="RemoveContainer for \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" returns successfully" Mar 20 21:34:59.357747 kubelet[2829]: I0320 21:34:59.357703 2829 scope.go:117] "RemoveContainer" containerID="45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c" Mar 20 21:34:59.358355 containerd[1556]: time="2025-03-20T21:34:59.358111548Z" level=error msg="ContainerStatus for \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\": not found" Mar 20 
21:34:59.358595 containerd[1556]: time="2025-03-20T21:34:59.358584287Z" level=info msg="CreateContainer within sandbox \"7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 20 21:34:59.386842 kubelet[2829]: E0320 21:34:59.386789 2829 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\": not found" containerID="45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c" Mar 20 21:34:59.395984 containerd[1556]: time="2025-03-20T21:34:59.395775594Z" level=info msg="Container 5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:59.401595 kubelet[2829]: I0320 21:34:59.390538 2829 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c"} err="failed to get container status \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\": rpc error: code = NotFound desc = an error occurred when try to find container \"45f57a3a8183914717dc1f1ac4957294dec86510d683cebc9498d911a2b04c1c\": not found" Mar 20 21:34:59.401595 kubelet[2829]: I0320 21:34:59.400021 2829 scope.go:117] "RemoveContainer" containerID="c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d" Mar 20 21:34:59.408444 containerd[1556]: time="2025-03-20T21:34:59.408421161Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:59.409351 containerd[1556]: time="2025-03-20T21:34:59.409323112Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 20 21:34:59.410655 containerd[1556]: time="2025-03-20T21:34:59.410550400Z" 
level=info msg="CreateContainer within sandbox \"7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7\"" Mar 20 21:34:59.411216 containerd[1556]: time="2025-03-20T21:34:59.411119197Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:59.413217 containerd[1556]: time="2025-03-20T21:34:59.413191279Z" level=info msg="RemoveContainer for \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\"" Mar 20 21:34:59.414393 containerd[1556]: time="2025-03-20T21:34:59.414207736Z" level=info msg="StartContainer for \"5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7\"" Mar 20 21:34:59.414777 containerd[1556]: time="2025-03-20T21:34:59.414703643Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:59.416510 containerd[1556]: time="2025-03-20T21:34:59.416471240Z" level=info msg="RemoveContainer for \"c628925e2cd8cfb311d9fa1f132525fdb6233e7e7f4863286a1bbe3663075c8d\" returns successfully" Mar 20 21:34:59.417412 containerd[1556]: time="2025-03-20T21:34:59.417391001Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 1.438245507s" Mar 20 21:34:59.417412 containerd[1556]: time="2025-03-20T21:34:59.417411521Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference 
\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 20 21:34:59.417954 containerd[1556]: time="2025-03-20T21:34:59.417937377Z" level=info msg="connecting to shim 5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7" address="unix:///run/containerd/s/47dec1499c8834f2ef316644e4a9bfb9654195c1af19cd3150bd92f51acaae2e" protocol=ttrpc version=3 Mar 20 21:34:59.418504 kubelet[2829]: I0320 21:34:59.418404 2829 scope.go:117] "RemoveContainer" containerID="d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c" Mar 20 21:34:59.419574 containerd[1556]: time="2025-03-20T21:34:59.419385561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 20 21:34:59.421742 containerd[1556]: time="2025-03-20T21:34:59.421124027Z" level=info msg="RemoveContainer for \"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\"" Mar 20 21:34:59.424764 containerd[1556]: time="2025-03-20T21:34:59.424658843Z" level=info msg="CreateContainer within sandbox \"7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 20 21:34:59.463247 systemd[1]: Started cri-containerd-5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7.scope - libcontainer container 5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7. 
Mar 20 21:34:59.480806 containerd[1556]: time="2025-03-20T21:34:59.480680785Z" level=info msg="RemoveContainer for \"d4d77dcc2bd184b2f5139b08ac062f47dba4a3b94d7f71db1acc333106324a8c\" returns successfully" Mar 20 21:34:59.489757 kubelet[2829]: I0320 21:34:59.489739 2829 scope.go:117] "RemoveContainer" containerID="d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460" Mar 20 21:34:59.547570 containerd[1556]: time="2025-03-20T21:34:59.547384966Z" level=info msg="RemoveContainer for \"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\"" Mar 20 21:34:59.553666 containerd[1556]: time="2025-03-20T21:34:59.553640270Z" level=info msg="StartContainer for \"5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7\" returns successfully" Mar 20 21:34:59.561591 containerd[1556]: time="2025-03-20T21:34:59.561567159Z" level=info msg="Container 73ed3a594809ab79ee3c06e7a045de7af47f84a0b16361c68819e7786dc1a0f7: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:59.572924 containerd[1556]: time="2025-03-20T21:34:59.572443697Z" level=info msg="RemoveContainer for \"d12f184ee205b2b636eedb4340b8e9534b56c82e19eee1915ee6c65f8cc95460\" returns successfully" Mar 20 21:34:59.573128 kubelet[2829]: I0320 21:34:59.572603 2829 scope.go:117] "RemoveContainer" containerID="f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca" Mar 20 21:34:59.573813 containerd[1556]: time="2025-03-20T21:34:59.573788391Z" level=info msg="RemoveContainer for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\"" Mar 20 21:34:59.576627 containerd[1556]: time="2025-03-20T21:34:59.576460462Z" level=info msg="RemoveContainer for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" returns successfully" Mar 20 21:34:59.577057 kubelet[2829]: I0320 21:34:59.576681 2829 scope.go:117] "RemoveContainer" containerID="f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca" Mar 20 21:34:59.577348 containerd[1556]: 
time="2025-03-20T21:34:59.577290681Z" level=error msg="ContainerStatus for \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\": not found" Mar 20 21:34:59.577442 kubelet[2829]: E0320 21:34:59.577428 2829 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\": not found" containerID="f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca" Mar 20 21:34:59.577608 kubelet[2829]: I0320 21:34:59.577490 2829 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca"} err="failed to get container status \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\": rpc error: code = NotFound desc = an error occurred when try to find container \"f8e4a2025016ae00894417f340093ee806036d9086938b27559ab6bc312868ca\": not found" Mar 20 21:34:59.578021 containerd[1556]: time="2025-03-20T21:34:59.578007587Z" level=info msg="CreateContainer within sandbox \"7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"73ed3a594809ab79ee3c06e7a045de7af47f84a0b16361c68819e7786dc1a0f7\"" Mar 20 21:34:59.578336 containerd[1556]: time="2025-03-20T21:34:59.578325869Z" level=info msg="StartContainer for \"73ed3a594809ab79ee3c06e7a045de7af47f84a0b16361c68819e7786dc1a0f7\"" Mar 20 21:34:59.579734 containerd[1556]: time="2025-03-20T21:34:59.579665314Z" level=info msg="connecting to shim 73ed3a594809ab79ee3c06e7a045de7af47f84a0b16361c68819e7786dc1a0f7" address="unix:///run/containerd/s/4637be4c6d1b869a891b7c875d401397de533e0992186a16a279faa6bba0b276" protocol=ttrpc 
version=3 Mar 20 21:34:59.591967 systemd[1]: var-lib-kubelet-pods-3db47d74\x2dc752\x2d4fcf\x2d8454\x2d2fdc29ed4c0f-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. Mar 20 21:34:59.592720 systemd[1]: run-netns-cni\x2d937f72c8\x2df399\x2d13b9\x2d5151\x2dd068aa5a34bf.mount: Deactivated successfully. Mar 20 21:34:59.592838 systemd[1]: var-lib-kubelet-pods-3db47d74\x2dc752\x2d4fcf\x2d8454\x2d2fdc29ed4c0f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dp4h9f.mount: Deactivated successfully. Mar 20 21:34:59.592931 systemd[1]: var-lib-kubelet-pods-5817264a\x2d2fc1\x2d4344\x2d970b\x2df88ce7a77bd6-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Mar 20 21:34:59.593025 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d-shm.mount: Deactivated successfully. Mar 20 21:34:59.593136 systemd[1]: var-lib-kubelet-pods-5817264a\x2d2fc1\x2d4344\x2d970b\x2df88ce7a77bd6-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. Mar 20 21:34:59.593226 systemd[1]: var-lib-kubelet-pods-5817264a\x2d2fc1\x2d4344\x2d970b\x2df88ce7a77bd6-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dsv8fz.mount: Deactivated successfully. Mar 20 21:34:59.604246 systemd[1]: Started cri-containerd-73ed3a594809ab79ee3c06e7a045de7af47f84a0b16361c68819e7786dc1a0f7.scope - libcontainer container 73ed3a594809ab79ee3c06e7a045de7af47f84a0b16361c68819e7786dc1a0f7. 
Mar 20 21:34:59.634371 containerd[1556]: time="2025-03-20T21:34:59.634335925Z" level=info msg="StartContainer for \"73ed3a594809ab79ee3c06e7a045de7af47f84a0b16361c68819e7786dc1a0f7\" returns successfully" Mar 20 21:34:59.863272 containerd[1556]: time="2025-03-20T21:34:59.863219194Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:34:59.863843 containerd[1556]: time="2025-03-20T21:34:59.863492442Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 20 21:34:59.864732 containerd[1556]: time="2025-03-20T21:34:59.864719501Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 445.312755ms" Mar 20 21:34:59.864759 containerd[1556]: time="2025-03-20T21:34:59.864737193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 20 21:34:59.865707 containerd[1556]: time="2025-03-20T21:34:59.865582541Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 20 21:34:59.868026 containerd[1556]: time="2025-03-20T21:34:59.867254253Z" level=info msg="CreateContainer within sandbox \"020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 20 21:34:59.951438 containerd[1556]: time="2025-03-20T21:34:59.951403934Z" level=info msg="Container ae5e22cb252dcda48f80536559066d711de9c86778b09208925723d465bee43e: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:34:59.983272 
containerd[1556]: time="2025-03-20T21:34:59.983159995Z" level=info msg="CreateContainer within sandbox \"020d078f2590d1c71a90b95340f5345aa1fd8bb940877462aa96bcb08e87509d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ae5e22cb252dcda48f80536559066d711de9c86778b09208925723d465bee43e\"" Mar 20 21:34:59.983580 containerd[1556]: time="2025-03-20T21:34:59.983457120Z" level=info msg="StartContainer for \"ae5e22cb252dcda48f80536559066d711de9c86778b09208925723d465bee43e\"" Mar 20 21:34:59.984780 containerd[1556]: time="2025-03-20T21:34:59.984547393Z" level=info msg="connecting to shim ae5e22cb252dcda48f80536559066d711de9c86778b09208925723d465bee43e" address="unix:///run/containerd/s/8fd139ae5e37b3503f64e2b19890fa357f697a9000935a0288a72341c692a1e0" protocol=ttrpc version=3 Mar 20 21:35:00.005351 systemd[1]: Started cri-containerd-ae5e22cb252dcda48f80536559066d711de9c86778b09208925723d465bee43e.scope - libcontainer container ae5e22cb252dcda48f80536559066d711de9c86778b09208925723d465bee43e. 
Mar 20 21:35:00.046423 containerd[1556]: time="2025-03-20T21:35:00.046384981Z" level=info msg="StartContainer for \"ae5e22cb252dcda48f80536559066d711de9c86778b09208925723d465bee43e\" returns successfully" Mar 20 21:35:00.404538 kubelet[2829]: I0320 21:35:00.404328 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7fbc6f44f-78sgc" podStartSLOduration=28.74731209 podStartE2EDuration="34.4043127s" podCreationTimestamp="2025-03-20 21:34:26 +0000 UTC" firstStartedPulling="2025-03-20 21:34:54.208292666 +0000 UTC m=+41.373924527" lastFinishedPulling="2025-03-20 21:34:59.86529327 +0000 UTC m=+47.030925137" observedRunningTime="2025-03-20 21:35:00.404117448 +0000 UTC m=+47.569749318" watchObservedRunningTime="2025-03-20 21:35:00.4043127 +0000 UTC m=+47.569944558" Mar 20 21:35:00.934654 kubelet[2829]: I0320 21:35:00.934613 2829 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392fe464-00a3-419d-9a63-19dc8518fdfc" path="/var/lib/kubelet/pods/392fe464-00a3-419d-9a63-19dc8518fdfc/volumes" Mar 20 21:35:00.937279 kubelet[2829]: I0320 21:35:00.937244 2829 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db47d74-c752-4fcf-8454-2fdc29ed4c0f" path="/var/lib/kubelet/pods/3db47d74-c752-4fcf-8454-2fdc29ed4c0f/volumes" Mar 20 21:35:00.937914 kubelet[2829]: I0320 21:35:00.937883 2829 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5817264a-2fc1-4344-970b-f88ce7a77bd6" path="/var/lib/kubelet/pods/5817264a-2fc1-4344-970b-f88ce7a77bd6/volumes" Mar 20 21:35:01.180514 kubelet[2829]: E0320 21:35:01.180479 2829 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="3db47d74-c752-4fcf-8454-2fdc29ed4c0f" containerName="calico-kube-controllers" Mar 20 21:35:01.180514 kubelet[2829]: E0320 21:35:01.180509 2829 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="5817264a-2fc1-4344-970b-f88ce7a77bd6" containerName="calico-typha" Mar 20 21:35:01.181914 
kubelet[2829]: I0320 21:35:01.180529 2829 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db47d74-c752-4fcf-8454-2fdc29ed4c0f" containerName="calico-kube-controllers" Mar 20 21:35:01.181914 kubelet[2829]: I0320 21:35:01.180535 2829 memory_manager.go:354] "RemoveStaleState removing state" podUID="5817264a-2fc1-4344-970b-f88ce7a77bd6" containerName="calico-typha" Mar 20 21:35:01.187196 systemd[1]: Created slice kubepods-besteffort-pod4402f0cf_9c84_40b2_a266_771a2a1c4e9c.slice - libcontainer container kubepods-besteffort-pod4402f0cf_9c84_40b2_a266_771a2a1c4e9c.slice. Mar 20 21:35:01.266068 kubelet[2829]: I0320 21:35:01.266035 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4402f0cf-9c84-40b2-a266-771a2a1c4e9c-typha-certs\") pod \"calico-typha-6dcdf7db96-6ghgs\" (UID: \"4402f0cf-9c84-40b2-a266-771a2a1c4e9c\") " pod="calico-system/calico-typha-6dcdf7db96-6ghgs" Mar 20 21:35:01.266068 kubelet[2829]: I0320 21:35:01.266068 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4402f0cf-9c84-40b2-a266-771a2a1c4e9c-tigera-ca-bundle\") pod \"calico-typha-6dcdf7db96-6ghgs\" (UID: \"4402f0cf-9c84-40b2-a266-771a2a1c4e9c\") " pod="calico-system/calico-typha-6dcdf7db96-6ghgs" Mar 20 21:35:01.266190 kubelet[2829]: I0320 21:35:01.266084 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjwfg\" (UniqueName: \"kubernetes.io/projected/4402f0cf-9c84-40b2-a266-771a2a1c4e9c-kube-api-access-sjwfg\") pod \"calico-typha-6dcdf7db96-6ghgs\" (UID: \"4402f0cf-9c84-40b2-a266-771a2a1c4e9c\") " pod="calico-system/calico-typha-6dcdf7db96-6ghgs" Mar 20 21:35:01.513400 containerd[1556]: time="2025-03-20T21:35:01.513249665Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-typha-6dcdf7db96-6ghgs,Uid:4402f0cf-9c84-40b2-a266-771a2a1c4e9c,Namespace:calico-system,Attempt:0,}" Mar 20 21:35:01.854963 containerd[1556]: time="2025-03-20T21:35:01.854807445Z" level=info msg="connecting to shim 20aa5f83b7dc461776d7fa4add6a0a6a00de40181f5505b1dc01cd85ea1d3842" address="unix:///run/containerd/s/47135e845623cc128f51f14fb2bef55a9e284f2db6fe4c37339882cb8e082244" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:35:01.887614 systemd[1]: Started cri-containerd-20aa5f83b7dc461776d7fa4add6a0a6a00de40181f5505b1dc01cd85ea1d3842.scope - libcontainer container 20aa5f83b7dc461776d7fa4add6a0a6a00de40181f5505b1dc01cd85ea1d3842. Mar 20 21:35:01.936293 containerd[1556]: time="2025-03-20T21:35:01.936260915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6dcdf7db96-6ghgs,Uid:4402f0cf-9c84-40b2-a266-771a2a1c4e9c,Namespace:calico-system,Attempt:0,} returns sandbox id \"20aa5f83b7dc461776d7fa4add6a0a6a00de40181f5505b1dc01cd85ea1d3842\"" Mar 20 21:35:01.957549 containerd[1556]: time="2025-03-20T21:35:01.957521618Z" level=info msg="CreateContainer within sandbox \"20aa5f83b7dc461776d7fa4add6a0a6a00de40181f5505b1dc01cd85ea1d3842\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 20 21:35:01.965890 containerd[1556]: time="2025-03-20T21:35:01.965864684Z" level=info msg="Container 4a6cceae6b88ff15335e2786c2ff21f0f319d848907ef4bb21394a9ea9345a4d: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:35:01.970604 containerd[1556]: time="2025-03-20T21:35:01.970530857Z" level=info msg="CreateContainer within sandbox \"20aa5f83b7dc461776d7fa4add6a0a6a00de40181f5505b1dc01cd85ea1d3842\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"4a6cceae6b88ff15335e2786c2ff21f0f319d848907ef4bb21394a9ea9345a4d\"" Mar 20 21:35:01.972071 containerd[1556]: time="2025-03-20T21:35:01.971107015Z" level=info msg="StartContainer for 
\"4a6cceae6b88ff15335e2786c2ff21f0f319d848907ef4bb21394a9ea9345a4d\"" Mar 20 21:35:01.972754 containerd[1556]: time="2025-03-20T21:35:01.972178761Z" level=info msg="connecting to shim 4a6cceae6b88ff15335e2786c2ff21f0f319d848907ef4bb21394a9ea9345a4d" address="unix:///run/containerd/s/47135e845623cc128f51f14fb2bef55a9e284f2db6fe4c37339882cb8e082244" protocol=ttrpc version=3 Mar 20 21:35:01.989300 systemd[1]: Started cri-containerd-4a6cceae6b88ff15335e2786c2ff21f0f319d848907ef4bb21394a9ea9345a4d.scope - libcontainer container 4a6cceae6b88ff15335e2786c2ff21f0f319d848907ef4bb21394a9ea9345a4d. Mar 20 21:35:02.040576 containerd[1556]: time="2025-03-20T21:35:02.040536779Z" level=info msg="StartContainer for \"4a6cceae6b88ff15335e2786c2ff21f0f319d848907ef4bb21394a9ea9345a4d\" returns successfully" Mar 20 21:35:02.534528 containerd[1556]: time="2025-03-20T21:35:02.534488975Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:35:02.534977 containerd[1556]: time="2025-03-20T21:35:02.534942951Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843" Mar 20 21:35:02.539481 containerd[1556]: time="2025-03-20T21:35:02.539464916Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.673861047s" Mar 20 21:35:02.539619 containerd[1556]: time="2025-03-20T21:35:02.539559887Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\"" 
Mar 20 21:35:02.547999 containerd[1556]: time="2025-03-20T21:35:02.547754624Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:35:02.549146 containerd[1556]: time="2025-03-20T21:35:02.548149841Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 21:35:02.550013 containerd[1556]: time="2025-03-20T21:35:02.549869870Z" level=info msg="CreateContainer within sandbox \"7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 20 21:35:02.558896 containerd[1556]: time="2025-03-20T21:35:02.555777263Z" level=info msg="Container 60d88617bd988b2eaedcb430e183364acf6596c172ef2c7a593f8790b5849cce: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:35:02.563992 containerd[1556]: time="2025-03-20T21:35:02.563302360Z" level=info msg="CreateContainer within sandbox \"7a6ab6e8e9cd3d9b2e530d90d728b3e763ab640b166aab6631549605676f471f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"60d88617bd988b2eaedcb430e183364acf6596c172ef2c7a593f8790b5849cce\"" Mar 20 21:35:02.563992 containerd[1556]: time="2025-03-20T21:35:02.563665332Z" level=info msg="StartContainer for \"60d88617bd988b2eaedcb430e183364acf6596c172ef2c7a593f8790b5849cce\"" Mar 20 21:35:02.570462 containerd[1556]: time="2025-03-20T21:35:02.570383183Z" level=info msg="connecting to shim 60d88617bd988b2eaedcb430e183364acf6596c172ef2c7a593f8790b5849cce" address="unix:///run/containerd/s/4637be4c6d1b869a891b7c875d401397de533e0992186a16a279faa6bba0b276" protocol=ttrpc version=3 Mar 20 21:35:02.605296 systemd[1]: Started 
cri-containerd-60d88617bd988b2eaedcb430e183364acf6596c172ef2c7a593f8790b5849cce.scope - libcontainer container 60d88617bd988b2eaedcb430e183364acf6596c172ef2c7a593f8790b5849cce. Mar 20 21:35:02.646813 systemd[1]: cri-containerd-5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7.scope: Deactivated successfully. Mar 20 21:35:02.646979 systemd[1]: cri-containerd-5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7.scope: Consumed 619ms CPU time, 245.5M memory peak, 294.1M read from disk. Mar 20 21:35:02.679004 containerd[1556]: time="2025-03-20T21:35:02.650630155Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7\" id:\"5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7\" pid:5063 exited_at:{seconds:1742506502 nanos:650207568}" Mar 20 21:35:02.679004 containerd[1556]: time="2025-03-20T21:35:02.651392897Z" level=info msg="received exit event container_id:\"5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7\" id:\"5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7\" pid:5063 exited_at:{seconds:1742506502 nanos:650207568}" Mar 20 21:35:02.679004 containerd[1556]: time="2025-03-20T21:35:02.655981865Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: failed to load CNI config list file /etc/cni/net.d/10-calico.conflist: error parsing configuration list: unexpected end of JSON input: invalid cni config: failed to load cni config" Mar 20 21:35:02.706593 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5504212c637ec6d1dc86f5282be8fe0fdd4c154eb4847a04836488e40e88a9e7-rootfs.mount: Deactivated successfully. 
Mar 20 21:35:02.735758 containerd[1556]: time="2025-03-20T21:35:02.735685221Z" level=info msg="StartContainer for \"60d88617bd988b2eaedcb430e183364acf6596c172ef2c7a593f8790b5849cce\" returns successfully" Mar 20 21:35:03.334466 kubelet[2829]: I0320 21:35:03.332786 2829 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 20 21:35:03.339071 kubelet[2829]: I0320 21:35:03.338970 2829 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 20 21:35:03.555187 containerd[1556]: time="2025-03-20T21:35:03.555161393Z" level=info msg="CreateContainer within sandbox \"7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 20 21:35:03.566316 kubelet[2829]: I0320 21:35:03.566239 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6dcdf7db96-6ghgs" podStartSLOduration=6.566218203 podStartE2EDuration="6.566218203s" podCreationTimestamp="2025-03-20 21:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:35:02.432854566 +0000 UTC m=+49.598486434" watchObservedRunningTime="2025-03-20 21:35:03.566218203 +0000 UTC m=+50.731850065" Mar 20 21:35:03.576197 containerd[1556]: time="2025-03-20T21:35:03.575952573Z" level=info msg="Container 05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:35:03.582005 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4191637846.mount: Deactivated successfully. 
Mar 20 21:35:03.593878 containerd[1556]: time="2025-03-20T21:35:03.592952057Z" level=info msg="CreateContainer within sandbox \"7a53d5095a5e400a8477b8a2ddbce8bd0542cd5b8052dd18c803a15056dfb8e1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05\"" Mar 20 21:35:03.596694 containerd[1556]: time="2025-03-20T21:35:03.596428304Z" level=info msg="StartContainer for \"05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05\"" Mar 20 21:35:03.600831 kubelet[2829]: I0320 21:35:03.600055 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-7zx9v" podStartSLOduration=28.223219635 podStartE2EDuration="38.600033643s" podCreationTimestamp="2025-03-20 21:34:25 +0000 UTC" firstStartedPulling="2025-03-20 21:34:52.17186137 +0000 UTC m=+39.337493235" lastFinishedPulling="2025-03-20 21:35:02.548675383 +0000 UTC m=+49.714307243" observedRunningTime="2025-03-20 21:35:03.590875444 +0000 UTC m=+50.756507313" watchObservedRunningTime="2025-03-20 21:35:03.600033643 +0000 UTC m=+50.765665506" Mar 20 21:35:03.603986 containerd[1556]: time="2025-03-20T21:35:03.603877742Z" level=info msg="connecting to shim 05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05" address="unix:///run/containerd/s/47dec1499c8834f2ef316644e4a9bfb9654195c1af19cd3150bd92f51acaae2e" protocol=ttrpc version=3 Mar 20 21:35:03.633241 systemd[1]: Started cri-containerd-05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05.scope - libcontainer container 05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05. 
Mar 20 21:35:03.681110 containerd[1556]: time="2025-03-20T21:35:03.681083731Z" level=info msg="StartContainer for \"05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05\" returns successfully" Mar 20 21:35:04.001061 systemd[1]: Created slice kubepods-besteffort-pod7fe92e62_aa86_4d9f_8b55_9b84fb30a8de.slice - libcontainer container kubepods-besteffort-pod7fe92e62_aa86_4d9f_8b55_9b84fb30a8de.slice. Mar 20 21:35:04.015716 kubelet[2829]: I0320 21:35:04.015144 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fe92e62-aa86-4d9f-8b55-9b84fb30a8de-tigera-ca-bundle\") pod \"calico-kube-controllers-6fcbcdd44f-2nvlk\" (UID: \"7fe92e62-aa86-4d9f-8b55-9b84fb30a8de\") " pod="calico-system/calico-kube-controllers-6fcbcdd44f-2nvlk" Mar 20 21:35:04.016377 kubelet[2829]: I0320 21:35:04.015724 2829 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jznbs\" (UniqueName: \"kubernetes.io/projected/7fe92e62-aa86-4d9f-8b55-9b84fb30a8de-kube-api-access-jznbs\") pod \"calico-kube-controllers-6fcbcdd44f-2nvlk\" (UID: \"7fe92e62-aa86-4d9f-8b55-9b84fb30a8de\") " pod="calico-system/calico-kube-controllers-6fcbcdd44f-2nvlk" Mar 20 21:35:04.305203 containerd[1556]: time="2025-03-20T21:35:04.305128053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcbcdd44f-2nvlk,Uid:7fe92e62-aa86-4d9f-8b55-9b84fb30a8de,Namespace:calico-system,Attempt:0,}" Mar 20 21:35:04.437253 systemd-networkd[1363]: caliaf8c0f4b957: Link UP Mar 20 21:35:04.438018 systemd-networkd[1363]: caliaf8c0f4b957: Gained carrier Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.372 [INFO][5338] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0 calico-kube-controllers-6fcbcdd44f- 
calico-system 7fe92e62-aa86-4d9f-8b55-9b84fb30a8de 995 0 2025-03-20 21:34:59 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6fcbcdd44f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6fcbcdd44f-2nvlk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] caliaf8c0f4b957 [] []}} ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Namespace="calico-system" Pod="calico-kube-controllers-6fcbcdd44f-2nvlk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.380 [INFO][5338] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Namespace="calico-system" Pod="calico-kube-controllers-6fcbcdd44f-2nvlk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.400 [INFO][5349] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" HandleID="k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Workload="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.406 [INFO][5349] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" HandleID="k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Workload="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000291480), 
Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6fcbcdd44f-2nvlk", "timestamp":"2025-03-20 21:35:04.400174745 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.406 [INFO][5349] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.407 [INFO][5349] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.407 [INFO][5349] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.408 [INFO][5349] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" host="localhost" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.410 [INFO][5349] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.412 [INFO][5349] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.414 [INFO][5349] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.415 [INFO][5349] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.415 [INFO][5349] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" host="localhost" Mar 20 
21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.416 [INFO][5349] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.418 [INFO][5349] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" host="localhost" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.424 [INFO][5349] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" host="localhost" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.424 [INFO][5349] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" host="localhost" Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.424 [INFO][5349] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 21:35:04.453927 containerd[1556]: 2025-03-20 21:35:04.424 [INFO][5349] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" HandleID="k8s-pod-network.f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Workload="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" Mar 20 21:35:04.455311 containerd[1556]: 2025-03-20 21:35:04.426 [INFO][5338] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Namespace="calico-system" Pod="calico-kube-controllers-6fcbcdd44f-2nvlk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0", GenerateName:"calico-kube-controllers-6fcbcdd44f-", Namespace:"calico-system", SelfLink:"", UID:"7fe92e62-aa86-4d9f-8b55-9b84fb30a8de", ResourceVersion:"995", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fcbcdd44f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6fcbcdd44f-2nvlk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliaf8c0f4b957", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:35:04.455311 containerd[1556]: 2025-03-20 21:35:04.427 [INFO][5338] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.135/32] ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Namespace="calico-system" Pod="calico-kube-controllers-6fcbcdd44f-2nvlk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" Mar 20 21:35:04.455311 containerd[1556]: 2025-03-20 21:35:04.427 [INFO][5338] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaf8c0f4b957 ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Namespace="calico-system" Pod="calico-kube-controllers-6fcbcdd44f-2nvlk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" Mar 20 21:35:04.455311 containerd[1556]: 2025-03-20 21:35:04.437 [INFO][5338] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Namespace="calico-system" Pod="calico-kube-controllers-6fcbcdd44f-2nvlk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" Mar 20 21:35:04.455311 containerd[1556]: 2025-03-20 21:35:04.438 [INFO][5338] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Namespace="calico-system" Pod="calico-kube-controllers-6fcbcdd44f-2nvlk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0", GenerateName:"calico-kube-controllers-6fcbcdd44f-", Namespace:"calico-system", SelfLink:"", UID:"7fe92e62-aa86-4d9f-8b55-9b84fb30a8de", ResourceVersion:"995", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 21, 34, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fcbcdd44f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a", Pod:"calico-kube-controllers-6fcbcdd44f-2nvlk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliaf8c0f4b957", MAC:"aa:7b:dc:4d:ae:3b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 21:35:04.455311 containerd[1556]: 2025-03-20 21:35:04.447 [INFO][5338] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" Namespace="calico-system" Pod="calico-kube-controllers-6fcbcdd44f-2nvlk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fcbcdd44f--2nvlk-eth0" Mar 20 21:35:04.578173 containerd[1556]: time="2025-03-20T21:35:04.577724173Z" level=info msg="connecting to shim 
f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a" address="unix:///run/containerd/s/a9633bb98a11334cdb0bcdce28d00747778f06859eb8abb8d0c74b09bab657fd" namespace=k8s.io protocol=ttrpc version=3 Mar 20 21:35:04.620389 systemd[1]: Started cri-containerd-f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a.scope - libcontainer container f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a. Mar 20 21:35:04.644203 systemd-resolved[1472]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 21:35:04.651840 kubelet[2829]: I0320 21:35:04.651461 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9kl56" podStartSLOduration=6.651448605 podStartE2EDuration="6.651448605s" podCreationTimestamp="2025-03-20 21:34:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:35:04.649805855 +0000 UTC m=+51.815437724" watchObservedRunningTime="2025-03-20 21:35:04.651448605 +0000 UTC m=+51.817080470" Mar 20 21:35:04.706205 containerd[1556]: time="2025-03-20T21:35:04.706179903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcbcdd44f-2nvlk,Uid:7fe92e62-aa86-4d9f-8b55-9b84fb30a8de,Namespace:calico-system,Attempt:0,} returns sandbox id \"f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a\"" Mar 20 21:35:04.727947 containerd[1556]: time="2025-03-20T21:35:04.727920281Z" level=info msg="CreateContainer within sandbox \"f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 20 21:35:04.789205 containerd[1556]: time="2025-03-20T21:35:04.789176401Z" level=info msg="Container 77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8: CDI devices from CRI Config.CDIDevices: []" Mar 20 21:35:04.822075 containerd[1556]: 
time="2025-03-20T21:35:04.821350053Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05\" id:\"e3a4d3366b980d73282e58f2f1d0717c8921b452c52e84eaac35a70042be1842\" pid:5423 exit_status:1 exited_at:{seconds:1742506504 nanos:809992635}" Mar 20 21:35:04.854552 containerd[1556]: time="2025-03-20T21:35:04.854490490Z" level=info msg="CreateContainer within sandbox \"f0205ee3d1549fb73de686a21203ee1c820c72ba485ccf5f02aca4b1879cb66a\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\"" Mar 20 21:35:04.857958 containerd[1556]: time="2025-03-20T21:35:04.857940769Z" level=info msg="StartContainer for \"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\"" Mar 20 21:35:04.858549 containerd[1556]: time="2025-03-20T21:35:04.858523769Z" level=info msg="connecting to shim 77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8" address="unix:///run/containerd/s/a9633bb98a11334cdb0bcdce28d00747778f06859eb8abb8d0c74b09bab657fd" protocol=ttrpc version=3 Mar 20 21:35:04.872157 systemd[1]: Started cri-containerd-77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8.scope - libcontainer container 77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8. Mar 20 21:35:04.915990 containerd[1556]: time="2025-03-20T21:35:04.915918578Z" level=info msg="StartContainer for \"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\" returns successfully" Mar 20 21:35:05.562123 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3977429750.mount: Deactivated successfully. 
Mar 20 21:35:05.599637 systemd-networkd[1363]: caliaf8c0f4b957: Gained IPv6LL
Mar 20 21:35:05.935464 containerd[1556]: time="2025-03-20T21:35:05.935416150Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05\" id:\"87d763104e2a0b2d5e1041fc5522c355ed2e52ceaef6a4c75c4975d67607bd7e\" pid:5672 exit_status:1 exited_at:{seconds:1742506505 nanos:935077824}"
Mar 20 21:35:05.954602 containerd[1556]: time="2025-03-20T21:35:05.954545201Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\" id:\"126ca100f8411f59498c988ade41c936f45a72ae4ccde9dd8bb86f3c081f0954\" pid:5671 exit_status:1 exited_at:{seconds:1742506505 nanos:954312718}"
Mar 20 21:35:06.772150 containerd[1556]: time="2025-03-20T21:35:06.772109600Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\" id:\"1963913a6d08733fba3afe2ab3cb9d0976e5a062e4e4a535838cd03676cff1bb\" pid:5730 exited_at:{seconds:1742506506 nanos:771723446}"
Mar 20 21:35:06.790506 kubelet[2829]: I0320 21:35:06.790262 2829 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6fcbcdd44f-2nvlk" podStartSLOduration=7.784352737 podStartE2EDuration="7.784352737s" podCreationTimestamp="2025-03-20 21:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 21:35:05.882055236 +0000 UTC m=+53.047687095" watchObservedRunningTime="2025-03-20 21:35:06.784352737 +0000 UTC m=+53.949984601"
Mar 20 21:35:12.996030 containerd[1556]: time="2025-03-20T21:35:12.995898770Z" level=info msg="StopPodSandbox for \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\""
Mar 20 21:35:13.018223 containerd[1556]: time="2025-03-20T21:35:13.018054598Z" level=info msg="TearDown network for sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" successfully"
Mar 20 21:35:13.018223 containerd[1556]: time="2025-03-20T21:35:13.018076386Z" level=info msg="StopPodSandbox for \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" returns successfully"
Mar 20 21:35:13.093390 containerd[1556]: time="2025-03-20T21:35:13.093264879Z" level=info msg="RemovePodSandbox for \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\""
Mar 20 21:35:13.096502 containerd[1556]: time="2025-03-20T21:35:13.096484482Z" level=info msg="Forcibly stopping sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\""
Mar 20 21:35:13.096578 containerd[1556]: time="2025-03-20T21:35:13.096563634Z" level=info msg="TearDown network for sandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" successfully"
Mar 20 21:35:13.106473 containerd[1556]: time="2025-03-20T21:35:13.106455886Z" level=info msg="Ensure that sandbox e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653 in task-service has been cleanup successfully"
Mar 20 21:35:13.107986 containerd[1556]: time="2025-03-20T21:35:13.107971309Z" level=info msg="RemovePodSandbox \"e0844a884b957c9c243c5c121e4ae7b90af59c5127463ba470206a8f84976653\" returns successfully"
Mar 20 21:35:13.112397 containerd[1556]: time="2025-03-20T21:35:13.112381709Z" level=info msg="StopPodSandbox for \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\""
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.278 [WARNING][5764] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0"
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.279 [INFO][5764] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840"
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.279 [INFO][5764] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" iface="eth0" netns=""
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.279 [INFO][5764] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840"
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.279 [INFO][5764] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840"
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.483 [INFO][5771] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0"
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.485 [INFO][5771] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.486 [INFO][5771] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.557 [WARNING][5771] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0"
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.557 [INFO][5771] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0"
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.558 [INFO][5771] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 21:35:13.561430 containerd[1556]: 2025-03-20 21:35:13.560 [INFO][5764] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840"
Mar 20 21:35:13.564530 containerd[1556]: time="2025-03-20T21:35:13.561457358Z" level=info msg="TearDown network for sandbox \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" successfully"
Mar 20 21:35:13.564530 containerd[1556]: time="2025-03-20T21:35:13.561479351Z" level=info msg="StopPodSandbox for \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" returns successfully"
Mar 20 21:35:13.564530 containerd[1556]: time="2025-03-20T21:35:13.562068353Z" level=info msg="RemovePodSandbox for \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\""
Mar 20 21:35:13.564530 containerd[1556]: time="2025-03-20T21:35:13.562089420Z" level=info msg="Forcibly stopping sandbox \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\""
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.593 [WARNING][5790] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0"
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.593 [INFO][5790] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840"
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.593 [INFO][5790] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" iface="eth0" netns=""
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.593 [INFO][5790] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840"
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.593 [INFO][5790] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840"
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.611 [INFO][5797] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0"
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.612 [INFO][5797] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.612 [INFO][5797] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.615 [WARNING][5797] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0"
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.615 [INFO][5797] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" HandleID="k8s-pod-network.4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840" Workload="localhost-k8s-calico--kube--controllers--57c7f44ccb--nf8xh-eth0"
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.616 [INFO][5797] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 21:35:13.618885 containerd[1556]: 2025-03-20 21:35:13.617 [INFO][5790] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840"
Mar 20 21:35:13.620492 containerd[1556]: time="2025-03-20T21:35:13.618916369Z" level=info msg="TearDown network for sandbox \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" successfully"
Mar 20 21:35:13.620970 containerd[1556]: time="2025-03-20T21:35:13.620517581Z" level=info msg="Ensure that sandbox 4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840 in task-service has been cleanup successfully"
Mar 20 21:35:13.621628 containerd[1556]: time="2025-03-20T21:35:13.621612344Z" level=info msg="RemovePodSandbox \"4d770be70e3264854f1f3b7f8ead6a0a71c8f28e332f98458e9477f0eff45840\" returns successfully"
Mar 20 21:35:13.621950 containerd[1556]: time="2025-03-20T21:35:13.621939638Z" level=info msg="StopPodSandbox for \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\""
Mar 20 21:35:13.622142 containerd[1556]: time="2025-03-20T21:35:13.622103264Z" level=info msg="TearDown network for sandbox \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" successfully"
Mar 20 21:35:13.622217 containerd[1556]: time="2025-03-20T21:35:13.622114362Z" level=info msg="StopPodSandbox for \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" returns successfully"
Mar 20 21:35:13.622386 containerd[1556]: time="2025-03-20T21:35:13.622371577Z" level=info msg="RemovePodSandbox for \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\""
Mar 20 21:35:13.622433 containerd[1556]: time="2025-03-20T21:35:13.622422376Z" level=info msg="Forcibly stopping sandbox \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\""
Mar 20 21:35:13.622487 containerd[1556]: time="2025-03-20T21:35:13.622473973Z" level=info msg="TearDown network for sandbox \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" successfully"
Mar 20 21:35:13.623464 containerd[1556]: time="2025-03-20T21:35:13.623450477Z" level=info msg="Ensure that sandbox f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d in task-service has been cleanup successfully"
Mar 20 21:35:13.624448 containerd[1556]: time="2025-03-20T21:35:13.624429676Z" level=info msg="RemovePodSandbox \"f6574670a82e3aa3944df74b4aa9d129030e17ad013f2a646f3fa4577c94169d\" returns successfully"
Mar 20 21:35:15.480894 containerd[1556]: time="2025-03-20T21:35:15.480855616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\" id:\"0e6e2bd08317a7ddd69ba03bf262a5aa108161213c777013188da5a2319ff5b9\" pid:5815 exited_at:{seconds:1742506515 nanos:480700058}"
Mar 20 21:35:28.863930 containerd[1556]: time="2025-03-20T21:35:28.863894265Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05\" id:\"023edee625609bb94cf8701bdf5b6a9e29029a32ffd486f995c0e35b84ef6469\" pid:5858 exited_at:{seconds:1742506528 nanos:863412055}"
Mar 20 21:35:33.628942 systemd[1]: Started sshd@7-139.178.70.103:22-147.75.109.163:41758.service - OpenSSH per-connection server daemon (147.75.109.163:41758).
Mar 20 21:35:33.847944 sshd[5874]: Accepted publickey for core from 147.75.109.163 port 41758 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:35:33.850427 sshd-session[5874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:35:33.855284 systemd-logind[1540]: New session 10 of user core.
Mar 20 21:35:33.859158 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 20 21:35:34.368518 containerd[1556]: time="2025-03-20T21:35:34.358914803Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\" id:\"8c2087bc1637fc426476ae1f95f72d6fbb8c7a99e4a51c54c4675bd30aff598a\" pid:5907 exited_at:{seconds:1742506534 nanos:358559718}"
Mar 20 21:35:34.701214 sshd[5887]: Connection closed by 147.75.109.163 port 41758
Mar 20 21:35:34.701111 sshd-session[5874]: pam_unix(sshd:session): session closed for user core
Mar 20 21:35:34.711767 systemd[1]: sshd@7-139.178.70.103:22-147.75.109.163:41758.service: Deactivated successfully.
Mar 20 21:35:34.712850 systemd[1]: session-10.scope: Deactivated successfully.
Mar 20 21:35:34.713666 systemd-logind[1540]: Session 10 logged out. Waiting for processes to exit.
Mar 20 21:35:34.714934 systemd-logind[1540]: Removed session 10.
Mar 20 21:35:39.712708 systemd[1]: Started sshd@8-139.178.70.103:22-147.75.109.163:55910.service - OpenSSH per-connection server daemon (147.75.109.163:55910).
Mar 20 21:35:39.763982 sshd[5922]: Accepted publickey for core from 147.75.109.163 port 55910 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:35:39.764885 sshd-session[5922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:35:39.768264 systemd-logind[1540]: New session 11 of user core.
Mar 20 21:35:39.775139 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 20 21:35:39.907909 sshd[5924]: Connection closed by 147.75.109.163 port 55910
Mar 20 21:35:39.908343 sshd-session[5922]: pam_unix(sshd:session): session closed for user core
Mar 20 21:35:39.909974 systemd-logind[1540]: Session 11 logged out. Waiting for processes to exit.
Mar 20 21:35:39.910971 systemd[1]: sshd@8-139.178.70.103:22-147.75.109.163:55910.service: Deactivated successfully.
Mar 20 21:35:39.912302 systemd[1]: session-11.scope: Deactivated successfully.
Mar 20 21:35:39.913123 systemd-logind[1540]: Removed session 11.
Mar 20 21:35:44.917179 systemd[1]: Started sshd@9-139.178.70.103:22-147.75.109.163:39316.service - OpenSSH per-connection server daemon (147.75.109.163:39316).
Mar 20 21:35:45.562621 sshd[5937]: Accepted publickey for core from 147.75.109.163 port 39316 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:35:45.573638 sshd-session[5937]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:35:45.581484 systemd-logind[1540]: New session 12 of user core.
Mar 20 21:35:45.587164 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 20 21:35:46.443393 sshd[5939]: Connection closed by 147.75.109.163 port 39316
Mar 20 21:35:46.444749 sshd-session[5937]: pam_unix(sshd:session): session closed for user core
Mar 20 21:35:46.453109 systemd[1]: sshd@9-139.178.70.103:22-147.75.109.163:39316.service: Deactivated successfully.
Mar 20 21:35:46.454954 systemd[1]: session-12.scope: Deactivated successfully.
Mar 20 21:35:46.456809 systemd-logind[1540]: Session 12 logged out. Waiting for processes to exit.
Mar 20 21:35:46.458099 systemd[1]: Started sshd@10-139.178.70.103:22-147.75.109.163:39330.service - OpenSSH per-connection server daemon (147.75.109.163:39330).
Mar 20 21:35:46.459946 systemd-logind[1540]: Removed session 12.
Mar 20 21:35:46.497305 sshd[5960]: Accepted publickey for core from 147.75.109.163 port 39330 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:35:46.498118 sshd-session[5960]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:35:46.501908 systemd-logind[1540]: New session 13 of user core.
Mar 20 21:35:46.506176 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 20 21:35:46.676619 sshd[5963]: Connection closed by 147.75.109.163 port 39330
Mar 20 21:35:46.677357 sshd-session[5960]: pam_unix(sshd:session): session closed for user core
Mar 20 21:35:46.685865 systemd[1]: sshd@10-139.178.70.103:22-147.75.109.163:39330.service: Deactivated successfully.
Mar 20 21:35:46.689546 systemd[1]: session-13.scope: Deactivated successfully.
Mar 20 21:35:46.692915 systemd-logind[1540]: Session 13 logged out. Waiting for processes to exit.
Mar 20 21:35:46.696628 systemd[1]: Started sshd@11-139.178.70.103:22-147.75.109.163:39342.service - OpenSSH per-connection server daemon (147.75.109.163:39342).
Mar 20 21:35:46.700596 systemd-logind[1540]: Removed session 13.
Mar 20 21:35:46.744063 sshd[5972]: Accepted publickey for core from 147.75.109.163 port 39342 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:35:46.744942 sshd-session[5972]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:35:46.748792 systemd-logind[1540]: New session 14 of user core.
Mar 20 21:35:46.754178 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 20 21:35:46.933502 sshd[5975]: Connection closed by 147.75.109.163 port 39342
Mar 20 21:35:46.933870 sshd-session[5972]: pam_unix(sshd:session): session closed for user core
Mar 20 21:35:46.936252 systemd[1]: sshd@11-139.178.70.103:22-147.75.109.163:39342.service: Deactivated successfully.
Mar 20 21:35:46.937459 systemd[1]: session-14.scope: Deactivated successfully.
Mar 20 21:35:46.937900 systemd-logind[1540]: Session 14 logged out. Waiting for processes to exit.
Mar 20 21:35:46.938501 systemd-logind[1540]: Removed session 14.
Mar 20 21:35:51.945469 systemd[1]: Started sshd@12-139.178.70.103:22-147.75.109.163:39354.service - OpenSSH per-connection server daemon (147.75.109.163:39354).
Mar 20 21:35:52.027104 sshd[5989]: Accepted publickey for core from 147.75.109.163 port 39354 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:35:52.027968 sshd-session[5989]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:35:52.031540 systemd-logind[1540]: New session 15 of user core.
Mar 20 21:35:52.033179 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 20 21:35:52.626569 sshd[5991]: Connection closed by 147.75.109.163 port 39354
Mar 20 21:35:52.627652 sshd-session[5989]: pam_unix(sshd:session): session closed for user core
Mar 20 21:35:52.632546 systemd-logind[1540]: Session 15 logged out. Waiting for processes to exit.
Mar 20 21:35:52.633031 systemd[1]: sshd@12-139.178.70.103:22-147.75.109.163:39354.service: Deactivated successfully.
Mar 20 21:35:52.635165 systemd[1]: session-15.scope: Deactivated successfully.
Mar 20 21:35:52.636608 systemd-logind[1540]: Removed session 15.
Mar 20 21:35:57.640023 systemd[1]: Started sshd@13-139.178.70.103:22-147.75.109.163:60428.service - OpenSSH per-connection server daemon (147.75.109.163:60428).
Mar 20 21:35:57.692916 sshd[6008]: Accepted publickey for core from 147.75.109.163 port 60428 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:35:57.693937 sshd-session[6008]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:35:57.696739 systemd-logind[1540]: New session 16 of user core.
Mar 20 21:35:57.704129 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 20 21:35:57.888027 sshd[6010]: Connection closed by 147.75.109.163 port 60428
Mar 20 21:35:57.887354 sshd-session[6008]: pam_unix(sshd:session): session closed for user core
Mar 20 21:35:57.889943 systemd[1]: sshd@13-139.178.70.103:22-147.75.109.163:60428.service: Deactivated successfully.
Mar 20 21:35:57.891095 systemd[1]: session-16.scope: Deactivated successfully.
Mar 20 21:35:57.891631 systemd-logind[1540]: Session 16 logged out. Waiting for processes to exit.
Mar 20 21:35:57.892324 systemd-logind[1540]: Removed session 16.
Mar 20 21:35:58.947618 containerd[1556]: time="2025-03-20T21:35:58.947589148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05\" id:\"41d9dc3fccee96238dce30c433dffa97f0be823fa447168e3941150bb3a5af11\" pid:6033 exited_at:{seconds:1742506558 nanos:947305524}"
Mar 20 21:36:02.897061 systemd[1]: Started sshd@14-139.178.70.103:22-147.75.109.163:60438.service - OpenSSH per-connection server daemon (147.75.109.163:60438).
Mar 20 21:36:02.960788 sshd[6046]: Accepted publickey for core from 147.75.109.163 port 60438 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:02.961819 sshd-session[6046]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:02.964912 systemd-logind[1540]: New session 17 of user core.
Mar 20 21:36:02.969137 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 20 21:36:03.146595 sshd[6048]: Connection closed by 147.75.109.163 port 60438
Mar 20 21:36:03.146315 sshd-session[6046]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:03.148279 systemd[1]: sshd@14-139.178.70.103:22-147.75.109.163:60438.service: Deactivated successfully.
Mar 20 21:36:03.149574 systemd[1]: session-17.scope: Deactivated successfully.
Mar 20 21:36:03.150473 systemd-logind[1540]: Session 17 logged out. Waiting for processes to exit.
Mar 20 21:36:03.151438 systemd-logind[1540]: Removed session 17.
Mar 20 21:36:04.343935 containerd[1556]: time="2025-03-20T21:36:04.343906620Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\" id:\"86666e61c44740389056a2600673ec32e3153559c2da7ee294b5a8d08bc10036\" pid:6071 exited_at:{seconds:1742506564 nanos:343674020}"
Mar 20 21:36:08.156349 systemd[1]: Started sshd@15-139.178.70.103:22-147.75.109.163:33188.service - OpenSSH per-connection server daemon (147.75.109.163:33188).
Mar 20 21:36:08.199656 sshd[6081]: Accepted publickey for core from 147.75.109.163 port 33188 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:08.200525 sshd-session[6081]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:08.203643 systemd-logind[1540]: New session 18 of user core.
Mar 20 21:36:08.212143 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 20 21:36:08.406588 sshd[6083]: Connection closed by 147.75.109.163 port 33188
Mar 20 21:36:08.406963 sshd-session[6081]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:08.414563 systemd[1]: sshd@15-139.178.70.103:22-147.75.109.163:33188.service: Deactivated successfully.
Mar 20 21:36:08.415669 systemd[1]: session-18.scope: Deactivated successfully.
Mar 20 21:36:08.417079 systemd-logind[1540]: Session 18 logged out. Waiting for processes to exit.
Mar 20 21:36:08.418292 systemd[1]: Started sshd@16-139.178.70.103:22-147.75.109.163:33202.service - OpenSSH per-connection server daemon (147.75.109.163:33202).
Mar 20 21:36:08.419538 systemd-logind[1540]: Removed session 18.
Mar 20 21:36:08.455282 sshd[6093]: Accepted publickey for core from 147.75.109.163 port 33202 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:08.456098 sshd-session[6093]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:08.459812 systemd-logind[1540]: New session 19 of user core.
Mar 20 21:36:08.462145 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 20 21:36:09.078828 sshd[6096]: Connection closed by 147.75.109.163 port 33202
Mar 20 21:36:09.079935 sshd-session[6093]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:09.086741 systemd[1]: sshd@16-139.178.70.103:22-147.75.109.163:33202.service: Deactivated successfully.
Mar 20 21:36:09.088836 systemd[1]: session-19.scope: Deactivated successfully.
Mar 20 21:36:09.089469 systemd-logind[1540]: Session 19 logged out. Waiting for processes to exit.
Mar 20 21:36:09.091816 systemd[1]: Started sshd@17-139.178.70.103:22-147.75.109.163:33216.service - OpenSSH per-connection server daemon (147.75.109.163:33216).
Mar 20 21:36:09.092443 systemd-logind[1540]: Removed session 19.
Mar 20 21:36:09.224743 sshd[6112]: Accepted publickey for core from 147.75.109.163 port 33216 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:09.225598 sshd-session[6112]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:09.229427 systemd-logind[1540]: New session 20 of user core.
Mar 20 21:36:09.232152 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 20 21:36:10.917076 sshd[6115]: Connection closed by 147.75.109.163 port 33216
Mar 20 21:36:10.929243 sshd-session[6112]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:10.935561 systemd[1]: Started sshd@18-139.178.70.103:22-147.75.109.163:33220.service - OpenSSH per-connection server daemon (147.75.109.163:33220).
Mar 20 21:36:10.941697 systemd[1]: sshd@17-139.178.70.103:22-147.75.109.163:33216.service: Deactivated successfully.
Mar 20 21:36:10.943461 systemd[1]: session-20.scope: Deactivated successfully.
Mar 20 21:36:10.943689 systemd[1]: session-20.scope: Consumed 357ms CPU time, 68.4M memory peak.
Mar 20 21:36:10.944998 systemd-logind[1540]: Session 20 logged out. Waiting for processes to exit.
Mar 20 21:36:10.946769 systemd-logind[1540]: Removed session 20.
Mar 20 21:36:11.014004 sshd[6127]: Accepted publickey for core from 147.75.109.163 port 33220 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:11.015734 sshd-session[6127]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:11.020100 systemd-logind[1540]: New session 21 of user core.
Mar 20 21:36:11.026148 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 20 21:36:11.555204 sshd[6135]: Connection closed by 147.75.109.163 port 33220
Mar 20 21:36:11.556032 sshd-session[6127]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:11.563464 systemd[1]: sshd@18-139.178.70.103:22-147.75.109.163:33220.service: Deactivated successfully.
Mar 20 21:36:11.564673 systemd[1]: session-21.scope: Deactivated successfully.
Mar 20 21:36:11.565224 systemd-logind[1540]: Session 21 logged out. Waiting for processes to exit.
Mar 20 21:36:11.567000 systemd[1]: Started sshd@19-139.178.70.103:22-147.75.109.163:33226.service - OpenSSH per-connection server daemon (147.75.109.163:33226).
Mar 20 21:36:11.567735 systemd-logind[1540]: Removed session 21.
Mar 20 21:36:11.616537 sshd[6144]: Accepted publickey for core from 147.75.109.163 port 33226 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:11.617265 sshd-session[6144]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:11.620074 systemd-logind[1540]: New session 22 of user core.
Mar 20 21:36:11.623212 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 20 21:36:11.725000 sshd[6147]: Connection closed by 147.75.109.163 port 33226
Mar 20 21:36:11.726170 sshd-session[6144]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:11.727866 systemd-logind[1540]: Session 22 logged out. Waiting for processes to exit.
Mar 20 21:36:11.727966 systemd[1]: sshd@19-139.178.70.103:22-147.75.109.163:33226.service: Deactivated successfully.
Mar 20 21:36:11.729289 systemd[1]: session-22.scope: Deactivated successfully.
Mar 20 21:36:11.730753 systemd-logind[1540]: Removed session 22.
Mar 20 21:36:15.535297 containerd[1556]: time="2025-03-20T21:36:15.535239534Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77c5bf06973da8e6ae96bafedbfbb7cdb316bb4fe89f3b02c8d6e4acfd123be8\" id:\"cb78db824c1fe881333f29917b5b7a3562e34a120be1670eca7227fad42e67b3\" pid:6175 exited_at:{seconds:1742506575 nanos:535064478}"
Mar 20 21:36:16.740068 systemd[1]: Started sshd@20-139.178.70.103:22-147.75.109.163:52940.service - OpenSSH per-connection server daemon (147.75.109.163:52940).
Mar 20 21:36:16.825146 sshd[6187]: Accepted publickey for core from 147.75.109.163 port 52940 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:16.826259 sshd-session[6187]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:16.829863 systemd-logind[1540]: New session 23 of user core.
Mar 20 21:36:16.833125 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 20 21:36:16.984083 sshd[6189]: Connection closed by 147.75.109.163 port 52940
Mar 20 21:36:16.984875 sshd-session[6187]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:16.987380 systemd-logind[1540]: Session 23 logged out. Waiting for processes to exit.
Mar 20 21:36:16.987390 systemd[1]: sshd@20-139.178.70.103:22-147.75.109.163:52940.service: Deactivated successfully.
Mar 20 21:36:16.988605 systemd[1]: session-23.scope: Deactivated successfully.
Mar 20 21:36:16.989203 systemd-logind[1540]: Removed session 23.
Mar 20 21:36:21.996017 systemd[1]: Started sshd@21-139.178.70.103:22-147.75.109.163:52948.service - OpenSSH per-connection server daemon (147.75.109.163:52948).
Mar 20 21:36:22.061257 sshd[6205]: Accepted publickey for core from 147.75.109.163 port 52948 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:22.062190 sshd-session[6205]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:22.065909 systemd-logind[1540]: New session 24 of user core.
Mar 20 21:36:22.072130 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 20 21:36:22.217658 sshd[6207]: Connection closed by 147.75.109.163 port 52948
Mar 20 21:36:22.219327 sshd-session[6205]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:22.224890 systemd[1]: sshd@21-139.178.70.103:22-147.75.109.163:52948.service: Deactivated successfully.
Mar 20 21:36:22.226319 systemd[1]: session-24.scope: Deactivated successfully.
Mar 20 21:36:22.227550 systemd-logind[1540]: Session 24 logged out. Waiting for processes to exit.
Mar 20 21:36:22.228396 systemd-logind[1540]: Removed session 24.
Mar 20 21:36:27.228179 systemd[1]: Started sshd@22-139.178.70.103:22-147.75.109.163:54778.service - OpenSSH per-connection server daemon (147.75.109.163:54778).
Mar 20 21:36:27.280029 sshd[6226]: Accepted publickey for core from 147.75.109.163 port 54778 ssh2: RSA SHA256:c7W568SXGw3mx9bNYbzcFt+qBsal3QHXoYv14h3euLg
Mar 20 21:36:27.280810 sshd-session[6226]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 21:36:27.284977 systemd-logind[1540]: New session 25 of user core.
Mar 20 21:36:27.288136 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 20 21:36:27.405857 sshd[6228]: Connection closed by 147.75.109.163 port 54778
Mar 20 21:36:27.406099 sshd-session[6226]: pam_unix(sshd:session): session closed for user core
Mar 20 21:36:27.408751 systemd[1]: sshd@22-139.178.70.103:22-147.75.109.163:54778.service: Deactivated successfully.
Mar 20 21:36:27.410049 systemd[1]: session-25.scope: Deactivated successfully.
Mar 20 21:36:27.410640 systemd-logind[1540]: Session 25 logged out. Waiting for processes to exit.
Mar 20 21:36:27.411243 systemd-logind[1540]: Removed session 25.
Mar 20 21:36:28.932177 containerd[1556]: time="2025-03-20T21:36:28.932151000Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05db4c95029c4d21ef54b8573327d398cf41efcf6a405efd8fe96ccb56d20c05\" id:\"e39c7233ca6a6bbf88d48f17c3fdf081bdd74f9e0ef0d70cdc4068cec577f8f2\" pid:6250 exited_at:{seconds:1742506588 nanos:931936262}"