Jan 13 20:49:21.734953 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 19:01:45 -00 2025
Jan 13 20:49:21.734969 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07
Jan 13 20:49:21.734975 kernel: Disabled fast string operations
Jan 13 20:49:21.734979 kernel: BIOS-provided physical RAM map:
Jan 13 20:49:21.734983 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 13 20:49:21.734987 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 13 20:49:21.734993 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 13 20:49:21.734997 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 13 20:49:21.735001 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 13 20:49:21.735005 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 13 20:49:21.735010 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 13 20:49:21.735014 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 13 20:49:21.735018 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 13 20:49:21.735022 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 13 20:49:21.735028 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 13 20:49:21.735034 kernel: NX (Execute Disable) protection: active
Jan 13 20:49:21.735038 kernel: APIC: Static calls initialized
Jan 13 20:49:21.735043 kernel: SMBIOS 2.7 present.
Jan 13 20:49:21.735048 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 13 20:49:21.735053 kernel: vmware: hypercall mode: 0x00
Jan 13 20:49:21.735057 kernel: Hypervisor detected: VMware
Jan 13 20:49:21.735062 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 13 20:49:21.735068 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 13 20:49:21.735072 kernel: vmware: using clock offset of 2640272691 ns
Jan 13 20:49:21.735077 kernel: tsc: Detected 3408.000 MHz processor
Jan 13 20:49:21.735082 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 20:49:21.735088 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 20:49:21.735092 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 13 20:49:21.735097 kernel: total RAM covered: 3072M
Jan 13 20:49:21.735102 kernel: Found optimal setting for mtrr clean up
Jan 13 20:49:21.735109 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 13 20:49:21.735115 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 13 20:49:21.735120 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 20:49:21.735124 kernel: Using GB pages for direct mapping
Jan 13 20:49:21.735129 kernel: ACPI: Early table checksum verification disabled
Jan 13 20:49:21.735134 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 13 20:49:21.735139 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 13 20:49:21.735143 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 13 20:49:21.735148 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 13 20:49:21.735153 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:49:21.735161 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:49:21.735166 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 13 20:49:21.735171 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 13 20:49:21.735176 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 13 20:49:21.735181 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 13 20:49:21.735187 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 13 20:49:21.735193 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 13 20:49:21.735198 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 13 20:49:21.735203 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 13 20:49:21.735208 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:49:21.735213 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:49:21.735218 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 13 20:49:21.735223 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 13 20:49:21.735228 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 13 20:49:21.735233 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 13 20:49:21.735239 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 13 20:49:21.735244 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 13 20:49:21.735249 kernel: system APIC only can use physical flat
Jan 13 20:49:21.735254 kernel: APIC: Switched APIC routing to: physical flat
Jan 13 20:49:21.735259 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 20:49:21.735264 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 20:49:21.735269 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 20:49:21.735274 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 20:49:21.735279 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 20:49:21.735285 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 20:49:21.735290 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 20:49:21.735295 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 20:49:21.735300 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 13 20:49:21.735305 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 13 20:49:21.735309 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 13 20:49:21.735314 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 13 20:49:21.735319 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 13 20:49:21.735324 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 13 20:49:21.735329 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 13 20:49:21.735335 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 13 20:49:21.735340 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 13 20:49:21.735345 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 13 20:49:21.735350 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 13 20:49:21.735355 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 13 20:49:21.735360 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 13 20:49:21.735365 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 13 20:49:21.735370 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 13 20:49:21.735374 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 13 20:49:21.735379 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 13 20:49:21.735384 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 13 20:49:21.735390 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 13 20:49:21.735395 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 13 20:49:21.735400 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 13 20:49:21.735405 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 13 20:49:21.735410 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 13 20:49:21.735415 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 13 20:49:21.735420 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 13 20:49:21.735425 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 13 20:49:21.735430 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 13 20:49:21.735435 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 13 20:49:21.735441 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 13 20:49:21.735446 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 13 20:49:21.735451 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 13 20:49:21.735456 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 13 20:49:21.735461 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 13 20:49:21.735466 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 13 20:49:21.735471 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 13 20:49:21.735476 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 13 20:49:21.735481 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 13 20:49:21.735486 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 13 20:49:21.735491 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 13 20:49:21.735496 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 13 20:49:21.735501 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 13 20:49:21.735506 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 13 20:49:21.735511 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 13 20:49:21.735516 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 13 20:49:21.735521 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 13 20:49:21.735526 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 13 20:49:21.735531 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 13 20:49:21.735536 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 13 20:49:21.735542 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 13 20:49:21.735547 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 13 20:49:21.735552 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 13 20:49:21.735561 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 13 20:49:21.735566 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 13 20:49:21.735571 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 13 20:49:21.735576 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 13 20:49:21.735582 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 13 20:49:21.735587 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 13 20:49:21.735594 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 13 20:49:21.735599 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 13 20:49:21.735604 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 13 20:49:21.735610 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 13 20:49:21.735615 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 13 20:49:21.735620 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 13 20:49:21.735625 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 13 20:49:21.735630 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 13 20:49:21.735636 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 13 20:49:21.735641 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 13 20:49:21.735647 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 13 20:49:21.735653 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 13 20:49:21.735658 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 13 20:49:21.735663 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 13 20:49:21.735668 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 13 20:49:21.735674 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 13 20:49:21.735679 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 13 20:49:21.735684 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 13 20:49:21.735689 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 13 20:49:21.735695 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 13 20:49:21.735701 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 13 20:49:21.735706 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 13 20:49:21.735711 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 13 20:49:21.735716 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 13 20:49:21.735722 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 13 20:49:21.735727 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 13 20:49:21.735732 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 13 20:49:21.735737 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 13 20:49:21.735743 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 13 20:49:21.735748 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 13 20:49:21.735753 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 13 20:49:21.735759 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 13 20:49:21.735774 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 13 20:49:21.735780 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 13 20:49:21.735785 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 13 20:49:21.735790 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 13 20:49:21.735796 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 13 20:49:21.735801 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 13 20:49:21.735806 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 13 20:49:21.735812 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 13 20:49:21.735817 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 13 20:49:21.735824 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 13 20:49:21.735829 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 13 20:49:21.735835 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 13 20:49:21.735840 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 13 20:49:21.735845 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 13 20:49:21.735850 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 13 20:49:21.735856 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 13 20:49:21.735861 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 13 20:49:21.735866 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 13 20:49:21.735872 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 13 20:49:21.735878 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 13 20:49:21.735884 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 13 20:49:21.735889 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 13 20:49:21.735894 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 13 20:49:21.735900 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 13 20:49:21.735905 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 13 20:49:21.735910 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 13 20:49:21.735915 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 13 20:49:21.735920 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 13 20:49:21.735926 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 13 20:49:21.735932 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 13 20:49:21.735937 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 13 20:49:21.735943 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 13 20:49:21.735948 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 13 20:49:21.735953 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 13 20:49:21.735959 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 13 20:49:21.735965 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 13 20:49:21.735970 kernel: Zone ranges:
Jan 13 20:49:21.735976 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 13 20:49:21.735981 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 13 20:49:21.735987 kernel: Normal empty
Jan 13 20:49:21.735993 kernel: Movable zone start for each node
Jan 13 20:49:21.735998 kernel: Early memory node ranges
Jan 13 20:49:21.736004 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 13 20:49:21.736009 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 13 20:49:21.736014 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 13 20:49:21.736020 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 13 20:49:21.736025 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 13 20:49:21.736030 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 13 20:49:21.736037 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 13 20:49:21.736043 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 13 20:49:21.736048 kernel: system APIC only can use physical flat
Jan 13 20:49:21.736053 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 13 20:49:21.736059 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 13 20:49:21.736064 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jan 13 20:49:21.736069 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Jan 13 20:49:21.736075 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Jan 13 20:49:21.736080 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Jan 13 20:49:21.736085 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Jan 13 20:49:21.736092 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Jan 13 20:49:21.736097 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Jan 13 20:49:21.736103 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Jan 13 20:49:21.736108 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Jan 13 20:49:21.736113 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Jan 13 20:49:21.736119 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Jan 13 20:49:21.736124 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Jan 13 20:49:21.736130 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Jan 13 20:49:21.736135 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Jan 13 20:49:21.736141 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Jan 13 20:49:21.736147 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Jan 13 20:49:21.736152 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Jan 13 20:49:21.736157 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Jan 13 20:49:21.736163 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Jan 13 20:49:21.736168 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Jan 13 20:49:21.736173 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Jan 13 20:49:21.736179 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Jan 13 20:49:21.736184 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Jan 13 20:49:21.736189 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Jan 13 20:49:21.736196 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Jan 13 20:49:21.736201 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Jan 13 20:49:21.736206 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Jan 13 20:49:21.736212 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Jan 13 20:49:21.736217 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Jan 13 20:49:21.736222 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Jan 13 20:49:21.736228 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Jan 13 20:49:21.736233 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Jan 13 20:49:21.736239 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Jan 13 20:49:21.736244 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Jan 13 20:49:21.736250 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Jan 13 20:49:21.736256 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Jan 13 20:49:21.736261 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Jan 13 20:49:21.736267 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Jan 13 20:49:21.736272 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Jan 13 20:49:21.736277 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Jan 13 20:49:21.736283 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Jan 13 20:49:21.736288 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Jan 13 20:49:21.736293 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Jan 13 20:49:21.736303 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Jan 13 20:49:21.736309 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Jan 13 20:49:21.736314 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Jan 13 20:49:21.736319 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Jan 13 20:49:21.736325 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Jan 13 20:49:21.736330 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Jan 13 20:49:21.736335 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Jan 13 20:49:21.736341 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Jan 13 20:49:21.736346 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Jan 13 20:49:21.736351 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Jan 13 20:49:21.736358 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Jan 13 20:49:21.736363 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Jan 13 20:49:21.736369 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Jan 13 20:49:21.736374 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Jan 13 20:49:21.736379 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Jan 13 20:49:21.736385 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Jan 13 20:49:21.736390 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Jan 13 20:49:21.736395 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Jan 13 20:49:21.736401 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Jan 13 20:49:21.736406 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Jan 13 20:49:21.736413 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Jan 13 20:49:21.736418 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Jan 13 20:49:21.736423 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Jan 13 20:49:21.736429 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Jan 13 20:49:21.736434 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Jan 13 20:49:21.736439 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Jan 13 20:49:21.736445 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Jan 13 20:49:21.736450 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Jan 13 20:49:21.736455 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Jan 13 20:49:21.736462 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Jan 13 20:49:21.736467 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Jan 13 20:49:21.736472 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Jan 13 20:49:21.736478 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Jan 13 20:49:21.736483 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Jan 13 20:49:21.736488 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Jan 13 20:49:21.736494 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Jan 13 20:49:21.736499 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Jan 13 20:49:21.736504 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Jan 13 20:49:21.736510 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Jan 13 20:49:21.736516 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Jan 13 20:49:21.736522 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Jan 13 20:49:21.736527 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Jan 13 20:49:21.736533 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Jan 13 20:49:21.736538 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Jan 13 20:49:21.736543 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Jan 13 20:49:21.736548 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Jan 13 20:49:21.736554 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Jan 13 20:49:21.736560 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Jan 13 20:49:21.736565 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Jan 13 20:49:21.736572 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Jan 13 20:49:21.736577 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Jan 13 20:49:21.736582 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Jan 13 20:49:21.736588 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Jan 13 20:49:21.736593 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Jan 13 20:49:21.736598 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Jan 13 20:49:21.736604 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Jan 13 20:49:21.736609 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Jan 13 20:49:21.736614 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Jan 13 20:49:21.736621 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Jan 13 20:49:21.736626 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Jan 13 20:49:21.736631 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Jan 13 20:49:21.736637 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Jan 13 20:49:21.736642 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Jan 13 20:49:21.736647 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Jan 13 20:49:21.736653 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Jan 13 20:49:21.736658 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Jan 13 20:49:21.736663 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Jan 13 20:49:21.736669 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Jan 13 20:49:21.736675 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Jan 13 20:49:21.736680 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Jan 13 20:49:21.736686 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Jan 13 20:49:21.736691 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Jan 13 20:49:21.736696 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Jan 13 20:49:21.736702 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Jan 13 20:49:21.736707 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Jan 13 20:49:21.736713 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Jan 13 20:49:21.736718 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Jan 13 20:49:21.736723 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Jan 13 20:49:21.736730 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Jan 13 20:49:21.736735 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Jan 13 20:49:21.736740 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Jan 13 20:49:21.736746 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Jan 13 20:49:21.736751 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Jan 13 20:49:21.736756 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Jan 13 20:49:21.736769 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Jan 13 20:49:21.736776 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 13 20:49:21.736781 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Jan 13 20:49:21.736788 kernel: TSC deadline timer available
Jan 13 20:49:21.736793 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Jan 13 20:49:21.736799 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Jan 13 20:49:21.736804 kernel: Booting paravirtualized kernel on VMware hypervisor
Jan 13 20:49:21.736810 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 13 20:49:21.736815 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Jan 13 20:49:21.736821 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 13 20:49:21.736826 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 13 20:49:21.736832 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Jan 13 20:49:21.736838 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Jan 13 20:49:21.736844 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Jan 13 20:49:21.736849 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Jan 13 20:49:21.736854 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Jan 13 20:49:21.736866 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Jan 13 20:49:21.736873 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Jan 13 20:49:21.736878 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Jan 13 20:49:21.736884 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Jan 13 20:49:21.736889 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Jan 13 20:49:21.736896 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Jan 13 20:49:21.736902 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Jan 13 20:49:21.736908 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Jan 13 20:49:21.736913 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Jan 13 20:49:21.736919 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Jan 13 20:49:21.736924 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Jan 13 20:49:21.736931 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07
Jan 13 20:49:21.736937 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 13 20:49:21.736944 kernel: random: crng init done
Jan 13 20:49:21.736949 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Jan 13 20:49:21.736955 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Jan 13 20:49:21.736961 kernel: printk: log_buf_len min size: 262144 bytes
Jan 13 20:49:21.736967 kernel: printk: log_buf_len: 1048576 bytes
Jan 13 20:49:21.736972 kernel: printk: early log buf free: 239648(91%)
Jan 13 20:49:21.736978 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 13 20:49:21.736984 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 13 20:49:21.736990 kernel: Fallback order for Node 0: 0
Jan 13 20:49:21.736997 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Jan 13 20:49:21.737002 kernel: Policy zone: DMA32
Jan 13 20:49:21.737008 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 13 20:49:21.737014 kernel: Memory: 1936356K/2096628K available (12288K kernel code, 2299K rwdata, 22736K rodata, 42976K init, 2216K bss, 160012K reserved, 0K cma-reserved)
Jan 13 20:49:21.737021 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Jan 13 20:49:21.737028 kernel: ftrace: allocating 37920 entries in 149 pages
Jan 13 20:49:21.737034 kernel: ftrace: allocated 149 pages with 4 groups
Jan 13 20:49:21.737039 kernel: Dynamic Preempt: voluntary
Jan 13 20:49:21.737045 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 13 20:49:21.737051 kernel: rcu: RCU event tracing is enabled.
Jan 13 20:49:21.737057 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Jan 13 20:49:21.737063 kernel: Trampoline variant of Tasks RCU enabled.
Jan 13 20:49:21.737069 kernel: Rude variant of Tasks RCU enabled.
Jan 13 20:49:21.737075 kernel: Tracing variant of Tasks RCU enabled.
Jan 13 20:49:21.737080 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 13 20:49:21.737087 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Jan 13 20:49:21.737093 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Jan 13 20:49:21.737099 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Jan 13 20:49:21.737105 kernel: Console: colour VGA+ 80x25
Jan 13 20:49:21.737110 kernel: printk: console [tty0] enabled
Jan 13 20:49:21.737116 kernel: printk: console [ttyS0] enabled
Jan 13 20:49:21.737122 kernel: ACPI: Core revision 20230628
Jan 13 20:49:21.737128 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Jan 13 20:49:21.737134 kernel: APIC: Switch to symmetric I/O mode setup
Jan 13 20:49:21.737140 kernel: x2apic enabled
Jan 13 20:49:21.737146 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 13 20:49:21.737152 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jan 13 20:49:21.737158 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Jan 13 20:49:21.737164 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Jan 13 20:49:21.737170 kernel: Disabled fast string operations
Jan 13 20:49:21.737175 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Jan 13 20:49:21.737181 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Jan 13 20:49:21.737187 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 13 20:49:21.737194 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Jan 13 20:49:21.737200 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Jan 13 20:49:21.737205 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 13 20:49:21.737211 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 13 20:49:21.737217 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 13 20:49:21.737224 kernel: RETBleed: Mitigation: Enhanced IBRS
Jan 13 20:49:21.737230 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 13 20:49:21.737236 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 13 20:49:21.737242 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Jan 13 20:49:21.737249 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 13 20:49:21.737255 kernel: GDS: Unknown: Dependent on hypervisor status
Jan 13 20:49:21.737261 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 13 20:49:21.737266 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 13 20:49:21.737272 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 13 20:49:21.737278 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 13 20:49:21.737284 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 13 20:49:21.737290 kernel: Freeing SMP alternatives memory: 32K
Jan 13 20:49:21.737295 kernel: pid_max: default: 131072 minimum: 1024
Jan 13 20:49:21.737306 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 13 20:49:21.737312 kernel: landlock: Up and running.
Jan 13 20:49:21.737318 kernel: SELinux: Initializing.
Jan 13 20:49:21.737324 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 20:49:21.737330 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 20:49:21.737336 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Jan 13 20:49:21.737342 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:49:21.737348 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:49:21.737355 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 20:49:21.737361 kernel: Performance Events: Skylake events, core PMU driver.
Jan 13 20:49:21.737366 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Jan 13 20:49:21.737372 kernel: core: CPUID marked event: 'instructions' unavailable
Jan 13 20:49:21.737378 kernel: core: CPUID marked event: 'bus cycles' unavailable
Jan 13 20:49:21.737383 kernel: core: CPUID marked event: 'cache references' unavailable
Jan 13 20:49:21.737389 kernel: core: CPUID marked event: 'cache misses' unavailable
Jan 13 20:49:21.737394 kernel: core: CPUID marked event: 'branch instructions' unavailable
Jan 13 20:49:21.737400 kernel: core: CPUID marked event: 'branch misses' unavailable
Jan 13 20:49:21.737407 kernel: ... version: 1
Jan 13 20:49:21.737413 kernel: ... bit width: 48
Jan 13 20:49:21.737418 kernel: ... generic registers: 4
Jan 13 20:49:21.737424 kernel: ... value mask: 0000ffffffffffff
Jan 13 20:49:21.737430 kernel: ...
max period: 000000007fffffff Jan 13 20:49:21.737435 kernel: ... fixed-purpose events: 0 Jan 13 20:49:21.737441 kernel: ... event mask: 000000000000000f Jan 13 20:49:21.737447 kernel: signal: max sigframe size: 1776 Jan 13 20:49:21.737453 kernel: rcu: Hierarchical SRCU implementation. Jan 13 20:49:21.737459 kernel: rcu: Max phase no-delay instances is 400. Jan 13 20:49:21.737465 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 20:49:21.737471 kernel: smp: Bringing up secondary CPUs ... Jan 13 20:49:21.737477 kernel: smpboot: x86: Booting SMP configuration: Jan 13 20:49:21.737483 kernel: .... node #0, CPUs: #1 Jan 13 20:49:21.737488 kernel: Disabled fast string operations Jan 13 20:49:21.737495 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 20:49:21.737500 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 20:49:21.737506 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 20:49:21.737512 kernel: smpboot: Max logical packages: 128 Jan 13 20:49:21.737518 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 20:49:21.737524 kernel: devtmpfs: initialized Jan 13 20:49:21.737530 kernel: x86/mm: Memory block size: 128MB Jan 13 20:49:21.737536 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 20:49:21.737542 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 20:49:21.737548 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 20:49:21.737554 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 20:49:21.737560 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 20:49:21.737565 kernel: audit: initializing netlink subsys (disabled) Jan 13 20:49:21.737572 kernel: audit: type=2000 audit(1736801360.067:1): state=initialized audit_enabled=0 res=1 Jan 13 20:49:21.737579 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 20:49:21.737585 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 20:49:21.737591 kernel: cpuidle: using governor menu Jan 13 20:49:21.737597 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 20:49:21.737602 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 20:49:21.737608 kernel: dca service started, version 1.12.1 Jan 13 20:49:21.737614 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 20:49:21.737620 kernel: PCI: Using configuration type 1 for base access Jan 13 20:49:21.737627 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 20:49:21.737632 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 20:49:21.737638 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 20:49:21.737644 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 20:49:21.737649 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 20:49:21.737655 kernel: ACPI: Added _OSI(Module Device) Jan 13 20:49:21.737661 kernel: ACPI: Added _OSI(Processor Device) Jan 13 20:49:21.737667 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 20:49:21.737673 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 20:49:21.737680 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 20:49:21.737685 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 20:49:21.737691 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 20:49:21.737697 kernel: ACPI: Interpreter enabled Jan 13 20:49:21.737702 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 20:49:21.737708 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 20:49:21.737714 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 20:49:21.737720 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 20:49:21.737725 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 20:49:21.737732 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 20:49:21.737821 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 20:49:21.737877 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 20:49:21.737925 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 20:49:21.737933 kernel: PCI host bridge to bus 0000:00 Jan 13 20:49:21.737982 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 20:49:21.738030 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 20:49:21.738073 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 20:49:21.738116 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 20:49:21.738159 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 20:49:21.738203 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 20:49:21.738262 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 20:49:21.738317 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 20:49:21.738372 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 20:49:21.738426 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 20:49:21.738476 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 20:49:21.738525 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 20:49:21.738574 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 20:49:21.738622 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 20:49:21.738673 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 20:49:21.738725 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 20:49:21.738797 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 20:49:21.738849 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 20:49:21.738903 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 20:49:21.738952 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 20:49:21.739004 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 20:49:21.739057 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 20:49:21.739107 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 20:49:21.739156 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 20:49:21.739205 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 20:49:21.739253 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 20:49:21.739302 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 20:49:21.739357 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 20:49:21.739416 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.739466 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.739519 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.739569 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.739621 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.739670 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.739725 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740124 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740191 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740243 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740301 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740353 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 13 20:49:21.740412 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740461 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740514 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740563 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740619 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740668 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740724 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.742784 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.742853 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.742907 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.742962 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743016 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743070 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743120 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743172 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743221 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743274 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743326 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743382 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743433 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743486 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743535 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743590 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743639 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743694 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743743 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743823 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743873 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743926 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743975 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744030 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744080 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744133 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744182 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744235 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744285 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744340 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744390 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744445 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744495 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744549 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744598 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744652 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744703 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744756 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744821 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744875 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 13 
20:49:21.744924 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744977 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.745029 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.745081 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.745131 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.745182 kernel: pci_bus 0000:01: extended config space not accessible Jan 13 20:49:21.745233 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:49:21.745286 kernel: pci_bus 0000:02: extended config space not accessible Jan 13 20:49:21.745297 kernel: acpiphp: Slot [32] registered Jan 13 20:49:21.745303 kernel: acpiphp: Slot [33] registered Jan 13 20:49:21.745309 kernel: acpiphp: Slot [34] registered Jan 13 20:49:21.745315 kernel: acpiphp: Slot [35] registered Jan 13 20:49:21.745320 kernel: acpiphp: Slot [36] registered Jan 13 20:49:21.745326 kernel: acpiphp: Slot [37] registered Jan 13 20:49:21.745332 kernel: acpiphp: Slot [38] registered Jan 13 20:49:21.745338 kernel: acpiphp: Slot [39] registered Jan 13 20:49:21.745343 kernel: acpiphp: Slot [40] registered Jan 13 20:49:21.745351 kernel: acpiphp: Slot [41] registered Jan 13 20:49:21.745356 kernel: acpiphp: Slot [42] registered Jan 13 20:49:21.745362 kernel: acpiphp: Slot [43] registered Jan 13 20:49:21.745368 kernel: acpiphp: Slot [44] registered Jan 13 20:49:21.745374 kernel: acpiphp: Slot [45] registered Jan 13 20:49:21.745380 kernel: acpiphp: Slot [46] registered Jan 13 20:49:21.745385 kernel: acpiphp: Slot [47] registered Jan 13 20:49:21.745391 kernel: acpiphp: Slot [48] registered Jan 13 20:49:21.745397 kernel: acpiphp: Slot [49] registered Jan 13 20:49:21.745403 kernel: acpiphp: Slot [50] registered Jan 13 20:49:21.745410 kernel: acpiphp: Slot [51] registered Jan 13 20:49:21.745416 kernel: acpiphp: Slot [52] registered Jan 13 20:49:21.745421 kernel: acpiphp: Slot [53] registered 
Jan 13 20:49:21.745427 kernel: acpiphp: Slot [54] registered Jan 13 20:49:21.745433 kernel: acpiphp: Slot [55] registered Jan 13 20:49:21.745439 kernel: acpiphp: Slot [56] registered Jan 13 20:49:21.745444 kernel: acpiphp: Slot [57] registered Jan 13 20:49:21.745450 kernel: acpiphp: Slot [58] registered Jan 13 20:49:21.745456 kernel: acpiphp: Slot [59] registered Jan 13 20:49:21.745463 kernel: acpiphp: Slot [60] registered Jan 13 20:49:21.745469 kernel: acpiphp: Slot [61] registered Jan 13 20:49:21.745474 kernel: acpiphp: Slot [62] registered Jan 13 20:49:21.745480 kernel: acpiphp: Slot [63] registered Jan 13 20:49:21.745528 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 13 20:49:21.745577 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:49:21.745625 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:49:21.745672 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:49:21.745721 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 13 20:49:21.746074 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 13 20:49:21.746130 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 13 20:49:21.746180 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 13 20:49:21.746229 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 13 20:49:21.746285 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 13 20:49:21.746342 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 13 20:49:21.746394 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 13 20:49:21.746447 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:49:21.746497 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 13 
20:49:21.746548 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 13 20:49:21.746598 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:49:21.746648 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:49:21.746697 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:49:21.746747 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:49:21.746873 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:49:21.746923 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:49:21.746971 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:49:21.747021 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:49:21.747069 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:49:21.747117 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:49:21.747165 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:49:21.747214 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:49:21.747266 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:49:21.747324 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:49:21.747377 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:49:21.747425 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:49:21.747474 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:49:21.747526 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:49:21.747574 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:49:21.747623 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:49:21.747709 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:49:21.747782 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 13 20:49:21.747836 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:49:21.747886 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:49:21.747938 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:49:21.747987 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:49:21.748044 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 13 20:49:21.748096 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 13 20:49:21.748146 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 13 20:49:21.748195 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 13 20:49:21.748245 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 13 20:49:21.748294 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:49:21.748348 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 13 20:49:21.748398 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 13 20:49:21.748447 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 13 20:49:21.748497 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:49:21.748546 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:49:21.750827 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 13 20:49:21.750886 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:49:21.750943 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:49:21.750993 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:49:21.751042 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:49:21.751094 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:49:21.751143 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:49:21.751193 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:49:21.751241 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:49:21.751292 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:49:21.751343 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:49:21.751393 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:49:21.751443 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:49:21.751492 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:49:21.751542 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:49:21.751593 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:49:21.751642 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:49:21.751691 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:49:21.751744 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:49:21.751861 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:49:21.751931 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:49:21.751998 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:49:21.752050 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:49:21.752100 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:49:21.752149 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:49:21.752198 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:49:21.752250 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:49:21.752299 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:49:21.752350 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:49:21.752400 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:49:21.752448 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:49:21.752497 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:49:21.752546 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:49:21.752597 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:49:21.752647 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:49:21.752915 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:49:21.752976 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:49:21.753029 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:49:21.753081 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:49:21.753133 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:49:21.753184 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:49:21.753238 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:49:21.753290 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:49:21.753346 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:49:21.753397 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:49:21.753448 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:49:21.753499 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:49:21.753550 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:49:21.753601 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:49:21.753655 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:49:21.753705 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:49:21.753756 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:49:21.753822 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:49:21.753874 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:49:21.753923 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:49:21.753974 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:49:21.754023 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:49:21.754076 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:49:21.754125 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:49:21.754175 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:49:21.754225 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:49:21.754285 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:49:21.754354 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:49:21.754406 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:49:21.754456 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:49:21.754508 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 
20:49:21.754557 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:49:21.754606 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:49:21.754656 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:49:21.754705 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:49:21.754754 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:49:21.754845 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:49:21.754895 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:49:21.754948 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:49:21.754998 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:49:21.755047 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:49:21.755096 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:49:21.755104 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jan 13 20:49:21.755111 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jan 13 20:49:21.755118 kernel: ACPI: PCI: Interrupt link LNKB disabled Jan 13 20:49:21.755124 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 13 20:49:21.755131 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jan 13 20:49:21.755137 kernel: iommu: Default domain type: Translated Jan 13 20:49:21.755143 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 13 20:49:21.755149 kernel: PCI: Using ACPI for IRQ routing Jan 13 20:49:21.755155 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 13 20:49:21.755161 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jan 13 20:49:21.755167 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jan 13 20:49:21.755216 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jan 13 20:49:21.755265 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Jan 13 20:49:21.755317 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 13 20:49:21.755326 kernel: vgaarb: loaded Jan 13 20:49:21.755332 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jan 13 20:49:21.755338 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jan 13 20:49:21.755344 kernel: clocksource: Switched to clocksource tsc-early Jan 13 20:49:21.755350 kernel: VFS: Disk quotas dquot_6.6.0 Jan 13 20:49:21.755356 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 13 20:49:21.755362 kernel: pnp: PnP ACPI init Jan 13 20:49:21.755413 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jan 13 20:49:21.755462 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jan 13 20:49:21.755507 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jan 13 20:49:21.755555 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jan 13 20:49:21.755603 kernel: pnp 00:06: [dma 2] Jan 13 20:49:21.755653 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jan 13 20:49:21.755699 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jan 13 20:49:21.755746 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jan 13 20:49:21.755754 kernel: pnp: PnP ACPI: found 8 devices Jan 13 20:49:21.755761 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 13 20:49:21.755781 kernel: NET: Registered PF_INET protocol family Jan 13 20:49:21.755787 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 13 20:49:21.755794 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jan 13 20:49:21.755800 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 13 20:49:21.755805 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 13 20:49:21.755814 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jan 13 20:49:21.755820 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jan 13 20:49:21.755826 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:49:21.755832 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:49:21.755838 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 13 20:49:21.755844 kernel: NET: Registered PF_XDP protocol family Jan 13 20:49:21.755899 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jan 13 20:49:21.755951 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 13 20:49:21.756005 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 13 20:49:21.756056 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 13 20:49:21.756105 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 13 20:49:21.756154 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 13 20:49:21.756204 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 13 20:49:21.756254 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 13 20:49:21.756310 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 13 20:49:21.756361 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 13 20:49:21.756410 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 13 20:49:21.756460 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 13 20:49:21.756510 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 13 
20:49:21.756560 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 13 20:49:21.756613 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 13 20:49:21.756663 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 13 20:49:21.756712 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 13 20:49:21.757803 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 13 20:49:21.757868 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 13 20:49:21.757921 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 13 20:49:21.757975 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 13 20:49:21.758025 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 13 20:49:21.758075 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jan 13 20:49:21.758124 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:49:21.758173 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:49:21.758222 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758273 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758321 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758369 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758417 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758466 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758515 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758564 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 
13 20:49:21.758613 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758664 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758713 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758761 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758827 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758877 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758926 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758975 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759024 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759076 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759126 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759175 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759224 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759273 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759324 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759374 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759422 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759474 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759523 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759571 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759621 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759669 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759718 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.762786 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.762850 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.762906 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.762958 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763009 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763059 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763109 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763158 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763207 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763257 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763310 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763362 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763410 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763458 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763507 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763555 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763603 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763652 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763700 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763749 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763811 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763861 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Jan 13 20:49:21.763910 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763959 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764008 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764057 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764106 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764154 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764204 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764256 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764305 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764353 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764402 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764450 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764499 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764547 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764596 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764645 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764693 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764744 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.768750 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.768815 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.768867 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.768918 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.768968 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769019 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.769068 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769118 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.769170 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769221 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.769269 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769322 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.769371 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769421 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:49:21.769471 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jan 13 20:49:21.769520 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:49:21.769567 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:49:21.769615 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:49:21.769672 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Jan 13 20:49:21.769722 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:49:21.769783 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:49:21.769836 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:49:21.769886 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:49:21.769936 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:49:21.769985 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:49:21.770035 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:49:21.770088 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 
20:49:21.770139 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:49:21.770189 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:49:21.770238 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:49:21.770287 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:49:21.770340 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:49:21.770390 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:49:21.770439 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:49:21.770487 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:49:21.770540 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:49:21.770588 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:49:21.770640 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:49:21.770689 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:49:21.770738 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:49:21.772155 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:49:21.772215 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 13 20:49:21.772268 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:49:21.772320 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:49:21.772370 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:49:21.772421 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:49:21.772475 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 13 20:49:21.772527 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:49:21.772577 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:49:21.772626 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 13 20:49:21.772679 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:49:21.772731 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:49:21.772790 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:49:21.772840 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:49:21.772889 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:49:21.772941 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:49:21.772991 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:49:21.773041 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:49:21.773090 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:49:21.773140 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:49:21.773193 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:49:21.773242 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:49:21.773292 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:49:21.773353 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:49:21.773403 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:49:21.773454 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:49:21.773503 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:49:21.773554 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:49:21.773605 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:49:21.773658 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:49:21.773707 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:49:21.773758 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:49:21.777160 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:49:21.777218 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:49:21.777272 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:49:21.777330 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:49:21.777380 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:49:21.777430 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:49:21.777481 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:49:21.777535 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:49:21.777584 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:49:21.777633 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:49:21.777684 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:49:21.777734 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:49:21.777791 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:49:21.777840 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:49:21.777891 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:49:21.777941 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:49:21.777993 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:49:21.778044 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:49:21.778097 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:49:21.778146 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:49:21.778197 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:49:21.778247 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:49:21.778296 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
20:49:21.778346 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:49:21.778397 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:49:21.778446 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:49:21.778500 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:49:21.778549 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:49:21.778598 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:49:21.778649 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:49:21.778698 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:49:21.778747 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:49:21.778846 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:49:21.778897 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:49:21.778947 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:49:21.778999 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:49:21.779048 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:49:21.779099 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:49:21.779148 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:49:21.779197 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:49:21.779248 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:49:21.779297 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:49:21.779349 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:49:21.779400 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 20:49:21.779449 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:49:21.779501 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 20:49:21.779552 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:49:21.779601 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:49:21.779649 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:49:21.779700 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:49:21.779749 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:49:21.779812 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:49:21.779866 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:49:21.779916 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:49:21.779968 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:49:21.780019 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:49:21.780066 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:49:21.780111 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:49:21.780166 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:49:21.780213 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:49:21.780263 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 20:49:21.780325 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 20:49:21.780376 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:49:21.780422 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:49:21.780467 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:49:21.780512 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:49:21.780558 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:49:21.780602 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:49:21.780653 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 20:49:21.780703 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 20:49:21.780748 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:49:21.781984 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 20:49:21.782036 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 20:49:21.782083 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:49:21.782134 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 20:49:21.782181 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 20:49:21.782230 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:49:21.782281 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 20:49:21.782327 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:49:21.782378 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 20:49:21.782424 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:49:21.782474 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 20:49:21.782523 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:49:21.782573 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 20:49:21.782622 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:49:21.782676 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 20:49:21.782731 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:49:21.784508 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 20:49:21.784563 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 20:49:21.784609 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:49:21.784660 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 20:49:21.784707 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 20:49:21.784754 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:49:21.784818 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 20:49:21.784869 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 20:49:21.784919 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:49:21.784970 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 20:49:21.785016 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:49:21.785066 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 20:49:21.785112 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:49:21.785163 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 20:49:21.785212 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:49:21.785261 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 20:49:21.785307 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:49:21.785358 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 20:49:21.785404 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:49:21.785456 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 20:49:21.785505 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 20:49:21.785551 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:49:21.785600 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 20:49:21.785647 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 20:49:21.785693 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:49:21.785743 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 20:49:21.787634 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 20:49:21.787692 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:49:21.787746 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 20:49:21.787809 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:49:21.787862 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 20:49:21.787908 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:49:21.787958 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 20:49:21.788009 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:49:21.788060 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 20:49:21.788106 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:49:21.788156 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 20:49:21.788202 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:49:21.788257 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 20:49:21.788306 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 20:49:21.788352 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:49:21.788401 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 20:49:21.788449 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 20:49:21.788495 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:49:21.788544 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 20:49:21.788593 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:49:21.788645 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 20:49:21.788691 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:49:21.788741 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 20:49:21.788798 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:49:21.788850 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 20:49:21.788897 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:49:21.788951 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 20:49:21.788998 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:49:21.789050 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 20:49:21.789096 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:49:21.789153 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 20:49:21.789163 kernel: PCI: CLS 32 bytes, default 64 Jan 13 20:49:21.789172 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 20:49:21.789179 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:49:21.789185 kernel: clocksource: Switched to clocksource tsc Jan 13 20:49:21.789191 kernel: Initialise system trusted keyrings Jan 13 20:49:21.789198 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 20:49:21.789204 kernel: Key type asymmetric registered Jan 13 20:49:21.789210 kernel: Asymmetric key parser 'x509' registered Jan 13 20:49:21.789216 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 20:49:21.789223 kernel: io scheduler mq-deadline registered Jan 13 20:49:21.789231 kernel: io scheduler kyber registered Jan 13 20:49:21.789237 kernel: io scheduler bfq registered Jan 13 20:49:21.789290 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 20:49:21.789345 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.789398 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 20:49:21.789449 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.789502 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 20:49:21.789553 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.789607 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 20:49:21.789659 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.789711 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 20:49:21.789761 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790203 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 20:49:21.790257 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790317 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 20:49:21.790368 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790420 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 20:49:21.790471 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790522 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 20:49:21.790575 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790627 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 20:49:21.790677 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790729 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 20:49:21.790786 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790838 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 20:49:21.790888 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790942 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 20:49:21.790994 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791046 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 20:49:21.791096 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791148 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 20:49:21.791201 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791253 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 20:49:21.791304 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791356 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 20:49:21.791406 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791457 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 20:49:21.791510 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791562 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 20:49:21.791612 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791662 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 20:49:21.791713 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792007 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 20:49:21.792075 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792130 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 20:49:21.792184 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792235 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 20:49:21.792287 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792355 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 20:49:21.792409 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792460 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 20:49:21.792510 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792561 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 20:49:21.792610 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792660 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 20:49:21.792712 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793319 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 20:49:21.793388 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793442 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 20:49:21.793493 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793548 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 20:49:21.793598 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793649 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 20:49:21.793699 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793748 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 20:49:21.793812 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793825 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 20:49:21.793832 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 20:49:21.793839 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 20:49:21.793845 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 20:49:21.793852 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 20:49:21.793858 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 20:49:21.793911 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 20:49:21.793959 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T20:49:21 UTC (1736801361) Jan 13 20:49:21.794003 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 20:49:21.794012 kernel: intel_pstate: CPU model not supported Jan 13 20:49:21.794018 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 20:49:21.794025 kernel: NET: Registered PF_INET6 protocol family Jan 13 20:49:21.794031 kernel: Segment Routing with IPv6 Jan 13 20:49:21.794037 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 20:49:21.794044 kernel: NET: Registered PF_PACKET protocol family Jan 13 20:49:21.794050 kernel: Key type dns_resolver registered Jan 13 20:49:21.794058 kernel: IPI shorthand broadcast: enabled Jan 13 20:49:21.794064 kernel: sched_clock: Marking stable (876438053, 223432303)->(1157339130, -57468774) Jan 13 20:49:21.794070 kernel: registered taskstats version 1 Jan 13 20:49:21.794076 kernel: Loading compiled-in X.509 certificates Jan 13 20:49:21.794083 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 98739e9049f62881f4df7ffd1e39335f7f55b344' Jan 13 20:49:21.794089 kernel: Key type .fscrypt registered Jan 13 20:49:21.794095 kernel: Key type fscrypt-provisioning registered Jan 13 20:49:21.794101 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 20:49:21.794109 kernel: ima: Allocated hash algorithm: sha1 Jan 13 20:49:21.794116 kernel: ima: No architecture policies found Jan 13 20:49:21.794122 kernel: clk: Disabling unused clocks Jan 13 20:49:21.794128 kernel: Freeing unused kernel image (initmem) memory: 42976K Jan 13 20:49:21.794134 kernel: Write protecting the kernel read-only data: 36864k Jan 13 20:49:21.794140 kernel: Freeing unused kernel image (rodata/data gap) memory: 1840K Jan 13 20:49:21.794146 kernel: Run /init as init process Jan 13 20:49:21.794152 kernel: with arguments: Jan 13 20:49:21.794159 kernel: /init Jan 13 20:49:21.794165 kernel: with environment: Jan 13 20:49:21.794172 kernel: HOME=/ Jan 13 20:49:21.794178 kernel: TERM=linux Jan 13 20:49:21.794184 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 20:49:21.794192 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:49:21.794200 systemd[1]: Detected virtualization vmware. Jan 13 20:49:21.794207 systemd[1]: Detected architecture x86-64. Jan 13 20:49:21.794213 systemd[1]: Running in initrd. Jan 13 20:49:21.794219 systemd[1]: No hostname configured, using default hostname. Jan 13 20:49:21.794226 systemd[1]: Hostname set to . Jan 13 20:49:21.794233 systemd[1]: Initializing machine ID from random generator. Jan 13 20:49:21.794239 systemd[1]: Queued start job for default target initrd.target. Jan 13 20:49:21.794246 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:49:21.794252 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 20:49:21.794259 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 20:49:21.794266 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:49:21.794273 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 20:49:21.794279 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 20:49:21.794287 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 20:49:21.794294 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 20:49:21.794300 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:49:21.794306 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:49:21.794313 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:49:21.794320 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:49:21.794327 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:49:21.794333 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:49:21.794339 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:49:21.794346 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:49:21.794352 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 20:49:21.794358 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 20:49:21.794365 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:49:21.794371 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:49:21.794378 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 20:49:21.794384 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:49:21.794391 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 20:49:21.794397 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:49:21.794403 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 20:49:21.794409 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 20:49:21.794416 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:49:21.794422 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:49:21.794428 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:49:21.794436 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 20:49:21.794454 systemd-journald[217]: Collecting audit messages is disabled. Jan 13 20:49:21.794470 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:49:21.794477 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 20:49:21.794485 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:49:21.794491 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:49:21.794498 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:49:21.794504 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:49:21.794512 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:49:21.794518 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:49:21.794525 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Jan 13 20:49:21.794531 kernel: Bridge firewalling registered Jan 13 20:49:21.794538 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 20:49:21.794544 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:49:21.794551 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:49:21.794558 systemd-journald[217]: Journal started Jan 13 20:49:21.794573 systemd-journald[217]: Runtime Journal (/run/log/journal/355c9f29c2614988856e80866a2d421f) is 4.8M, max 38.7M, 33.8M free. Jan 13 20:49:21.751994 systemd-modules-load[218]: Inserted module 'overlay' Jan 13 20:49:21.789839 systemd-modules-load[218]: Inserted module 'br_netfilter' Jan 13 20:49:21.796588 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:49:21.800450 dracut-cmdline[234]: dracut-dracut-053 Jan 13 20:49:21.801856 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 20:49:21.804859 dracut-cmdline[234]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 20:49:21.803069 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 20:49:21.808053 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:49:21.817906 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:49:21.821839 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 13 20:49:21.838503 systemd-resolved[288]: Positive Trust Anchors: Jan 13 20:49:21.838513 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 20:49:21.838535 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:49:21.840171 systemd-resolved[288]: Defaulting to hostname 'linux'. Jan 13 20:49:21.840738 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:49:21.840914 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:49:21.850774 kernel: SCSI subsystem initialized Jan 13 20:49:21.856777 kernel: Loading iSCSI transport class v2.0-870. Jan 13 20:49:21.863785 kernel: iscsi: registered transport (tcp) Jan 13 20:49:21.876788 kernel: iscsi: registered transport (qla4xxx) Jan 13 20:49:21.876830 kernel: QLogic iSCSI HBA Driver Jan 13 20:49:21.896655 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 20:49:21.900914 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 20:49:21.916058 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 13 20:49:21.916135 kernel: device-mapper: uevent: version 1.0.3 Jan 13 20:49:21.917246 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 13 20:49:21.948786 kernel: raid6: avx2x4 gen() 48679 MB/s Jan 13 20:49:21.965779 kernel: raid6: avx2x2 gen() 53522 MB/s Jan 13 20:49:21.982951 kernel: raid6: avx2x1 gen() 44922 MB/s Jan 13 20:49:21.982971 kernel: raid6: using algorithm avx2x2 gen() 53522 MB/s Jan 13 20:49:22.000950 kernel: raid6: .... xor() 31345 MB/s, rmw enabled Jan 13 20:49:22.000971 kernel: raid6: using avx2x2 recovery algorithm Jan 13 20:49:22.013777 kernel: xor: automatically using best checksumming function avx Jan 13 20:49:22.111782 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 20:49:22.117151 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:49:22.121877 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:49:22.129156 systemd-udevd[434]: Using default interface naming scheme 'v255'. Jan 13 20:49:22.131679 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:49:22.138901 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 20:49:22.145634 dracut-pre-trigger[439]: rd.md=0: removing MD RAID activation Jan 13 20:49:22.160768 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:49:22.164878 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 20:49:22.234989 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:49:22.240933 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 20:49:22.253939 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 20:49:22.254628 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 13 20:49:22.255140 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:49:22.255423 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 20:49:22.261909 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 13 20:49:22.271587 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:49:22.303775 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 13 20:49:22.305772 kernel: vmw_pvscsi: using 64bit dma Jan 13 20:49:22.308772 kernel: vmw_pvscsi: max_id: 16 Jan 13 20:49:22.308790 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 13 20:49:22.310833 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 13 20:49:22.316223 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 13 20:49:22.326634 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 13 20:49:22.326645 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 13 20:49:22.326653 kernel: vmw_pvscsi: using MSI-X Jan 13 20:49:22.326661 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 13 20:49:22.326679 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 13 20:49:22.328792 kernel: cryptd: max_cpu_qlen set to 1000 Jan 13 20:49:22.332826 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 13 20:49:22.338305 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 13 20:49:22.338389 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 13 20:49:22.338462 kernel: libata version 3.00 loaded. Jan 13 20:49:22.340794 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 13 20:49:22.361116 kernel: scsi host1: ata_piix Jan 13 20:49:22.361194 kernel: AVX2 version of gcm_enc/dec engaged. 
Jan 13 20:49:22.361204 kernel: AES CTR mode by8 optimization enabled Jan 13 20:49:22.361211 kernel: scsi host2: ata_piix Jan 13 20:49:22.361276 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 13 20:49:22.361285 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 13 20:49:22.345319 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:49:22.345384 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:49:22.345568 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:49:22.345660 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 20:49:22.345722 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:49:22.345831 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:49:22.351380 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:49:22.368483 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:49:22.372903 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:49:22.381272 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 13 20:49:22.514837 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 13 20:49:22.518787 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 13 20:49:22.528019 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 13 20:49:22.531932 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 13 20:49:22.532018 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 13 20:49:22.532082 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 13 20:49:22.532141 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 13 20:49:22.532200 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:49:22.532209 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 13 20:49:22.550847 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 13 20:49:22.559007 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 13 20:49:22.559020 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (491) Jan 13 20:49:22.559031 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 13 20:49:22.563527 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 13 20:49:22.564937 kernel: BTRFS: device fsid 5e7921ba-229a-48a0-bc77-9b30aaa34aeb devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (488) Jan 13 20:49:22.567293 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 13 20:49:22.570364 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 20:49:22.572427 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 13 20:49:22.572567 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 13 20:49:22.578926 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 13 20:49:22.602975 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:49:22.609780 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:49:23.609777 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:49:23.609842 disk-uuid[589]: The operation has completed successfully. Jan 13 20:49:23.646067 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 20:49:23.646118 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 20:49:23.650854 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 13 20:49:23.652543 sh[607]: Success Jan 13 20:49:23.660776 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 13 20:49:23.703466 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 20:49:23.715579 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 20:49:23.715930 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 20:49:23.730072 kernel: BTRFS info (device dm-0): first mount of filesystem 5e7921ba-229a-48a0-bc77-9b30aaa34aeb Jan 13 20:49:23.730093 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:49:23.730102 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 20:49:23.731951 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 20:49:23.731965 kernel: BTRFS info (device dm-0): using free space tree Jan 13 20:49:23.738777 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 13 20:49:23.740520 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 13 20:49:23.749835 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 13 20:49:23.751031 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 13 20:49:23.772775 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:23.772810 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:49:23.772819 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:49:23.791787 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:49:23.797935 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 20:49:23.798791 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:23.801086 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 13 20:49:23.805884 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 20:49:23.818061 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:49:23.822895 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Jan 13 20:49:23.884467 ignition[667]: Ignition 2.20.0 Jan 13 20:49:23.884473 ignition[667]: Stage: fetch-offline Jan 13 20:49:23.884606 ignition[667]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:23.884613 ignition[667]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:23.884671 ignition[667]: parsed url from cmdline: "" Jan 13 20:49:23.884674 ignition[667]: no config URL provided Jan 13 20:49:23.884678 ignition[667]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 20:49:23.884684 ignition[667]: no config at "/usr/lib/ignition/user.ign" Jan 13 20:49:23.885468 ignition[667]: config successfully fetched Jan 13 20:49:23.885485 ignition[667]: parsing config with SHA512: 88f721b84580e4882dd4a781389762e99cb01553a468b81557acb04e18b8c59954baaed9d2cdd9defb9eabe6553b674261a1f47770423b011308c178b6a6003e Jan 13 20:49:23.888058 unknown[667]: fetched base config from "system" Jan 13 20:49:23.888301 ignition[667]: fetch-offline: fetch-offline passed Jan 13 20:49:23.888065 unknown[667]: fetched user config from "vmware" Jan 13 20:49:23.888347 ignition[667]: Ignition finished successfully Jan 13 20:49:23.889827 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:49:23.895190 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:49:23.900904 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 20:49:23.912637 systemd-networkd[804]: lo: Link UP Jan 13 20:49:23.912644 systemd-networkd[804]: lo: Gained carrier Jan 13 20:49:23.913336 systemd-networkd[804]: Enumeration completed Jan 13 20:49:23.913522 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:49:23.913601 systemd-networkd[804]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. 
Jan 13 20:49:23.916987 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 20:49:23.917099 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 20:49:23.913776 systemd[1]: Reached target network.target - Network. Jan 13 20:49:23.913871 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jan 13 20:49:23.916752 systemd-networkd[804]: ens192: Link UP Jan 13 20:49:23.916756 systemd-networkd[804]: ens192: Gained carrier Jan 13 20:49:23.920875 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 20:49:23.928796 ignition[807]: Ignition 2.20.0 Jan 13 20:49:23.928804 ignition[807]: Stage: kargs Jan 13 20:49:23.928914 ignition[807]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:23.928920 ignition[807]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:23.929426 ignition[807]: kargs: kargs passed Jan 13 20:49:23.929452 ignition[807]: Ignition finished successfully Jan 13 20:49:23.930543 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 20:49:23.940061 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 20:49:23.947128 ignition[814]: Ignition 2.20.0 Jan 13 20:49:23.947134 ignition[814]: Stage: disks Jan 13 20:49:23.947241 ignition[814]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:23.947247 ignition[814]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:23.947798 ignition[814]: disks: disks passed Jan 13 20:49:23.948506 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 20:49:23.947837 ignition[814]: Ignition finished successfully Jan 13 20:49:23.948907 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 20:49:23.949007 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Jan 13 20:49:23.949101 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:49:23.949181 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:49:23.949267 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:49:23.952868 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 20:49:23.963984 systemd-fsck[822]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 20:49:23.965532 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 20:49:23.968885 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 13 20:49:24.029806 kernel: EXT4-fs (sda9): mounted filesystem 84bcd1b2-5573-4e91-8fd5-f97782397085 r/w with ordered data mode. Quota mode: none. Jan 13 20:49:24.029786 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 20:49:24.030140 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 20:49:24.033804 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:49:24.035462 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 20:49:24.035777 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 20:49:24.035804 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 20:49:24.035817 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:49:24.039875 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 20:49:24.041781 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (830) Jan 13 20:49:24.041845 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 13 20:49:24.046034 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:24.046058 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:49:24.046072 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:49:24.048965 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:49:24.049966 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 20:49:24.074849 initrd-setup-root[854]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 20:49:24.078271 initrd-setup-root[861]: cut: /sysroot/etc/group: No such file or directory Jan 13 20:49:24.080438 initrd-setup-root[868]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 20:49:24.082341 initrd-setup-root[875]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 20:49:24.157998 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 20:49:24.161895 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 20:49:24.163869 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 20:49:24.167882 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:24.183504 ignition[943]: INFO : Ignition 2.20.0 Jan 13 20:49:24.183504 ignition[943]: INFO : Stage: mount Jan 13 20:49:24.183889 ignition[943]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:24.183889 ignition[943]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:24.184134 ignition[943]: INFO : mount: mount passed Jan 13 20:49:24.184370 ignition[943]: INFO : Ignition finished successfully Jan 13 20:49:24.186232 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 13 20:49:24.186646 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 13 20:49:24.189863 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Jan 13 20:49:24.728815 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 13 20:49:24.733965 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 20:49:24.741785 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (954)
Jan 13 20:49:24.744900 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e
Jan 13 20:49:24.744922 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 20:49:24.744933 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:49:24.748774 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:49:24.748887 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:49:24.766299 ignition[971]: INFO : Ignition 2.20.0
Jan 13 20:49:24.766299 ignition[971]: INFO : Stage: files
Jan 13 20:49:24.766585 ignition[971]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:49:24.766585 ignition[971]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:49:24.766877 ignition[971]: DEBUG : files: compiled without relabeling support, skipping
Jan 13 20:49:24.767360 ignition[971]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 13 20:49:24.767360 ignition[971]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 13 20:49:24.769263 ignition[971]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 13 20:49:24.769439 ignition[971]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 13 20:49:24.769575 ignition[971]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 13 20:49:24.769516 unknown[971]: wrote ssh authorized keys file for user: core
Jan 13 20:49:24.770921 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:49:24.771136 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jan 13 20:49:24.808261 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 13 20:49:24.895249 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1
Jan 13 20:49:25.238964 systemd-networkd[804]: ens192: Gained IPv6LL
Jan 13 20:49:25.356097 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 13 20:49:25.584679 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 20:49:25.584990 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:49:25.584990 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 20:49:25.584990 ignition[971]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:49:25.623387 ignition[971]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:49:25.625628 ignition[971]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 20:49:25.626112 ignition[971]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Jan 13 20:49:25.626112 ignition[971]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Jan 13 20:49:25.626112 ignition[971]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Jan 13 20:49:25.626112 ignition[971]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:49:25.626112 ignition[971]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:49:25.626112 ignition[971]: INFO : files: files passed
Jan 13 20:49:25.626112 ignition[971]: INFO : Ignition finished successfully
Jan 13 20:49:25.627815 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 13 20:49:25.632858 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 13 20:49:25.634364 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 13 20:49:25.634909 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 13 20:49:25.635089 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 13 20:49:25.640245 initrd-setup-root-after-ignition[1001]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:49:25.640245 initrd-setup-root-after-ignition[1001]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:49:25.641349 initrd-setup-root-after-ignition[1005]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:49:25.642018 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:49:25.642446 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 13 20:49:25.647863 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 13 20:49:25.659202 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 13 20:49:25.659258 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 13 20:49:25.659564 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 13 20:49:25.659683 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 13 20:49:25.659920 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 13 20:49:25.660325 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 13 20:49:25.669324 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:49:25.673872 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 13 20:49:25.678991 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:49:25.679157 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:49:25.679318 systemd[1]: Stopped target timers.target - Timer Units.
Jan 13 20:49:25.679508 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 13 20:49:25.679582 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:49:25.679881 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 13 20:49:25.680020 systemd[1]: Stopped target basic.target - Basic System.
Jan 13 20:49:25.680153 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 13 20:49:25.680296 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:49:25.680446 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 13 20:49:25.680590 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 13 20:49:25.680728 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:49:25.681860 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 13 20:49:25.682083 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 13 20:49:25.682252 systemd[1]: Stopped target swap.target - Swaps.
Jan 13 20:49:25.682387 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 13 20:49:25.682447 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:49:25.682807 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:49:25.683031 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:49:25.683195 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 13 20:49:25.683256 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:49:25.683498 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 13 20:49:25.683575 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:49:25.683923 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 13 20:49:25.684001 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:49:25.684221 systemd[1]: Stopped target paths.target - Path Units.
Jan 13 20:49:25.684464 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 13 20:49:25.687784 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:49:25.687971 systemd[1]: Stopped target slices.target - Slice Units.
Jan 13 20:49:25.688149 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 13 20:49:25.688311 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 13 20:49:25.688376 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 20:49:25.688585 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 13 20:49:25.688629 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 20:49:25.688868 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 13 20:49:25.688929 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:49:25.689171 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 13 20:49:25.689225 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 13 20:49:25.696884 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 13 20:49:25.698887 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 13 20:49:25.698991 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 13 20:49:25.699087 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:49:25.699430 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 13 20:49:25.699512 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:49:25.702369 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 13 20:49:25.702458 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 13 20:49:25.705843 ignition[1025]: INFO : Ignition 2.20.0
Jan 13 20:49:25.708945 ignition[1025]: INFO : Stage: umount
Jan 13 20:49:25.708945 ignition[1025]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:49:25.708945 ignition[1025]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 20:49:25.708945 ignition[1025]: INFO : umount: umount passed
Jan 13 20:49:25.708945 ignition[1025]: INFO : Ignition finished successfully
Jan 13 20:49:25.709964 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 13 20:49:25.710040 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 13 20:49:25.710255 systemd[1]: Stopped target network.target - Network.
Jan 13 20:49:25.710338 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 13 20:49:25.710365 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 13 20:49:25.710464 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 13 20:49:25.710485 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 13 20:49:25.710582 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 13 20:49:25.710603 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 13 20:49:25.710697 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 13 20:49:25.710718 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 13 20:49:25.710932 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 13 20:49:25.711304 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 13 20:49:25.715943 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 13 20:49:25.716006 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 13 20:49:25.716723 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 13 20:49:25.716959 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 13 20:49:25.716977 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:49:25.721814 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 13 20:49:25.721912 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 13 20:49:25.721941 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:49:25.722067 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jan 13 20:49:25.722087 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 20:49:25.722240 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:49:25.722437 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 13 20:49:25.723639 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 13 20:49:25.725737 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 13 20:49:25.726099 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:49:25.726355 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 13 20:49:25.726538 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:49:25.726670 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 13 20:49:25.726691 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:49:25.729215 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 13 20:49:25.729285 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 13 20:49:25.735182 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 13 20:49:25.735254 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:49:25.735471 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 13 20:49:25.735493 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:49:25.735593 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 13 20:49:25.735611 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:49:25.735700 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 13 20:49:25.735721 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:49:25.735946 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 13 20:49:25.735967 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:49:25.736270 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:49:25.736290 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:49:25.740846 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 13 20:49:25.740954 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 13 20:49:25.740981 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:49:25.741104 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 13 20:49:25.741125 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 20:49:25.741242 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 13 20:49:25.741262 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:49:25.741377 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:49:25.741397 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:49:25.743781 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 13 20:49:25.743833 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 13 20:49:25.827217 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 13 20:49:25.827296 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 13 20:49:25.827674 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 13 20:49:25.827851 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 13 20:49:25.827883 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 13 20:49:25.831870 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 13 20:49:25.837002 systemd[1]: Switching root.
Jan 13 20:49:25.874992 systemd-journald[217]: Journal stopped
Jan 13 20:49:21.734953 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 19:01:45 -00 2025
Jan 13 20:49:21.734969 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07
Jan 13 20:49:21.734975 kernel: Disabled fast string operations
Jan 13 20:49:21.734979 kernel: BIOS-provided physical RAM map:
Jan 13 20:49:21.734983 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 13 20:49:21.734987 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 13 20:49:21.734993 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 13 20:49:21.734997 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 13 20:49:21.735001 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 13 20:49:21.735005 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 13 20:49:21.735010 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 13 20:49:21.735014 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 13 20:49:21.735018 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 13 20:49:21.735022 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 13 20:49:21.735028 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 13 20:49:21.735034 kernel: NX (Execute Disable) protection: active
Jan 13 20:49:21.735038 kernel: APIC: Static calls initialized
Jan 13 20:49:21.735043 kernel: SMBIOS 2.7 present.
Jan 13 20:49:21.735048 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 13 20:49:21.735053 kernel: vmware: hypercall mode: 0x00
Jan 13 20:49:21.735057 kernel: Hypervisor detected: VMware
Jan 13 20:49:21.735062 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 13 20:49:21.735068 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 13 20:49:21.735072 kernel: vmware: using clock offset of 2640272691 ns
Jan 13 20:49:21.735077 kernel: tsc: Detected 3408.000 MHz processor
Jan 13 20:49:21.735082 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 20:49:21.735088 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 20:49:21.735092 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 13 20:49:21.735097 kernel: total RAM covered: 3072M
Jan 13 20:49:21.735102 kernel: Found optimal setting for mtrr clean up
Jan 13 20:49:21.735109 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 13 20:49:21.735115 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 13 20:49:21.735120 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 20:49:21.735124 kernel: Using GB pages for direct mapping
Jan 13 20:49:21.735129 kernel: ACPI: Early table checksum verification disabled
Jan 13 20:49:21.735134 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 13 20:49:21.735139 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 13 20:49:21.735143 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 13 20:49:21.735148 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 13 20:49:21.735153 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:49:21.735161 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 20:49:21.735166 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 13 20:49:21.735171 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 13 20:49:21.735176 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 13 20:49:21.735181 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 13 20:49:21.735187 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 13 20:49:21.735193 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 13 20:49:21.735198 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 13 20:49:21.735203 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 13 20:49:21.735208 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:49:21.735213 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 20:49:21.735218 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 13 20:49:21.735223 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 13 20:49:21.735228 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 13 20:49:21.735233 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 13 20:49:21.735239 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 13 20:49:21.735244 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 13 20:49:21.735249 kernel: system APIC only can use physical flat
Jan 13 20:49:21.735254 kernel: APIC: Switched APIC routing to: physical flat
Jan 13 20:49:21.735259 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 20:49:21.735264 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 20:49:21.735269 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 20:49:21.735274 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 20:49:21.735279 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 20:49:21.735285 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 20:49:21.735290 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 20:49:21.735295 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 20:49:21.735300 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 13 20:49:21.735305 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 13 20:49:21.735309 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 13 20:49:21.735314 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 13 20:49:21.735319 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 13 20:49:21.735324 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 13 20:49:21.735329 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 13 20:49:21.735335 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 13 20:49:21.735340 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 13 20:49:21.735345 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 13 20:49:21.735350 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 13 20:49:21.735355 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 13 20:49:21.735360 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 13 20:49:21.735365 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 13 20:49:21.735370 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 13 20:49:21.735374 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 13 20:49:21.735379 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 13 20:49:21.735384 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 13 20:49:21.735390 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 13 20:49:21.735395 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 13 20:49:21.735400 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 13 20:49:21.735405 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 13 20:49:21.735410 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 13 20:49:21.735415 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 13 20:49:21.735420 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 13 20:49:21.735425 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 13 20:49:21.735430 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 13 20:49:21.735435 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 13 20:49:21.735441 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 13 20:49:21.735446 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 13 20:49:21.735451 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 13 20:49:21.735456 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 13 20:49:21.735461 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 13 20:49:21.735466 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 13 20:49:21.735471 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 13 20:49:21.735476 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 13 20:49:21.735481 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 13 20:49:21.735486 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 13 20:49:21.735491 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 13 20:49:21.735496 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 13 20:49:21.735501 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 13 20:49:21.735506 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 13 20:49:21.735511 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 13 20:49:21.735516 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 13 20:49:21.735521 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 13 20:49:21.735526 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 13 20:49:21.735531 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 13 20:49:21.735536 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 13 20:49:21.735542 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 13 20:49:21.735547 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 13 20:49:21.735552 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 13 20:49:21.735561 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 13 20:49:21.735566 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 13 20:49:21.735571 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 13 20:49:21.735576 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 13 20:49:21.735582 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 13 20:49:21.735587 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 13 20:49:21.735594 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 13 20:49:21.735599 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 13 20:49:21.735604 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 13 20:49:21.735610 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 13 20:49:21.735615 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 13 20:49:21.735620 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 13 20:49:21.735625 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 13 20:49:21.735630 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 13 20:49:21.735636 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 13 20:49:21.735641 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 13 20:49:21.735647 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 13 20:49:21.735653 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 13 20:49:21.735658 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 13 20:49:21.735663 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 13 20:49:21.735668 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 13 20:49:21.735674 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 13 20:49:21.735679 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 13 20:49:21.735684 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 13 20:49:21.735689 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 13 20:49:21.735695 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 13 20:49:21.735701 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 13 20:49:21.735706 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 13 20:49:21.735711 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 13 20:49:21.735716 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 13 20:49:21.735722 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 13 20:49:21.735727 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 13 20:49:21.735732 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 13 20:49:21.735737 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 13 20:49:21.735743 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 13 20:49:21.735748 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 13 20:49:21.735753 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 13 20:49:21.735759 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 13 20:49:21.735774 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 13 20:49:21.735780 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 13 20:49:21.735785 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 13 20:49:21.735790 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 13 20:49:21.735796 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 13 20:49:21.735801 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 13 20:49:21.735806 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 13 20:49:21.735812 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 13 20:49:21.735817 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 13 20:49:21.735824 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 13 20:49:21.735829 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 13 20:49:21.735835 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 13 20:49:21.735840 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 13 20:49:21.735845 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 13 20:49:21.735850 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 13 20:49:21.735856 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 13 20:49:21.735861 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 13 20:49:21.735866 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 13 20:49:21.735872 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 13 20:49:21.735878 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 13 20:49:21.735884 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 13 20:49:21.735889 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 13 20:49:21.735894 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 13 20:49:21.735900 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 13 20:49:21.735905 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 13 20:49:21.735910 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 13 20:49:21.735915 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 13 20:49:21.735920 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 13 20:49:21.735926 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 13 20:49:21.735932 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 13 20:49:21.735937 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 13 20:49:21.735943 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 13 20:49:21.735948 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 13 20:49:21.735953 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 13 20:49:21.735959 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 13 20:49:21.735965 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 13 20:49:21.735970 kernel: Zone ranges:
Jan 13 20:49:21.735976 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 13 20:49:21.735981 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 13 20:49:21.735987 kernel: Normal empty
Jan 13 20:49:21.735993 kernel: Movable zone start for each node
Jan 13 20:49:21.735998 kernel: Early memory node ranges
Jan 13 20:49:21.736004 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 13 20:49:21.736009 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 13 20:49:21.736014 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 13 20:49:21.736020 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 13 20:49:21.736025 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 13 20:49:21.736030 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 13 20:49:21.736037 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 13 20:49:21.736043 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 13 20:49:21.736048 kernel: system APIC only can use physical flat
Jan 13 20:49:21.736053 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 13 20:49:21.736059 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 13 20:49:21.736064 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jan 13 20:49:21.736069 kernel: ACPI: LAPIC_NMI
(acpi_id[0x03] high edge lint[0x1]) Jan 13 20:49:21.736075 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Jan 13 20:49:21.736080 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Jan 13 20:49:21.736085 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Jan 13 20:49:21.736092 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Jan 13 20:49:21.736097 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Jan 13 20:49:21.736103 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Jan 13 20:49:21.736108 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Jan 13 20:49:21.736113 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Jan 13 20:49:21.736119 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Jan 13 20:49:21.736124 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Jan 13 20:49:21.736130 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Jan 13 20:49:21.736135 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Jan 13 20:49:21.736141 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Jan 13 20:49:21.736147 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Jan 13 20:49:21.736152 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Jan 13 20:49:21.736157 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Jan 13 20:49:21.736163 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Jan 13 20:49:21.736168 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Jan 13 20:49:21.736173 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Jan 13 20:49:21.736179 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Jan 13 20:49:21.736184 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Jan 13 20:49:21.736189 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Jan 13 20:49:21.736196 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Jan 13 20:49:21.736201 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x1b] high edge lint[0x1]) Jan 13 20:49:21.736206 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Jan 13 20:49:21.736212 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Jan 13 20:49:21.736217 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Jan 13 20:49:21.736222 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Jan 13 20:49:21.736228 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Jan 13 20:49:21.736233 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Jan 13 20:49:21.736239 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Jan 13 20:49:21.736244 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Jan 13 20:49:21.736250 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Jan 13 20:49:21.736256 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Jan 13 20:49:21.736261 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Jan 13 20:49:21.736267 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Jan 13 20:49:21.736272 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Jan 13 20:49:21.736277 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Jan 13 20:49:21.736283 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Jan 13 20:49:21.736288 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Jan 13 20:49:21.736293 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Jan 13 20:49:21.736303 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Jan 13 20:49:21.736309 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Jan 13 20:49:21.736314 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Jan 13 20:49:21.736319 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Jan 13 20:49:21.736325 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Jan 13 20:49:21.736330 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Jan 13 20:49:21.736335 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x33] high edge lint[0x1]) Jan 13 20:49:21.736341 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Jan 13 20:49:21.736346 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Jan 13 20:49:21.736351 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Jan 13 20:49:21.736358 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Jan 13 20:49:21.736363 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Jan 13 20:49:21.736369 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Jan 13 20:49:21.736374 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Jan 13 20:49:21.736379 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Jan 13 20:49:21.736385 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Jan 13 20:49:21.736390 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Jan 13 20:49:21.736395 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Jan 13 20:49:21.736401 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Jan 13 20:49:21.736406 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Jan 13 20:49:21.736413 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Jan 13 20:49:21.736418 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Jan 13 20:49:21.736423 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Jan 13 20:49:21.736429 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Jan 13 20:49:21.736434 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Jan 13 20:49:21.736439 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Jan 13 20:49:21.736445 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Jan 13 20:49:21.736450 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Jan 13 20:49:21.736455 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Jan 13 20:49:21.736462 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Jan 13 20:49:21.736467 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x4b] high edge lint[0x1]) Jan 13 20:49:21.736472 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Jan 13 20:49:21.736478 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Jan 13 20:49:21.736483 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Jan 13 20:49:21.736488 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Jan 13 20:49:21.736494 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Jan 13 20:49:21.736499 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Jan 13 20:49:21.736504 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Jan 13 20:49:21.736510 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Jan 13 20:49:21.736516 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Jan 13 20:49:21.736522 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Jan 13 20:49:21.736527 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Jan 13 20:49:21.736533 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Jan 13 20:49:21.736538 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Jan 13 20:49:21.736543 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Jan 13 20:49:21.736548 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Jan 13 20:49:21.736554 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Jan 13 20:49:21.736560 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Jan 13 20:49:21.736565 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Jan 13 20:49:21.736572 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Jan 13 20:49:21.736577 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Jan 13 20:49:21.736582 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Jan 13 20:49:21.736588 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Jan 13 20:49:21.736593 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Jan 13 20:49:21.736598 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x63] high edge lint[0x1]) Jan 13 20:49:21.736604 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Jan 13 20:49:21.736609 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Jan 13 20:49:21.736614 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Jan 13 20:49:21.736621 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Jan 13 20:49:21.736626 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Jan 13 20:49:21.736631 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Jan 13 20:49:21.736637 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Jan 13 20:49:21.736642 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Jan 13 20:49:21.736647 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Jan 13 20:49:21.736653 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Jan 13 20:49:21.736658 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Jan 13 20:49:21.736663 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Jan 13 20:49:21.736669 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Jan 13 20:49:21.736675 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Jan 13 20:49:21.736680 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Jan 13 20:49:21.736686 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Jan 13 20:49:21.736691 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Jan 13 20:49:21.736696 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Jan 13 20:49:21.736702 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Jan 13 20:49:21.736707 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Jan 13 20:49:21.736713 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Jan 13 20:49:21.736718 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Jan 13 20:49:21.736723 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Jan 13 20:49:21.736730 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x7b] high edge lint[0x1]) Jan 13 20:49:21.736735 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Jan 13 20:49:21.736740 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Jan 13 20:49:21.736746 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Jan 13 20:49:21.736751 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Jan 13 20:49:21.736756 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Jan 13 20:49:21.736769 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Jan 13 20:49:21.736776 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 13 20:49:21.736781 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Jan 13 20:49:21.736788 kernel: TSC deadline timer available Jan 13 20:49:21.736793 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Jan 13 20:49:21.736799 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Jan 13 20:49:21.736804 kernel: Booting paravirtualized kernel on VMware hypervisor Jan 13 20:49:21.736810 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 13 20:49:21.736815 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Jan 13 20:49:21.736821 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Jan 13 20:49:21.736826 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Jan 13 20:49:21.736832 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Jan 13 20:49:21.736838 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Jan 13 20:49:21.736844 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Jan 13 20:49:21.736849 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Jan 13 20:49:21.736854 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Jan 13 20:49:21.736866 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Jan 13 20:49:21.736873 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Jan 13 
20:49:21.736878 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Jan 13 20:49:21.736884 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Jan 13 20:49:21.736889 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Jan 13 20:49:21.736896 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Jan 13 20:49:21.736902 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Jan 13 20:49:21.736908 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Jan 13 20:49:21.736913 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Jan 13 20:49:21.736919 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Jan 13 20:49:21.736924 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Jan 13 20:49:21.736931 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 20:49:21.736937 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Jan 13 20:49:21.736944 kernel: random: crng init done Jan 13 20:49:21.736949 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Jan 13 20:49:21.736955 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Jan 13 20:49:21.736961 kernel: printk: log_buf_len min size: 262144 bytes Jan 13 20:49:21.736967 kernel: printk: log_buf_len: 1048576 bytes Jan 13 20:49:21.736972 kernel: printk: early log buf free: 239648(91%) Jan 13 20:49:21.736978 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 13 20:49:21.736984 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 13 20:49:21.736990 kernel: Fallback order for Node 0: 0 Jan 13 20:49:21.736997 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Jan 13 20:49:21.737002 kernel: Policy zone: DMA32 Jan 13 20:49:21.737008 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 13 20:49:21.737014 kernel: Memory: 1936356K/2096628K available (12288K kernel code, 2299K rwdata, 22736K rodata, 42976K init, 2216K bss, 160012K reserved, 0K cma-reserved) Jan 13 20:49:21.737021 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Jan 13 20:49:21.737028 kernel: ftrace: allocating 37920 entries in 149 pages Jan 13 20:49:21.737034 kernel: ftrace: allocated 149 pages with 4 groups Jan 13 20:49:21.737039 kernel: Dynamic Preempt: voluntary Jan 13 20:49:21.737045 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 13 20:49:21.737051 kernel: rcu: RCU event tracing is enabled. Jan 13 20:49:21.737057 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Jan 13 20:49:21.737063 kernel: Trampoline variant of Tasks RCU enabled. Jan 13 20:49:21.737069 kernel: Rude variant of Tasks RCU enabled. Jan 13 20:49:21.737075 kernel: Tracing variant of Tasks RCU enabled. Jan 13 20:49:21.737080 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Jan 13 20:49:21.737087 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Jan 13 20:49:21.737093 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Jan 13 20:49:21.737099 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Jan 13 20:49:21.737105 kernel: Console: colour VGA+ 80x25 Jan 13 20:49:21.737110 kernel: printk: console [tty0] enabled Jan 13 20:49:21.737116 kernel: printk: console [ttyS0] enabled Jan 13 20:49:21.737122 kernel: ACPI: Core revision 20230628 Jan 13 20:49:21.737128 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Jan 13 20:49:21.737134 kernel: APIC: Switch to symmetric I/O mode setup Jan 13 20:49:21.737140 kernel: x2apic enabled Jan 13 20:49:21.737146 kernel: APIC: Switched APIC routing to: physical x2apic Jan 13 20:49:21.737152 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 13 20:49:21.737158 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:49:21.737164 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Jan 13 20:49:21.737170 kernel: Disabled fast string operations Jan 13 20:49:21.737175 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Jan 13 20:49:21.737181 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Jan 13 20:49:21.737187 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 13 20:49:21.737194 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Jan 13 20:49:21.737200 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Jan 13 20:49:21.737205 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 13 20:49:21.737211 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Jan 13 20:49:21.737217 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 13 20:49:21.737224 kernel: RETBleed: Mitigation: Enhanced IBRS Jan 13 20:49:21.737230 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 13 20:49:21.737236 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 13 20:49:21.737242 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Jan 13 20:49:21.737249 kernel: SRBDS: Unknown: Dependent on hypervisor status Jan 13 20:49:21.737255 kernel: GDS: Unknown: Dependent on hypervisor status Jan 13 20:49:21.737261 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 13 20:49:21.737266 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 13 20:49:21.737272 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 13 20:49:21.737278 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 13 20:49:21.737284 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
Jan 13 20:49:21.737290 kernel: Freeing SMP alternatives memory: 32K Jan 13 20:49:21.737295 kernel: pid_max: default: 131072 minimum: 1024 Jan 13 20:49:21.737306 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Jan 13 20:49:21.737312 kernel: landlock: Up and running. Jan 13 20:49:21.737318 kernel: SELinux: Initializing. Jan 13 20:49:21.737324 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:49:21.737330 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 20:49:21.737336 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Jan 13 20:49:21.737342 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:49:21.737348 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:49:21.737355 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 20:49:21.737361 kernel: Performance Events: Skylake events, core PMU driver. Jan 13 20:49:21.737366 kernel: core: CPUID marked event: 'cpu cycles' unavailable Jan 13 20:49:21.737372 kernel: core: CPUID marked event: 'instructions' unavailable Jan 13 20:49:21.737378 kernel: core: CPUID marked event: 'bus cycles' unavailable Jan 13 20:49:21.737383 kernel: core: CPUID marked event: 'cache references' unavailable Jan 13 20:49:21.737389 kernel: core: CPUID marked event: 'cache misses' unavailable Jan 13 20:49:21.737394 kernel: core: CPUID marked event: 'branch instructions' unavailable Jan 13 20:49:21.737400 kernel: core: CPUID marked event: 'branch misses' unavailable Jan 13 20:49:21.737407 kernel: ... version: 1 Jan 13 20:49:21.737413 kernel: ... bit width: 48 Jan 13 20:49:21.737418 kernel: ... generic registers: 4 Jan 13 20:49:21.737424 kernel: ... value mask: 0000ffffffffffff Jan 13 20:49:21.737430 kernel: ... 
max period: 000000007fffffff Jan 13 20:49:21.737435 kernel: ... fixed-purpose events: 0 Jan 13 20:49:21.737441 kernel: ... event mask: 000000000000000f Jan 13 20:49:21.737447 kernel: signal: max sigframe size: 1776 Jan 13 20:49:21.737453 kernel: rcu: Hierarchical SRCU implementation. Jan 13 20:49:21.737459 kernel: rcu: Max phase no-delay instances is 400. Jan 13 20:49:21.737465 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 20:49:21.737471 kernel: smp: Bringing up secondary CPUs ... Jan 13 20:49:21.737477 kernel: smpboot: x86: Booting SMP configuration: Jan 13 20:49:21.737483 kernel: .... node #0, CPUs: #1 Jan 13 20:49:21.737488 kernel: Disabled fast string operations Jan 13 20:49:21.737495 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 20:49:21.737500 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 20:49:21.737506 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 20:49:21.737512 kernel: smpboot: Max logical packages: 128 Jan 13 20:49:21.737518 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 20:49:21.737524 kernel: devtmpfs: initialized Jan 13 20:49:21.737530 kernel: x86/mm: Memory block size: 128MB Jan 13 20:49:21.737536 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 20:49:21.737542 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 20:49:21.737548 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 20:49:21.737554 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 20:49:21.737560 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 20:49:21.737565 kernel: audit: initializing netlink subsys (disabled) Jan 13 20:49:21.737572 kernel: audit: type=2000 audit(1736801360.067:1): state=initialized audit_enabled=0 res=1 Jan 13 20:49:21.737579 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 20:49:21.737585 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 20:49:21.737591 kernel: cpuidle: using governor menu Jan 13 20:49:21.737597 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 20:49:21.737602 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 20:49:21.737608 kernel: dca service started, version 1.12.1 Jan 13 20:49:21.737614 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 20:49:21.737620 kernel: PCI: Using configuration type 1 for base access Jan 13 20:49:21.737627 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 20:49:21.737632 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 20:49:21.737638 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 20:49:21.737644 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 20:49:21.737649 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 20:49:21.737655 kernel: ACPI: Added _OSI(Module Device) Jan 13 20:49:21.737661 kernel: ACPI: Added _OSI(Processor Device) Jan 13 20:49:21.737667 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 20:49:21.737673 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 20:49:21.737680 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 20:49:21.737685 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 20:49:21.737691 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 20:49:21.737697 kernel: ACPI: Interpreter enabled Jan 13 20:49:21.737702 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 20:49:21.737708 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 20:49:21.737714 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 20:49:21.737720 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 20:49:21.737725 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 20:49:21.737732 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 20:49:21.737821 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 20:49:21.737877 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 20:49:21.737925 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 20:49:21.737933 kernel: PCI host bridge to bus 0000:00 Jan 13 20:49:21.737982 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 20:49:21.738030 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 20:49:21.738073 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 20:49:21.738116 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 20:49:21.738159 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 20:49:21.738203 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 20:49:21.738262 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 20:49:21.738317 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 20:49:21.738372 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 20:49:21.738426 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 20:49:21.738476 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 20:49:21.738525 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 20:49:21.738574 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 20:49:21.738622 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 20:49:21.738673 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 20:49:21.738725 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 20:49:21.738797 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 20:49:21.738849 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 20:49:21.738903 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 20:49:21.738952 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 20:49:21.739004 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 20:49:21.739057 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 20:49:21.739107 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 20:49:21.739156 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 20:49:21.739205 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 20:49:21.739253 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 20:49:21.739302 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 20:49:21.739357 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 20:49:21.739416 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.739466 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.739519 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.739569 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.739621 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.739670 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.739725 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740124 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740191 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740243 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740301 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740353 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 13 20:49:21.740412 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740461 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740514 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740563 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740619 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.740668 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.740724 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.742784 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.742853 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.742907 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.742962 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743016 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743070 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743120 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743172 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743221 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743274 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743326 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743382 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743433 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743486 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743535 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743590 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743639 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743694 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743743 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743823 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743873 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.743926 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.743975 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744030 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744080 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744133 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744182 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744235 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744285 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744340 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744390 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744445 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744495 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744549 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744598 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744652 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744703 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744756 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.744821 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744875 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 13 
20:49:21.744924 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.744977 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.745029 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.745081 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 13 20:49:21.745131 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 13 20:49:21.745182 kernel: pci_bus 0000:01: extended config space not accessible Jan 13 20:49:21.745233 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:49:21.745286 kernel: pci_bus 0000:02: extended config space not accessible Jan 13 20:49:21.745297 kernel: acpiphp: Slot [32] registered Jan 13 20:49:21.745303 kernel: acpiphp: Slot [33] registered Jan 13 20:49:21.745309 kernel: acpiphp: Slot [34] registered Jan 13 20:49:21.745315 kernel: acpiphp: Slot [35] registered Jan 13 20:49:21.745320 kernel: acpiphp: Slot [36] registered Jan 13 20:49:21.745326 kernel: acpiphp: Slot [37] registered Jan 13 20:49:21.745332 kernel: acpiphp: Slot [38] registered Jan 13 20:49:21.745338 kernel: acpiphp: Slot [39] registered Jan 13 20:49:21.745343 kernel: acpiphp: Slot [40] registered Jan 13 20:49:21.745351 kernel: acpiphp: Slot [41] registered Jan 13 20:49:21.745356 kernel: acpiphp: Slot [42] registered Jan 13 20:49:21.745362 kernel: acpiphp: Slot [43] registered Jan 13 20:49:21.745368 kernel: acpiphp: Slot [44] registered Jan 13 20:49:21.745374 kernel: acpiphp: Slot [45] registered Jan 13 20:49:21.745380 kernel: acpiphp: Slot [46] registered Jan 13 20:49:21.745385 kernel: acpiphp: Slot [47] registered Jan 13 20:49:21.745391 kernel: acpiphp: Slot [48] registered Jan 13 20:49:21.745397 kernel: acpiphp: Slot [49] registered Jan 13 20:49:21.745403 kernel: acpiphp: Slot [50] registered Jan 13 20:49:21.745410 kernel: acpiphp: Slot [51] registered Jan 13 20:49:21.745416 kernel: acpiphp: Slot [52] registered Jan 13 20:49:21.745421 kernel: acpiphp: Slot [53] registered 
Jan 13 20:49:21.745427 kernel: acpiphp: Slot [54] registered Jan 13 20:49:21.745433 kernel: acpiphp: Slot [55] registered Jan 13 20:49:21.745439 kernel: acpiphp: Slot [56] registered Jan 13 20:49:21.745444 kernel: acpiphp: Slot [57] registered Jan 13 20:49:21.745450 kernel: acpiphp: Slot [58] registered Jan 13 20:49:21.745456 kernel: acpiphp: Slot [59] registered Jan 13 20:49:21.745463 kernel: acpiphp: Slot [60] registered Jan 13 20:49:21.745469 kernel: acpiphp: Slot [61] registered Jan 13 20:49:21.745474 kernel: acpiphp: Slot [62] registered Jan 13 20:49:21.745480 kernel: acpiphp: Slot [63] registered Jan 13 20:49:21.745528 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 13 20:49:21.745577 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:49:21.745625 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:49:21.745672 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:49:21.745721 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 13 20:49:21.746074 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 13 20:49:21.746130 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 13 20:49:21.746180 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 13 20:49:21.746229 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 13 20:49:21.746285 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 13 20:49:21.746342 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 13 20:49:21.746394 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 13 20:49:21.746447 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:49:21.746497 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 13 
20:49:21.746548 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 13 20:49:21.746598 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:49:21.746648 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:49:21.746697 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:49:21.746747 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:49:21.746873 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:49:21.746923 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:49:21.746971 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:49:21.747021 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:49:21.747069 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:49:21.747117 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:49:21.747165 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:49:21.747214 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:49:21.747266 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:49:21.747324 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:49:21.747377 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:49:21.747425 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:49:21.747474 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:49:21.747526 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:49:21.747574 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:49:21.747623 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:49:21.747709 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:49:21.747782 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 13 20:49:21.747836 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:49:21.747886 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:49:21.747938 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:49:21.747987 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:49:21.748044 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 13 20:49:21.748096 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 13 20:49:21.748146 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 13 20:49:21.748195 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 13 20:49:21.748245 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 13 20:49:21.748294 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 20:49:21.748348 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 13 20:49:21.748398 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 13 20:49:21.748447 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 13 20:49:21.748497 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:49:21.748546 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:49:21.750827 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 13 20:49:21.750886 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:49:21.750943 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:49:21.750993 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:49:21.751042 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:49:21.751094 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:49:21.751143 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:49:21.751193 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:49:21.751241 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:49:21.751292 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:49:21.751343 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:49:21.751393 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:49:21.751443 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:49:21.751492 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:49:21.751542 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:49:21.751593 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:49:21.751642 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:49:21.751691 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:49:21.751744 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:49:21.751861 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:49:21.751931 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:49:21.751998 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:49:21.752050 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:49:21.752100 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:49:21.752149 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:49:21.752198 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:49:21.752250 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:49:21.752299 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:49:21.752350 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:49:21.752400 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:49:21.752448 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:49:21.752497 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:49:21.752546 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:49:21.752597 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:49:21.752647 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:49:21.752915 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:49:21.752976 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:49:21.753029 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:49:21.753081 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:49:21.753133 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:49:21.753184 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:49:21.753238 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:49:21.753290 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:49:21.753346 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:49:21.753397 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:49:21.753448 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:49:21.753499 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:49:21.753550 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:49:21.753601 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:49:21.753655 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:49:21.753705 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:49:21.753756 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:49:21.753822 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:49:21.753874 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:49:21.753923 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:49:21.753974 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:49:21.754023 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:49:21.754076 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:49:21.754125 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:49:21.754175 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:49:21.754225 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:49:21.754285 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:49:21.754354 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:49:21.754406 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:49:21.754456 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:49:21.754508 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 
20:49:21.754557 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:49:21.754606 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:49:21.754656 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:49:21.754705 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:49:21.754754 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:49:21.754845 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:49:21.754895 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:49:21.754948 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:49:21.754998 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:49:21.755047 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:49:21.755096 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:49:21.755104 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jan 13 20:49:21.755111 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jan 13 20:49:21.755118 kernel: ACPI: PCI: Interrupt link LNKB disabled Jan 13 20:49:21.755124 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 13 20:49:21.755131 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jan 13 20:49:21.755137 kernel: iommu: Default domain type: Translated Jan 13 20:49:21.755143 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 13 20:49:21.755149 kernel: PCI: Using ACPI for IRQ routing Jan 13 20:49:21.755155 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 13 20:49:21.755161 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jan 13 20:49:21.755167 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jan 13 20:49:21.755216 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jan 13 20:49:21.755265 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Jan 13 20:49:21.755317 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 13 20:49:21.755326 kernel: vgaarb: loaded Jan 13 20:49:21.755332 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jan 13 20:49:21.755338 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jan 13 20:49:21.755344 kernel: clocksource: Switched to clocksource tsc-early Jan 13 20:49:21.755350 kernel: VFS: Disk quotas dquot_6.6.0 Jan 13 20:49:21.755356 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 13 20:49:21.755362 kernel: pnp: PnP ACPI init Jan 13 20:49:21.755413 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jan 13 20:49:21.755462 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jan 13 20:49:21.755507 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jan 13 20:49:21.755555 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jan 13 20:49:21.755603 kernel: pnp 00:06: [dma 2] Jan 13 20:49:21.755653 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jan 13 20:49:21.755699 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jan 13 20:49:21.755746 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jan 13 20:49:21.755754 kernel: pnp: PnP ACPI: found 8 devices Jan 13 20:49:21.755761 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 13 20:49:21.755781 kernel: NET: Registered PF_INET protocol family Jan 13 20:49:21.755787 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 13 20:49:21.755794 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jan 13 20:49:21.755800 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 13 20:49:21.755805 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 13 20:49:21.755814 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jan 13 20:49:21.755820 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jan 13 20:49:21.755826 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:49:21.755832 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 20:49:21.755838 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 13 20:49:21.755844 kernel: NET: Registered PF_XDP protocol family Jan 13 20:49:21.755899 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jan 13 20:49:21.755951 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 13 20:49:21.756005 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 13 20:49:21.756056 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 13 20:49:21.756105 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 13 20:49:21.756154 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 13 20:49:21.756204 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 13 20:49:21.756254 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 13 20:49:21.756310 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 13 20:49:21.756361 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 13 20:49:21.756410 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 13 20:49:21.756460 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 13 20:49:21.756510 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 13 
20:49:21.756560 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 13 20:49:21.756613 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 13 20:49:21.756663 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 13 20:49:21.756712 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 13 20:49:21.757803 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 13 20:49:21.757868 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 13 20:49:21.757921 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 13 20:49:21.757975 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 13 20:49:21.758025 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 13 20:49:21.758075 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jan 13 20:49:21.758124 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:49:21.758173 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:49:21.758222 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758273 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758321 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758369 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758417 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758466 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758515 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758564 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 
13 20:49:21.758613 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758664 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758713 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758761 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758827 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758877 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.758926 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.758975 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759024 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759076 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759126 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759175 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759224 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759273 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759324 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759374 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759422 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759474 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759523 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759571 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759621 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.759669 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.759718 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.762786 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.762850 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.762906 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.762958 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763009 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763059 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763109 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763158 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763207 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763257 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763310 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763362 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763410 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763458 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763507 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763555 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763603 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763652 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763700 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763749 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.763811 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763861 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Jan 13 20:49:21.763910 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.763959 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764008 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764057 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764106 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764154 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764204 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764256 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764305 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764353 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764402 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764450 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764499 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764547 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764596 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764645 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.764693 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.764744 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.768750 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.768815 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.768867 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.768918 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.768968 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769019 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.769068 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769118 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.769170 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769221 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.769269 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769322 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 20:49:21.769371 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 20:49:21.769421 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 20:49:21.769471 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jan 13 20:49:21.769520 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 20:49:21.769567 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 20:49:21.769615 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:49:21.769672 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Jan 13 20:49:21.769722 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 20:49:21.769783 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 20:49:21.769836 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 20:49:21.769886 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:49:21.769936 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 20:49:21.769985 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 20:49:21.770035 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 20:49:21.770088 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 
20:49:21.770139 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 20:49:21.770189 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 20:49:21.770238 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 20:49:21.770287 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:49:21.770340 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 20:49:21.770390 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 20:49:21.770439 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:49:21.770487 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 20:49:21.770540 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 20:49:21.770588 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:49:21.770640 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 20:49:21.770689 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 20:49:21.770738 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:49:21.772155 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 20:49:21.772215 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 13 20:49:21.772268 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:49:21.772320 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 20:49:21.772370 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 20:49:21.772421 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:49:21.772475 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 13 20:49:21.772527 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 20:49:21.772577 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 20:49:21.772626 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 13 20:49:21.772679 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:49:21.772731 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 20:49:21.772790 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 20:49:21.772840 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 20:49:21.772889 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:49:21.772941 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 20:49:21.772991 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 20:49:21.773041 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 20:49:21.773090 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:49:21.773140 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 20:49:21.773193 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 20:49:21.773242 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:49:21.773292 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 20:49:21.773353 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 20:49:21.773403 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:49:21.773454 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 20:49:21.773503 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 20:49:21.773554 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:49:21.773605 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 20:49:21.773658 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 20:49:21.773707 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:49:21.773758 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 20:49:21.777160 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 20:49:21.777218 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:49:21.777272 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 20:49:21.777330 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 20:49:21.777380 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 20:49:21.777430 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:49:21.777481 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 20:49:21.777535 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 20:49:21.777584 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 20:49:21.777633 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:49:21.777684 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 20:49:21.777734 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 20:49:21.777791 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 20:49:21.777840 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:49:21.777891 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 20:49:21.777941 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 20:49:21.777993 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:49:21.778044 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 20:49:21.778097 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 20:49:21.778146 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:49:21.778197 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 20:49:21.778247 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 20:49:21.778296 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
20:49:21.778346 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 20:49:21.778397 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 20:49:21.778446 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:49:21.778500 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 20:49:21.778549 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 20:49:21.778598 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:49:21.778649 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 20:49:21.778698 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 20:49:21.778747 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 20:49:21.778846 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:49:21.778897 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 20:49:21.778947 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 20:49:21.778999 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 20:49:21.779048 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:49:21.779099 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 20:49:21.779148 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 20:49:21.779197 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:49:21.779248 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 20:49:21.779297 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 20:49:21.779349 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:49:21.779400 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 20:49:21.779449 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 20:49:21.779501 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 20:49:21.779552 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 20:49:21.779601 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 20:49:21.779649 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:49:21.779700 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 20:49:21.779749 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 20:49:21.779812 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:49:21.779866 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 20:49:21.779916 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 20:49:21.779968 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:49:21.780019 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:49:21.780066 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:49:21.780111 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:49:21.780166 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:49:21.780213 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:49:21.780263 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 20:49:21.780325 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 20:49:21.780376 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 20:49:21.780422 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 20:49:21.780467 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 20:49:21.780512 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 20:49:21.780558 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 20:49:21.780602 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 20:49:21.780653 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 20:49:21.780703 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 20:49:21.780748 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 20:49:21.781984 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 20:49:21.782036 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 20:49:21.782083 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 20:49:21.782134 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 20:49:21.782181 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 20:49:21.782230 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 20:49:21.782281 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 20:49:21.782327 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 20:49:21.782378 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 20:49:21.782424 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 20:49:21.782474 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 20:49:21.782523 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 20:49:21.782573 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 20:49:21.782622 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 20:49:21.782676 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 20:49:21.782731 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 20:49:21.784508 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 20:49:21.784563 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 20:49:21.784609 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 20:49:21.784660 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 20:49:21.784707 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 20:49:21.784754 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 20:49:21.784818 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 20:49:21.784869 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 20:49:21.784919 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 20:49:21.784970 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 20:49:21.785016 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 20:49:21.785066 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 20:49:21.785112 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 20:49:21.785163 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 20:49:21.785212 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 20:49:21.785261 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 20:49:21.785307 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 20:49:21.785358 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 20:49:21.785404 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 20:49:21.785456 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 20:49:21.785505 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 20:49:21.785551 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 20:49:21.785600 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 20:49:21.785647 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 20:49:21.785693 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 20:49:21.785743 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 20:49:21.787634 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 20:49:21.787692 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 20:49:21.787746 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 20:49:21.787809 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 20:49:21.787862 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 20:49:21.787908 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 20:49:21.787958 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 20:49:21.788009 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 20:49:21.788060 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 20:49:21.788106 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 20:49:21.788156 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 20:49:21.788202 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 20:49:21.788257 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 20:49:21.788306 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 20:49:21.788352 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 20:49:21.788401 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 20:49:21.788449 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 20:49:21.788495 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 20:49:21.788544 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 20:49:21.788593 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 20:49:21.788645 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 20:49:21.788691 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 20:49:21.788741 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 20:49:21.788798 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 20:49:21.788850 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 20:49:21.788897 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 20:49:21.788951 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 20:49:21.788998 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 20:49:21.789050 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 20:49:21.789096 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 20:49:21.789153 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 20:49:21.789163 kernel: PCI: CLS 32 bytes, default 64 Jan 13 20:49:21.789172 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 20:49:21.789179 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 20:49:21.789185 kernel: clocksource: Switched to clocksource tsc Jan 13 20:49:21.789191 kernel: Initialise system trusted keyrings Jan 13 20:49:21.789198 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 20:49:21.789204 kernel: Key type asymmetric registered Jan 13 20:49:21.789210 kernel: Asymmetric key parser 'x509' registered Jan 13 20:49:21.789216 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 20:49:21.789223 kernel: io scheduler mq-deadline registered Jan 13 20:49:21.789231 kernel: io scheduler kyber registered Jan 13 20:49:21.789237 kernel: io scheduler bfq registered Jan 13 20:49:21.789290 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 20:49:21.789345 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.789398 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 20:49:21.789449 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.789502 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 20:49:21.789553 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.789607 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 20:49:21.789659 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.789711 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 20:49:21.789761 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790203 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 20:49:21.790257 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790317 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 20:49:21.790368 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790420 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 20:49:21.790471 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790522 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 20:49:21.790575 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790627 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 20:49:21.790677 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790729 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 20:49:21.790786 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790838 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 20:49:21.790888 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.790942 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 20:49:21.790994 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791046 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 20:49:21.791096 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791148 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 20:49:21.791201 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791253 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 20:49:21.791304 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791356 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 20:49:21.791406 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791457 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 20:49:21.791510 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791562 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 20:49:21.791612 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.791662 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 20:49:21.791713 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792007 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 20:49:21.792075 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792130 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 20:49:21.792184 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792235 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 20:49:21.792287 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792355 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 20:49:21.792409 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792460 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 20:49:21.792510 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792561 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 20:49:21.792610 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.792660 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 20:49:21.792712 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793319 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 20:49:21.793388 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793442 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 20:49:21.793493 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793548 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 20:49:21.793598 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793649 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 20:49:21.793699 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793748 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 20:49:21.793812 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 20:49:21.793825 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 20:49:21.793832 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 20:49:21.793839 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 20:49:21.793845 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 20:49:21.793852 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 20:49:21.793858 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 20:49:21.793911 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 20:49:21.793959 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T20:49:21 UTC (1736801361) Jan 13 20:49:21.794003 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 20:49:21.794012 kernel: intel_pstate: CPU model not supported Jan 13 20:49:21.794018 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 20:49:21.794025 kernel: NET: Registered PF_INET6 protocol family Jan 13 20:49:21.794031 kernel: Segment Routing with IPv6 Jan 13 20:49:21.794037 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 20:49:21.794044 kernel: NET: Registered PF_PACKET protocol family Jan 13 20:49:21.794050 kernel: Key type dns_resolver registered Jan 13 20:49:21.794058 kernel: IPI shorthand broadcast: enabled Jan 13 20:49:21.794064 kernel: sched_clock: Marking stable (876438053, 223432303)->(1157339130, -57468774) Jan 13 20:49:21.794070 kernel: registered taskstats version 1 Jan 13 20:49:21.794076 kernel: Loading compiled-in X.509 certificates Jan 13 20:49:21.794083 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 98739e9049f62881f4df7ffd1e39335f7f55b344' Jan 13 20:49:21.794089 kernel: Key type .fscrypt registered Jan 13 20:49:21.794095 kernel: Key type fscrypt-provisioning registered Jan 13 20:49:21.794101 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 20:49:21.794109 kernel: ima: Allocated hash algorithm: sha1 Jan 13 20:49:21.794116 kernel: ima: No architecture policies found Jan 13 20:49:21.794122 kernel: clk: Disabling unused clocks Jan 13 20:49:21.794128 kernel: Freeing unused kernel image (initmem) memory: 42976K Jan 13 20:49:21.794134 kernel: Write protecting the kernel read-only data: 36864k Jan 13 20:49:21.794140 kernel: Freeing unused kernel image (rodata/data gap) memory: 1840K Jan 13 20:49:21.794146 kernel: Run /init as init process Jan 13 20:49:21.794152 kernel: with arguments: Jan 13 20:49:21.794159 kernel: /init Jan 13 20:49:21.794165 kernel: with environment: Jan 13 20:49:21.794172 kernel: HOME=/ Jan 13 20:49:21.794178 kernel: TERM=linux Jan 13 20:49:21.794184 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 20:49:21.794192 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:49:21.794200 systemd[1]: Detected virtualization vmware. Jan 13 20:49:21.794207 systemd[1]: Detected architecture x86-64. Jan 13 20:49:21.794213 systemd[1]: Running in initrd. Jan 13 20:49:21.794219 systemd[1]: No hostname configured, using default hostname. Jan 13 20:49:21.794226 systemd[1]: Hostname set to . Jan 13 20:49:21.794233 systemd[1]: Initializing machine ID from random generator. Jan 13 20:49:21.794239 systemd[1]: Queued start job for default target initrd.target. Jan 13 20:49:21.794246 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:49:21.794252 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 20:49:21.794259 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 20:49:21.794266 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:49:21.794273 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 20:49:21.794279 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 20:49:21.794287 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 20:49:21.794294 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 20:49:21.794300 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:49:21.794306 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:49:21.794313 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:49:21.794320 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:49:21.794327 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:49:21.794333 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:49:21.794339 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:49:21.794346 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:49:21.794352 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 20:49:21.794358 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 20:49:21.794365 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:49:21.794371 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:49:21.794378 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 20:49:21.794384 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:49:21.794391 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 20:49:21.794397 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:49:21.794403 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 20:49:21.794409 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 20:49:21.794416 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:49:21.794422 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:49:21.794428 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:49:21.794436 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 20:49:21.794454 systemd-journald[217]: Collecting audit messages is disabled. Jan 13 20:49:21.794470 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:49:21.794477 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 20:49:21.794485 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:49:21.794491 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:49:21.794498 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:49:21.794504 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:49:21.794512 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:49:21.794518 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:49:21.794525 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Jan 13 20:49:21.794531 kernel: Bridge firewalling registered Jan 13 20:49:21.794538 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 20:49:21.794544 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:49:21.794551 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:49:21.794558 systemd-journald[217]: Journal started Jan 13 20:49:21.794573 systemd-journald[217]: Runtime Journal (/run/log/journal/355c9f29c2614988856e80866a2d421f) is 4.8M, max 38.7M, 33.8M free. Jan 13 20:49:21.751994 systemd-modules-load[218]: Inserted module 'overlay' Jan 13 20:49:21.789839 systemd-modules-load[218]: Inserted module 'br_netfilter' Jan 13 20:49:21.796588 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:49:21.800450 dracut-cmdline[234]: dracut-dracut-053 Jan 13 20:49:21.801856 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 20:49:21.804859 dracut-cmdline[234]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 20:49:21.803069 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 20:49:21.808053 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:49:21.817906 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:49:21.821839 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 13 20:49:21.838503 systemd-resolved[288]: Positive Trust Anchors: Jan 13 20:49:21.838513 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 20:49:21.838535 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:49:21.840171 systemd-resolved[288]: Defaulting to hostname 'linux'. Jan 13 20:49:21.840738 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:49:21.840914 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:49:21.850774 kernel: SCSI subsystem initialized Jan 13 20:49:21.856777 kernel: Loading iSCSI transport class v2.0-870. Jan 13 20:49:21.863785 kernel: iscsi: registered transport (tcp) Jan 13 20:49:21.876788 kernel: iscsi: registered transport (qla4xxx) Jan 13 20:49:21.876830 kernel: QLogic iSCSI HBA Driver Jan 13 20:49:21.896655 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 20:49:21.900914 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 20:49:21.916058 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 13 20:49:21.916135 kernel: device-mapper: uevent: version 1.0.3 Jan 13 20:49:21.917246 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 13 20:49:21.948786 kernel: raid6: avx2x4 gen() 48679 MB/s Jan 13 20:49:21.965779 kernel: raid6: avx2x2 gen() 53522 MB/s Jan 13 20:49:21.982951 kernel: raid6: avx2x1 gen() 44922 MB/s Jan 13 20:49:21.982971 kernel: raid6: using algorithm avx2x2 gen() 53522 MB/s Jan 13 20:49:22.000950 kernel: raid6: .... xor() 31345 MB/s, rmw enabled Jan 13 20:49:22.000971 kernel: raid6: using avx2x2 recovery algorithm Jan 13 20:49:22.013777 kernel: xor: automatically using best checksumming function avx Jan 13 20:49:22.111782 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 20:49:22.117151 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:49:22.121877 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:49:22.129156 systemd-udevd[434]: Using default interface naming scheme 'v255'. Jan 13 20:49:22.131679 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:49:22.138901 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 20:49:22.145634 dracut-pre-trigger[439]: rd.md=0: removing MD RAID activation Jan 13 20:49:22.160768 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:49:22.164878 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 20:49:22.234989 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:49:22.240933 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 20:49:22.253939 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 20:49:22.254628 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 13 20:49:22.255140 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:49:22.255423 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 20:49:22.261909 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 13 20:49:22.271587 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:49:22.303775 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 13 20:49:22.305772 kernel: vmw_pvscsi: using 64bit dma Jan 13 20:49:22.308772 kernel: vmw_pvscsi: max_id: 16 Jan 13 20:49:22.308790 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 13 20:49:22.310833 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 13 20:49:22.316223 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 13 20:49:22.326634 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 13 20:49:22.326645 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 13 20:49:22.326653 kernel: vmw_pvscsi: using MSI-X Jan 13 20:49:22.326661 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 13 20:49:22.326679 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 13 20:49:22.328792 kernel: cryptd: max_cpu_qlen set to 1000 Jan 13 20:49:22.332826 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 13 20:49:22.338305 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 13 20:49:22.338389 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 13 20:49:22.338462 kernel: libata version 3.00 loaded. Jan 13 20:49:22.340794 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 13 20:49:22.361116 kernel: scsi host1: ata_piix Jan 13 20:49:22.361194 kernel: AVX2 version of gcm_enc/dec engaged. 
Jan 13 20:49:22.361204 kernel: AES CTR mode by8 optimization enabled Jan 13 20:49:22.361211 kernel: scsi host2: ata_piix Jan 13 20:49:22.361276 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 13 20:49:22.361285 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 13 20:49:22.345319 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:49:22.345384 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:49:22.345568 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:49:22.345660 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 20:49:22.345722 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:49:22.345831 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:49:22.351380 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:49:22.368483 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:49:22.372903 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 20:49:22.381272 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 13 20:49:22.514837 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 13 20:49:22.518787 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 13 20:49:22.528019 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 13 20:49:22.531932 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 13 20:49:22.532018 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 13 20:49:22.532082 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 13 20:49:22.532141 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 13 20:49:22.532200 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:49:22.532209 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 13 20:49:22.550847 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 13 20:49:22.559007 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 13 20:49:22.559020 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (491) Jan 13 20:49:22.559031 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 13 20:49:22.563527 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 13 20:49:22.564937 kernel: BTRFS: device fsid 5e7921ba-229a-48a0-bc77-9b30aaa34aeb devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (488) Jan 13 20:49:22.567293 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 13 20:49:22.570364 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 20:49:22.572427 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 13 20:49:22.572567 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 13 20:49:22.578926 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 13 20:49:22.602975 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:49:22.609780 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:49:23.609777 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 20:49:23.609842 disk-uuid[589]: The operation has completed successfully. Jan 13 20:49:23.646067 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 20:49:23.646118 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 20:49:23.650854 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 13 20:49:23.652543 sh[607]: Success Jan 13 20:49:23.660776 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 13 20:49:23.703466 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 20:49:23.715579 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 20:49:23.715930 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 20:49:23.730072 kernel: BTRFS info (device dm-0): first mount of filesystem 5e7921ba-229a-48a0-bc77-9b30aaa34aeb Jan 13 20:49:23.730093 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:49:23.730102 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 20:49:23.731951 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 20:49:23.731965 kernel: BTRFS info (device dm-0): using free space tree Jan 13 20:49:23.738777 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 13 20:49:23.740520 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 13 20:49:23.749835 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 13 20:49:23.751031 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 13 20:49:23.772775 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:23.772810 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:49:23.772819 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:49:23.791787 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:49:23.797935 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 20:49:23.798791 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:23.801086 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 13 20:49:23.805884 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 20:49:23.818061 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:49:23.822895 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Jan 13 20:49:23.884467 ignition[667]: Ignition 2.20.0 Jan 13 20:49:23.884473 ignition[667]: Stage: fetch-offline Jan 13 20:49:23.884606 ignition[667]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:23.884613 ignition[667]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:23.884671 ignition[667]: parsed url from cmdline: "" Jan 13 20:49:23.884674 ignition[667]: no config URL provided Jan 13 20:49:23.884678 ignition[667]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 20:49:23.884684 ignition[667]: no config at "/usr/lib/ignition/user.ign" Jan 13 20:49:23.885468 ignition[667]: config successfully fetched Jan 13 20:49:23.885485 ignition[667]: parsing config with SHA512: 88f721b84580e4882dd4a781389762e99cb01553a468b81557acb04e18b8c59954baaed9d2cdd9defb9eabe6553b674261a1f47770423b011308c178b6a6003e Jan 13 20:49:23.888058 unknown[667]: fetched base config from "system" Jan 13 20:49:23.888301 ignition[667]: fetch-offline: fetch-offline passed Jan 13 20:49:23.888065 unknown[667]: fetched user config from "vmware" Jan 13 20:49:23.888347 ignition[667]: Ignition finished successfully Jan 13 20:49:23.889827 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:49:23.895190 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:49:23.900904 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 20:49:23.912637 systemd-networkd[804]: lo: Link UP Jan 13 20:49:23.912644 systemd-networkd[804]: lo: Gained carrier Jan 13 20:49:23.913336 systemd-networkd[804]: Enumeration completed Jan 13 20:49:23.913522 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:49:23.913601 systemd-networkd[804]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. 
Jan 13 20:49:23.916987 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 20:49:23.917099 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 20:49:23.913776 systemd[1]: Reached target network.target - Network. Jan 13 20:49:23.913871 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jan 13 20:49:23.916752 systemd-networkd[804]: ens192: Link UP Jan 13 20:49:23.916756 systemd-networkd[804]: ens192: Gained carrier Jan 13 20:49:23.920875 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 20:49:23.928796 ignition[807]: Ignition 2.20.0 Jan 13 20:49:23.928804 ignition[807]: Stage: kargs Jan 13 20:49:23.928914 ignition[807]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:23.928920 ignition[807]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:23.929426 ignition[807]: kargs: kargs passed Jan 13 20:49:23.929452 ignition[807]: Ignition finished successfully Jan 13 20:49:23.930543 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 20:49:23.940061 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 20:49:23.947128 ignition[814]: Ignition 2.20.0 Jan 13 20:49:23.947134 ignition[814]: Stage: disks Jan 13 20:49:23.947241 ignition[814]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:23.947247 ignition[814]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:23.947798 ignition[814]: disks: disks passed Jan 13 20:49:23.948506 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 20:49:23.947837 ignition[814]: Ignition finished successfully Jan 13 20:49:23.948907 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 20:49:23.949007 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Jan 13 20:49:23.949101 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:49:23.949181 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:49:23.949267 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:49:23.952868 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 20:49:23.963984 systemd-fsck[822]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 20:49:23.965532 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 20:49:23.968885 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 13 20:49:24.029806 kernel: EXT4-fs (sda9): mounted filesystem 84bcd1b2-5573-4e91-8fd5-f97782397085 r/w with ordered data mode. Quota mode: none. Jan 13 20:49:24.029786 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 20:49:24.030140 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 20:49:24.033804 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:49:24.035462 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 20:49:24.035777 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 20:49:24.035804 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 20:49:24.035817 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:49:24.039875 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 20:49:24.041781 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (830) Jan 13 20:49:24.041845 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 13 20:49:24.046034 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:24.046058 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:49:24.046072 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:49:24.048965 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:49:24.049966 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 20:49:24.074849 initrd-setup-root[854]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 20:49:24.078271 initrd-setup-root[861]: cut: /sysroot/etc/group: No such file or directory Jan 13 20:49:24.080438 initrd-setup-root[868]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 20:49:24.082341 initrd-setup-root[875]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 20:49:24.157998 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 20:49:24.161895 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 20:49:24.163869 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 20:49:24.167882 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:24.183504 ignition[943]: INFO : Ignition 2.20.0 Jan 13 20:49:24.183504 ignition[943]: INFO : Stage: mount Jan 13 20:49:24.183889 ignition[943]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:24.183889 ignition[943]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:24.184134 ignition[943]: INFO : mount: mount passed Jan 13 20:49:24.184370 ignition[943]: INFO : Ignition finished successfully Jan 13 20:49:24.186232 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 13 20:49:24.186646 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 13 20:49:24.189863 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Jan 13 20:49:24.728815 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 13 20:49:24.733965 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 20:49:24.741785 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (954) Jan 13 20:49:24.744900 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 20:49:24.744922 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 20:49:24.744933 kernel: BTRFS info (device sda6): using free space tree Jan 13 20:49:24.748774 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 20:49:24.748887 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 20:49:24.766299 ignition[971]: INFO : Ignition 2.20.0 Jan 13 20:49:24.766299 ignition[971]: INFO : Stage: files Jan 13 20:49:24.766585 ignition[971]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:24.766585 ignition[971]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:24.766877 ignition[971]: DEBUG : files: compiled without relabeling support, skipping Jan 13 20:49:24.767360 ignition[971]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 13 20:49:24.767360 ignition[971]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 13 20:49:24.769263 ignition[971]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 13 20:49:24.769439 ignition[971]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 13 20:49:24.769575 ignition[971]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 13 20:49:24.769516 unknown[971]: wrote ssh authorized keys file for user: core Jan 13 20:49:24.770921 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 13 20:49:24.771136 
ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jan 13 20:49:24.808261 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 13 20:49:24.895249 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 20:49:24.895522 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 20:49:24.896580 ignition[971]: INFO : files: 
createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw" Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw" Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw" Jan 13 20:49:24.896580 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1 Jan 13 20:49:25.238964 systemd-networkd[804]: ens192: Gained IPv6LL Jan 13 20:49:25.356097 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 13 20:49:25.584679 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw" Jan 13 20:49:25.584990 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 13 20:49:25.584990 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 13 20:49:25.584990 ignition[971]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(c): 
[finished] processing unit "prepare-helm.service" Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Jan 13 20:49:25.585516 ignition[971]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Jan 13 20:49:25.623387 ignition[971]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Jan 13 20:49:25.625628 ignition[971]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Jan 13 20:49:25.626112 ignition[971]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Jan 13 20:49:25.626112 ignition[971]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Jan 13 20:49:25.626112 ignition[971]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Jan 13 20:49:25.626112 ignition[971]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 13 20:49:25.626112 ignition[971]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 13 20:49:25.626112 ignition[971]: INFO : files: files passed Jan 13 20:49:25.626112 ignition[971]: INFO : Ignition finished successfully Jan 13 20:49:25.627815 systemd[1]: Finished ignition-files.service - Ignition (files). 
Jan 13 20:49:25.632858 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 13 20:49:25.634364 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 13 20:49:25.634909 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 13 20:49:25.635089 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 13 20:49:25.640245 initrd-setup-root-after-ignition[1001]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:49:25.640245 initrd-setup-root-after-ignition[1001]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:49:25.641349 initrd-setup-root-after-ignition[1005]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 20:49:25.642018 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 20:49:25.642446 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 13 20:49:25.647863 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 13 20:49:25.659202 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 13 20:49:25.659258 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 13 20:49:25.659564 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 13 20:49:25.659683 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 13 20:49:25.659920 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 13 20:49:25.660325 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 13 20:49:25.669324 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 20:49:25.673872 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... 
Jan 13 20:49:25.678991 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:49:25.679157 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 20:49:25.679318 systemd[1]: Stopped target timers.target - Timer Units. Jan 13 20:49:25.679508 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 13 20:49:25.679582 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 20:49:25.679881 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 13 20:49:25.680020 systemd[1]: Stopped target basic.target - Basic System. Jan 13 20:49:25.680153 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 13 20:49:25.680296 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 20:49:25.680446 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 13 20:49:25.680590 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 13 20:49:25.680728 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 13 20:49:25.681860 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 13 20:49:25.682083 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 13 20:49:25.682252 systemd[1]: Stopped target swap.target - Swaps. Jan 13 20:49:25.682387 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 13 20:49:25.682447 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 13 20:49:25.682807 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:49:25.683031 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:49:25.683195 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 13 20:49:25.683256 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Jan 13 20:49:25.683498 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 13 20:49:25.683575 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 13 20:49:25.683923 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 13 20:49:25.684001 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 20:49:25.684221 systemd[1]: Stopped target paths.target - Path Units. Jan 13 20:49:25.684464 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 13 20:49:25.687784 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 20:49:25.687971 systemd[1]: Stopped target slices.target - Slice Units. Jan 13 20:49:25.688149 systemd[1]: Stopped target sockets.target - Socket Units. Jan 13 20:49:25.688311 systemd[1]: iscsid.socket: Deactivated successfully. Jan 13 20:49:25.688376 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 20:49:25.688585 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 13 20:49:25.688629 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 20:49:25.688868 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 13 20:49:25.688929 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 20:49:25.689171 systemd[1]: ignition-files.service: Deactivated successfully. Jan 13 20:49:25.689225 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 13 20:49:25.696884 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 13 20:49:25.698887 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 13 20:49:25.698991 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 13 20:49:25.699087 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Jan 13 20:49:25.699430 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 13 20:49:25.699512 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 20:49:25.702369 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 13 20:49:25.702458 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 13 20:49:25.705843 ignition[1025]: INFO : Ignition 2.20.0 Jan 13 20:49:25.708945 ignition[1025]: INFO : Stage: umount Jan 13 20:49:25.708945 ignition[1025]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 20:49:25.708945 ignition[1025]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 20:49:25.708945 ignition[1025]: INFO : umount: umount passed Jan 13 20:49:25.708945 ignition[1025]: INFO : Ignition finished successfully Jan 13 20:49:25.709964 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 13 20:49:25.710040 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 13 20:49:25.710255 systemd[1]: Stopped target network.target - Network. Jan 13 20:49:25.710338 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 13 20:49:25.710365 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 13 20:49:25.710464 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 13 20:49:25.710485 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 13 20:49:25.710582 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 13 20:49:25.710603 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 13 20:49:25.710697 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 13 20:49:25.710718 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 13 20:49:25.710932 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 13 20:49:25.711304 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... 
Jan 13 20:49:25.715943 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 13 20:49:25.716006 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 13 20:49:25.716723 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 13 20:49:25.716959 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 13 20:49:25.716977 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:49:25.721814 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 13 20:49:25.721912 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 13 20:49:25.721941 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 20:49:25.722067 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Jan 13 20:49:25.722087 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 20:49:25.722240 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:49:25.722437 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 13 20:49:25.723639 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 13 20:49:25.725737 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 13 20:49:25.726099 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:49:25.726355 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 13 20:49:25.726538 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 13 20:49:25.726670 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 13 20:49:25.726691 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:49:25.729215 systemd[1]: network-cleanup.service: Deactivated successfully. 
Jan 13 20:49:25.729285 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 13 20:49:25.735182 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 13 20:49:25.735254 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:49:25.735471 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 13 20:49:25.735493 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 13 20:49:25.735593 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 13 20:49:25.735611 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 20:49:25.735700 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 13 20:49:25.735721 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 13 20:49:25.735946 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 13 20:49:25.735967 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 13 20:49:25.736270 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 20:49:25.736290 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 20:49:25.740846 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 13 20:49:25.740954 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 13 20:49:25.740981 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:49:25.741104 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 13 20:49:25.741125 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:49:25.741242 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 13 20:49:25.741262 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. 
Jan 13 20:49:25.741377 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 20:49:25.741397 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:49:25.743781 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 13 20:49:25.743833 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 13 20:49:25.827217 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 13 20:49:25.827296 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 13 20:49:25.827674 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 13 20:49:25.827851 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 13 20:49:25.827883 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 13 20:49:25.831870 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 13 20:49:25.837002 systemd[1]: Switching root. Jan 13 20:49:25.874992 systemd-journald[217]: Journal stopped Jan 13 20:49:27.269353 systemd-journald[217]: Received SIGTERM from PID 1 (systemd). Jan 13 20:49:27.269376 kernel: SELinux: policy capability network_peer_controls=1 Jan 13 20:49:27.269385 kernel: SELinux: policy capability open_perms=1 Jan 13 20:49:27.269390 kernel: SELinux: policy capability extended_socket_class=1 Jan 13 20:49:27.269395 kernel: SELinux: policy capability always_check_network=0 Jan 13 20:49:27.269401 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 13 20:49:27.269408 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 13 20:49:27.269414 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 13 20:49:27.269419 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 13 20:49:27.269425 systemd[1]: Successfully loaded SELinux policy in 37.329ms. 
Jan 13 20:49:27.269432 kernel: audit: type=1403 audit(1736801366.359:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 13 20:49:27.269439 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 7.402ms. Jan 13 20:49:27.269445 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 20:49:27.269453 systemd[1]: Detected virtualization vmware. Jan 13 20:49:27.269460 systemd[1]: Detected architecture x86-64. Jan 13 20:49:27.269466 systemd[1]: Detected first boot. Jan 13 20:49:27.269473 systemd[1]: Initializing machine ID from random generator. Jan 13 20:49:27.269480 zram_generator::config[1067]: No configuration found. Jan 13 20:49:27.269487 systemd[1]: Populated /etc with preset unit settings. Jan 13 20:49:27.269494 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:49:27.269502 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Jan 13 20:49:27.269508 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 13 20:49:27.269514 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 13 20:49:27.269521 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 13 20:49:27.269528 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 13 20:49:27.269535 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 13 20:49:27.269542 systemd[1]: Created slice system-getty.slice - Slice /system/getty. 
Jan 13 20:49:27.269548 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 13 20:49:27.269555 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 13 20:49:27.269561 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 13 20:49:27.269568 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 13 20:49:27.269576 systemd[1]: Created slice user.slice - User and Session Slice. Jan 13 20:49:27.269583 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 20:49:27.269589 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 20:49:27.269596 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 13 20:49:27.269602 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 13 20:49:27.269609 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 13 20:49:27.269616 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 20:49:27.269622 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 13 20:49:27.269630 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 20:49:27.269637 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 13 20:49:27.269645 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 13 20:49:27.269652 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 13 20:49:27.269659 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 13 20:49:27.269666 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. 
Jan 13 20:49:27.269673 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 20:49:27.269679 systemd[1]: Reached target slices.target - Slice Units. Jan 13 20:49:27.269688 systemd[1]: Reached target swap.target - Swaps. Jan 13 20:49:27.269695 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 13 20:49:27.269701 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 13 20:49:27.269708 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 20:49:27.269715 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 20:49:27.269723 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 20:49:27.269730 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 13 20:49:27.269737 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 13 20:49:27.269744 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 13 20:49:27.269751 systemd[1]: Mounting media.mount - External Media Directory... Jan 13 20:49:27.269758 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:49:27.271580 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 13 20:49:27.271592 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 13 20:49:27.271602 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 13 20:49:27.271611 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 13 20:49:27.271618 systemd[1]: Reached target machines.target - Containers. Jan 13 20:49:27.271625 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Jan 13 20:49:27.271632 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Jan 13 20:49:27.271639 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 20:49:27.271646 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 13 20:49:27.271653 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:49:27.271661 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 13 20:49:27.271668 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:49:27.271675 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 13 20:49:27.271682 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:49:27.271689 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 13 20:49:27.271696 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 13 20:49:27.271703 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 13 20:49:27.271710 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 13 20:49:27.271717 systemd[1]: Stopped systemd-fsck-usr.service. Jan 13 20:49:27.271725 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 20:49:27.271732 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 20:49:27.271739 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 13 20:49:27.271746 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 13 20:49:27.271753 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 20:49:27.271760 systemd[1]: verity-setup.service: Deactivated successfully. 
Jan 13 20:49:27.271782 systemd[1]: Stopped verity-setup.service. Jan 13 20:49:27.271792 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:49:27.271802 kernel: fuse: init (API version 7.39) Jan 13 20:49:27.271808 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 13 20:49:27.271815 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 13 20:49:27.271822 systemd[1]: Mounted media.mount - External Media Directory. Jan 13 20:49:27.271829 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 13 20:49:27.271836 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 13 20:49:27.271843 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 13 20:49:27.271850 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 20:49:27.271856 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 13 20:49:27.271865 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 13 20:49:27.271872 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:49:27.271878 kernel: loop: module loaded Jan 13 20:49:27.271885 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:49:27.271892 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 20:49:27.271899 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 20:49:27.271906 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 13 20:49:27.271913 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 13 20:49:27.271920 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:49:27.271939 systemd-journald[1147]: Collecting audit messages is disabled. 
Jan 13 20:49:27.271955 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:49:27.271963 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 13 20:49:27.271970 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 13 20:49:27.271979 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 13 20:49:27.271986 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 13 20:49:27.271993 systemd-journald[1147]: Journal started Jan 13 20:49:27.272008 systemd-journald[1147]: Runtime Journal (/run/log/journal/6f09051b8f5146ab81585f84a882a591) is 4.8M, max 38.7M, 33.8M free. Jan 13 20:49:27.005342 systemd[1]: Queued start job for default target multi-user.target. Jan 13 20:49:27.064745 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 13 20:49:27.065032 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 13 20:49:27.272508 jq[1134]: true Jan 13 20:49:27.273985 jq[1155]: true Jan 13 20:49:27.286850 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 13 20:49:27.286912 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 13 20:49:27.286931 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:49:27.286942 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 13 20:49:27.295092 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 13 20:49:27.306518 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 13 20:49:27.306558 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Jan 13 20:49:27.307780 kernel: ACPI: bus type drm_connector registered Jan 13 20:49:27.323771 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 13 20:49:27.325783 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 20:49:27.330067 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 13 20:49:27.330094 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 20:49:27.333803 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 13 20:49:27.338796 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:49:27.341111 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:49:27.344593 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 20:49:27.345917 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 20:49:27.346174 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 20:49:27.346344 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 13 20:49:27.346501 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 13 20:49:27.346718 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 13 20:49:27.367904 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 13 20:49:27.368864 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 20:49:27.374013 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 13 20:49:27.374218 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. 
Jan 13 20:49:27.379945 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 13 20:49:27.382810 kernel: loop0: detected capacity change from 0 to 211296 Jan 13 20:49:27.386048 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 13 20:49:27.396407 systemd-journald[1147]: Time spent on flushing to /var/log/journal/6f09051b8f5146ab81585f84a882a591 is 43.229ms for 1839 entries. Jan 13 20:49:27.396407 systemd-journald[1147]: System Journal (/var/log/journal/6f09051b8f5146ab81585f84a882a591) is 8.0M, max 584.8M, 576.8M free. Jan 13 20:49:27.454329 systemd-journald[1147]: Received client request to flush runtime journal. Jan 13 20:49:27.454366 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 13 20:49:27.441028 systemd-tmpfiles[1178]: ACLs are not supported, ignoring. Jan 13 20:49:27.423324 ignition[1156]: Ignition 2.20.0 Jan 13 20:49:27.441037 systemd-tmpfiles[1178]: ACLs are not supported, ignoring. Jan 13 20:49:27.423490 ignition[1156]: deleting config from guestinfo properties Jan 13 20:49:27.445842 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:49:27.449184 ignition[1156]: Successfully deleted config Jan 13 20:49:27.454900 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 13 20:49:27.455175 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 13 20:49:27.463062 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:49:27.463378 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Jan 13 20:49:27.463783 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 20:49:27.466828 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 13 20:49:27.467519 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. 
Jan 13 20:49:27.467911 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 13 20:49:27.483782 kernel: loop1: detected capacity change from 0 to 138184 Jan 13 20:49:27.485914 udevadm[1230]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Jan 13 20:49:27.498702 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 13 20:49:27.505891 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:49:27.518617 systemd-tmpfiles[1233]: ACLs are not supported, ignoring. Jan 13 20:49:27.518837 systemd-tmpfiles[1233]: ACLs are not supported, ignoring. Jan 13 20:49:27.521433 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:49:27.529780 kernel: loop2: detected capacity change from 0 to 140992 Jan 13 20:49:27.567782 kernel: loop3: detected capacity change from 0 to 2944 Jan 13 20:49:27.649780 kernel: loop4: detected capacity change from 0 to 211296 Jan 13 20:49:27.694955 kernel: loop5: detected capacity change from 0 to 138184 Jan 13 20:49:27.718781 kernel: loop6: detected capacity change from 0 to 140992 Jan 13 20:49:27.744785 kernel: loop7: detected capacity change from 0 to 2944 Jan 13 20:49:27.757210 (sd-merge)[1239]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Jan 13 20:49:27.757485 (sd-merge)[1239]: Merged extensions into '/usr'. Jan 13 20:49:27.761607 systemd[1]: Reloading requested from client PID 1177 ('systemd-sysext') (unit systemd-sysext.service)... Jan 13 20:49:27.761670 systemd[1]: Reloading... Jan 13 20:49:27.828402 zram_generator::config[1262]: No configuration found. Jan 13 20:49:27.944945 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." 
| grep -Po "inet \K[\d.]+") Jan 13 20:49:27.960264 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:49:27.988393 systemd[1]: Reloading finished in 226 ms. Jan 13 20:49:28.014683 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 13 20:49:28.021930 systemd[1]: Starting ensure-sysext.service... Jan 13 20:49:28.024902 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 20:49:28.041389 systemd-tmpfiles[1321]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 13 20:49:28.041591 systemd-tmpfiles[1321]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 13 20:49:28.042100 systemd-tmpfiles[1321]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 13 20:49:28.042264 systemd-tmpfiles[1321]: ACLs are not supported, ignoring. Jan 13 20:49:28.042296 systemd-tmpfiles[1321]: ACLs are not supported, ignoring. Jan 13 20:49:28.044549 systemd[1]: Reloading requested from client PID 1320 ('systemctl') (unit ensure-sysext.service)... Jan 13 20:49:28.044556 systemd[1]: Reloading... Jan 13 20:49:28.056691 systemd-tmpfiles[1321]: Detected autofs mount point /boot during canonicalization of boot. Jan 13 20:49:28.056697 systemd-tmpfiles[1321]: Skipping /boot Jan 13 20:49:28.065049 systemd-tmpfiles[1321]: Detected autofs mount point /boot during canonicalization of boot. Jan 13 20:49:28.065055 systemd-tmpfiles[1321]: Skipping /boot Jan 13 20:49:28.095637 zram_generator::config[1345]: No configuration found. Jan 13 20:49:28.119365 ldconfig[1167]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. 
Jan 13 20:49:28.171291 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:49:28.186241 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:49:28.214294 systemd[1]: Reloading finished in 169 ms. Jan 13 20:49:28.228099 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 13 20:49:28.228551 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 13 20:49:28.233251 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:49:28.238665 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 20:49:28.240921 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 13 20:49:28.243016 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 13 20:49:28.245926 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 13 20:49:28.251948 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:49:28.255345 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 13 20:49:28.256843 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:49:28.258961 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:49:28.267959 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:49:28.269951 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Jan 13 20:49:28.270305 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:49:28.282469 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 13 20:49:28.282579 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:49:28.287988 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:49:28.288083 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:49:28.288457 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 20:49:28.288544 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 20:49:28.288858 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:49:28.288935 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:49:28.291446 systemd-udevd[1411]: Using default interface naming scheme 'v255'. Jan 13 20:49:28.295264 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:49:28.299951 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:49:28.304101 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:49:28.305485 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:49:28.305635 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:49:28.305700 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 20:49:28.306808 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
Jan 13 20:49:28.307254 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jan 13 20:49:28.308077 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 20:49:28.308179 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 20:49:28.317352 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:49:28.320849 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 20:49:28.322205 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 20:49:28.322649 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 20:49:28.324927 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jan 13 20:49:28.325066 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 20:49:28.325528 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 20:49:28.325616 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 20:49:28.325960 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 20:49:28.326040 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 20:49:28.328985 systemd[1]: Finished ensure-sysext.service.
Jan 13 20:49:28.338046 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jan 13 20:49:28.339449 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 13 20:49:28.342262 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Jan 13 20:49:28.342546 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:49:28.346044 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 13 20:49:28.346300 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 13 20:49:28.346383 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 13 20:49:28.349063 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 20:49:28.349414 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 20:49:28.349869 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 13 20:49:28.357345 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jan 13 20:49:28.357566 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jan 13 20:49:28.358281 augenrules[1463]: No rules
Jan 13 20:49:28.362126 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 13 20:49:28.362306 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 13 20:49:28.362620 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jan 13 20:49:28.405240 systemd-resolved[1409]: Positive Trust Anchors:
Jan 13 20:49:28.405250 systemd-resolved[1409]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 20:49:28.405273 systemd-resolved[1409]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 13 20:49:28.412989 systemd-resolved[1409]: Defaulting to hostname 'linux'.
Jan 13 20:49:28.423378 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Jan 13 20:49:28.426281 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 13 20:49:28.426458 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:49:28.434519 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jan 13 20:49:28.434858 systemd[1]: Reached target time-set.target - System Time Set.
Jan 13 20:49:28.457524 systemd-networkd[1457]: lo: Link UP
Jan 13 20:49:28.457529 systemd-networkd[1457]: lo: Gained carrier
Jan 13 20:49:28.459246 systemd-networkd[1457]: Enumeration completed
Jan 13 20:49:28.459379 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 13 20:49:28.459583 systemd-networkd[1457]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Jan 13 20:49:28.459654 systemd[1]: Reached target network.target - Network.
Jan 13 20:49:28.463778 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Jan 13 20:49:28.463937 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Jan 13 20:49:28.463315 systemd-networkd[1457]: ens192: Link UP
Jan 13 20:49:28.463441 systemd-networkd[1457]: ens192: Gained carrier
Jan 13 20:49:28.465006 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jan 13 20:49:28.469842 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Jan 13 20:49:28.470285 systemd-timesyncd[1455]: Network configuration changed, trying to establish connection.
Jan 13 20:49:28.476786 kernel: ACPI: button: Power Button [PWRF]
Jan 13 20:49:28.500922 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1482)
Jan 13 20:49:28.538673 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Jan 13 20:49:28.544782 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Jan 13 20:49:28.545900 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jan 13 20:49:28.555883 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Jan 13 20:49:28.557207 kernel: Guest personality initialized and is active
Jan 13 20:49:28.558277 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jan 13 20:49:28.558302 kernel: Initialized host personality
Jan 13 20:49:28.571825 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input4
Jan 13 20:49:28.581144 (udev-worker)[1458]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Jan 13 20:49:28.586252 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:49:28.589781 kernel: mousedev: PS/2 mouse device common for all mice
Jan 13 20:49:28.591094 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jan 13 20:49:28.608095 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Jan 13 20:49:28.611960 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Jan 13 20:49:28.638784 lvm[1510]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 13 20:49:28.667330 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Jan 13 20:49:28.667844 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:49:28.672850 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Jan 13 20:49:28.675575 lvm[1512]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 13 20:49:28.707936 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Jan 13 20:49:28.803189 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:49:28.803468 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 13 20:49:28.803657 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jan 13 20:49:28.803819 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jan 13 20:49:28.804051 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jan 13 20:49:28.804231 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jan 13 20:49:28.804358 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jan 13 20:49:28.804482 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jan 13 20:49:28.804504 systemd[1]: Reached target paths.target - Path Units.
Jan 13 20:49:28.804604 systemd[1]: Reached target timers.target - Timer Units.
Jan 13 20:49:28.805061 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jan 13 20:49:28.806284 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jan 13 20:49:28.821322 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jan 13 20:49:28.821897 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jan 13 20:49:28.822058 systemd[1]: Reached target sockets.target - Socket Units.
Jan 13 20:49:28.822165 systemd[1]: Reached target basic.target - Basic System.
Jan 13 20:49:28.822293 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jan 13 20:49:28.822310 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jan 13 20:49:28.823226 systemd[1]: Starting containerd.service - containerd container runtime...
Jan 13 20:49:28.825885 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jan 13 20:49:28.828130 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jan 13 20:49:28.830853 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jan 13 20:49:28.830982 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jan 13 20:49:28.832810 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jan 13 20:49:28.833553 jq[1521]: false
Jan 13 20:49:28.836270 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jan 13 20:49:28.838527 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jan 13 20:49:28.840899 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jan 13 20:49:28.844493 systemd[1]: Starting systemd-logind.service - User Login Management...
Jan 13 20:49:28.844851 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jan 13 20:49:28.845315 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jan 13 20:49:28.847906 systemd[1]: Starting update-engine.service - Update Engine...
Jan 13 20:49:28.848910 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jan 13 20:49:28.853855 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Jan 13 20:49:28.862010 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jan 13 20:49:28.863560 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jan 13 20:49:28.863837 systemd[1]: motdgen.service: Deactivated successfully.
Jan 13 20:49:28.863943 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jan 13 20:49:28.872194 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jan 13 20:49:28.872310 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jan 13 20:49:28.888430 jq[1530]: true
Jan 13 20:49:28.891686 update_engine[1529]: I20250113 20:49:28.891182 1529 main.cc:92] Flatcar Update Engine starting
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found loop4
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found loop5
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found loop6
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found loop7
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found sda
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found sda1
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found sda2
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found sda3
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found usr
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found sda4
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found sda6
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found sda7
Jan 13 20:49:28.896824 extend-filesystems[1522]: Found sda9
Jan 13 20:49:28.896824 extend-filesystems[1522]: Checking size of /dev/sda9
Jan 13 20:49:28.896488 (ntainerd)[1540]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Jan 13 20:49:28.900367 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Jan 13 20:49:28.905124 jq[1550]: true
Jan 13 20:49:28.907905 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Jan 13 20:49:28.905632 dbus-daemon[1520]: [system] SELinux support is enabled
Jan 13 20:49:28.908158 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jan 13 20:49:28.911869 tar[1539]: linux-amd64/helm
Jan 13 20:49:28.913837 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jan 13 20:49:28.913870 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jan 13 20:49:28.914324 update_engine[1529]: I20250113 20:49:28.914206 1529 update_check_scheduler.cc:74] Next update check in 11m58s
Jan 13 20:49:28.914604 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jan 13 20:49:28.914618 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jan 13 20:49:28.914928 systemd[1]: Started update-engine.service - Update Engine.
Jan 13 20:49:28.923956 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jan 13 20:49:28.927336 extend-filesystems[1522]: Old size kept for /dev/sda9
Jan 13 20:49:28.927336 extend-filesystems[1522]: Found sr0
Jan 13 20:49:28.933898 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Jan 13 20:49:28.935732 systemd-logind[1528]: Watching system buttons on /dev/input/event1 (Power Button)
Jan 13 20:49:28.935747 systemd-logind[1528]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Jan 13 20:49:28.937031 systemd-logind[1528]: New seat seat0.
Jan 13 20:49:28.939115 systemd[1]: extend-filesystems.service: Deactivated successfully.
Jan 13 20:49:28.939373 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Jan 13 20:49:28.941641 systemd[1]: Started systemd-logind.service - User Login Management.
Jan 13 20:49:28.955206 unknown[1554]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Jan 13 20:49:28.956466 unknown[1554]: Core dump limit set to -1
Jan 13 20:49:28.959809 bash[1578]: Updated "/home/core/.ssh/authorized_keys"
Jan 13 20:49:28.960030 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jan 13 20:49:28.961091 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Jan 13 20:49:28.973780 kernel: NET: Registered PF_VSOCK protocol family
Jan 13 20:49:29.005355 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1465)
Jan 13 20:49:29.087969 locksmithd[1560]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jan 13 20:49:29.108723 sshd_keygen[1559]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Jan 13 20:49:29.143512 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Jan 13 20:49:29.151041 systemd[1]: Starting issuegen.service - Generate /run/issue...
Jan 13 20:49:29.158006 systemd[1]: issuegen.service: Deactivated successfully.
Jan 13 20:49:29.158423 systemd[1]: Finished issuegen.service - Generate /run/issue.
Jan 13 20:49:29.164571 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Jan 13 20:49:29.191073 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Jan 13 20:49:29.198051 systemd[1]: Started getty@tty1.service - Getty on tty1.
Jan 13 20:49:29.199725 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Jan 13 20:49:29.200982 systemd[1]: Reached target getty.target - Login Prompts.
Jan 13 20:49:29.244334 containerd[1540]: time="2025-01-13T20:49:29.244283698Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Jan 13 20:49:29.269093 containerd[1540]: time="2025-01-13T20:49:29.269059336Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270162 containerd[1540]: time="2025-01-13T20:49:29.270138563Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270225 containerd[1540]: time="2025-01-13T20:49:29.270217117Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Jan 13 20:49:29.270262 containerd[1540]: time="2025-01-13T20:49:29.270254713Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Jan 13 20:49:29.270396 containerd[1540]: time="2025-01-13T20:49:29.270387144Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Jan 13 20:49:29.270445 containerd[1540]: time="2025-01-13T20:49:29.270437580Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270515 containerd[1540]: time="2025-01-13T20:49:29.270504692Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270547 containerd[1540]: time="2025-01-13T20:49:29.270540485Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270677 containerd[1540]: time="2025-01-13T20:49:29.270666583Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270710 containerd[1540]: time="2025-01-13T20:49:29.270703850Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270751 containerd[1540]: time="2025-01-13T20:49:29.270741965Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270832 containerd[1540]: time="2025-01-13T20:49:29.270825512Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Jan 13 20:49:29.270913 containerd[1540]: time="2025-01-13T20:49:29.270904062Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Jan 13 20:49:29.271308 containerd[1540]: time="2025-01-13T20:49:29.271070094Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Jan 13 20:49:29.271308 containerd[1540]: time="2025-01-13T20:49:29.271133111Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 20:49:29.271308 containerd[1540]: time="2025-01-13T20:49:29.271142283Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Jan 13 20:49:29.271308 containerd[1540]: time="2025-01-13T20:49:29.271186287Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Jan 13 20:49:29.271308 containerd[1540]: time="2025-01-13T20:49:29.271213891Z" level=info msg="metadata content store policy set" policy=shared
Jan 13 20:49:29.280316 containerd[1540]: time="2025-01-13T20:49:29.280286456Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Jan 13 20:49:29.280847 containerd[1540]: time="2025-01-13T20:49:29.280450728Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Jan 13 20:49:29.280847 containerd[1540]: time="2025-01-13T20:49:29.280470524Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Jan 13 20:49:29.280847 containerd[1540]: time="2025-01-13T20:49:29.280487754Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Jan 13 20:49:29.280847 containerd[1540]: time="2025-01-13T20:49:29.280499428Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Jan 13 20:49:29.280847 containerd[1540]: time="2025-01-13T20:49:29.280632571Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Jan 13 20:49:29.281003 containerd[1540]: time="2025-01-13T20:49:29.280992185Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Jan 13 20:49:29.281115 containerd[1540]: time="2025-01-13T20:49:29.281103196Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Jan 13 20:49:29.281157 containerd[1540]: time="2025-01-13T20:49:29.281149121Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Jan 13 20:49:29.281192 containerd[1540]: time="2025-01-13T20:49:29.281184916Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Jan 13 20:49:29.281225 containerd[1540]: time="2025-01-13T20:49:29.281218227Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Jan 13 20:49:29.281265 containerd[1540]: time="2025-01-13T20:49:29.281257686Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Jan 13 20:49:29.281305 containerd[1540]: time="2025-01-13T20:49:29.281297522Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Jan 13 20:49:29.281342 containerd[1540]: time="2025-01-13T20:49:29.281335506Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Jan 13 20:49:29.281377 containerd[1540]: time="2025-01-13T20:49:29.281369367Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Jan 13 20:49:29.281415 containerd[1540]: time="2025-01-13T20:49:29.281406847Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Jan 13 20:49:29.281453 containerd[1540]: time="2025-01-13T20:49:29.281443135Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281492944Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281516176Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281526172Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281534305Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281541996Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281552845Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281561036Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281567815Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281575053Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281582309Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281591311Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281597774Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281604238Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281612180Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282601 containerd[1540]: time="2025-01-13T20:49:29.281621542Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281641096Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281654584Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281664265Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281702307Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281715555Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281722064Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281729090Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281734591Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281741496Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281747486Z" level=info msg="NRI interface is disabled by configuration."
Jan 13 20:49:29.282955 containerd[1540]: time="2025-01-13T20:49:29.281753092Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.281933261Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.281973438Z" level=info msg="Connect containerd service"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.281997486Z" level=info msg="using legacy CRI server"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282006050Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282088441Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282459544Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282565555Z" level=info msg="Start subscribing containerd event"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282599643Z" level=info msg="Start recovering state"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282642010Z" level=info msg="Start event monitor"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282653763Z" level=info msg="Start snapshots syncer"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282660092Z" level=info msg="Start cni network conf syncer for default"
Jan 13 20:49:29.283121 containerd[1540]: time="2025-01-13T20:49:29.282666154Z" level=info msg="Start streaming server"
Jan 13 20:49:29.283603 containerd[1540]: time="2025-01-13T20:49:29.283426134Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Jan 13 20:49:29.283677 containerd[1540]: time="2025-01-13T20:49:29.283667115Z" level=info msg=serving... address=/run/containerd/containerd.sock
Jan 13 20:49:29.283794 containerd[1540]: time="2025-01-13T20:49:29.283783415Z" level=info msg="containerd successfully booted in 0.040029s"
Jan 13 20:49:29.283840 systemd[1]: Started containerd.service - containerd container runtime.
Jan 13 20:49:29.417158 tar[1539]: linux-amd64/LICENSE
Jan 13 20:49:29.417158 tar[1539]: linux-amd64/README.md
Jan 13 20:49:29.424172 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Jan 13 20:49:29.526971 systemd-networkd[1457]: ens192: Gained IPv6LL
Jan 13 20:49:29.527235 systemd-timesyncd[1455]: Network configuration changed, trying to establish connection.
Jan 13 20:49:29.528092 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Jan 13 20:49:29.528860 systemd[1]: Reached target network-online.target - Network is Online.
Jan 13 20:49:29.534954 systemd[1]: Starting coreos-metadata.service - VMware metadata agent...
Jan 13 20:49:29.537036 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:49:29.540694 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Jan 13 20:49:29.574037 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Jan 13 20:49:29.597383 systemd[1]: coreos-metadata.service: Deactivated successfully.
Jan 13 20:49:29.597514 systemd[1]: Finished coreos-metadata.service - VMware metadata agent.
Jan 13 20:49:29.597954 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Jan 13 20:49:30.472423 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:49:30.472848 systemd[1]: Reached target multi-user.target - Multi-User System.
Jan 13 20:49:30.474820 systemd[1]: Startup finished in 958ms (kernel) + 4.727s (initrd) + 4.150s (userspace) = 9.837s.
Jan 13 20:49:30.477886 (kubelet)[1698]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:49:30.530908 login[1628]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Jan 13 20:49:30.532636 login[1629]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Jan 13 20:49:30.537720 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Jan 13 20:49:30.542921 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Jan 13 20:49:30.544370 systemd-logind[1528]: New session 1 of user core.
Jan 13 20:49:30.548044 systemd-logind[1528]: New session 2 of user core.
Jan 13 20:49:30.552591 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Jan 13 20:49:30.562959 systemd[1]: Starting user@500.service - User Manager for UID 500...
Jan 13 20:49:30.564513 (systemd)[1705]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Jan 13 20:49:30.618551 systemd[1705]: Queued start job for default target default.target.
Jan 13 20:49:30.622557 systemd[1705]: Created slice app.slice - User Application Slice.
Jan 13 20:49:30.622579 systemd[1705]: Reached target paths.target - Paths.
Jan 13 20:49:30.622589 systemd[1705]: Reached target timers.target - Timers.
Jan 13 20:49:30.623314 systemd[1705]: Starting dbus.socket - D-Bus User Message Bus Socket...
Jan 13 20:49:30.633138 systemd[1705]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Jan 13 20:49:30.633697 systemd[1705]: Reached target sockets.target - Sockets.
Jan 13 20:49:30.633709 systemd[1705]: Reached target basic.target - Basic System.
Jan 13 20:49:30.633731 systemd[1705]: Reached target default.target - Main User Target.
Jan 13 20:49:30.633747 systemd[1705]: Startup finished in 65ms.
Jan 13 20:49:30.633846 systemd[1]: Started user@500.service - User Manager for UID 500.
Jan 13 20:49:30.636840 systemd[1]: Started session-1.scope - Session 1 of User core.
Jan 13 20:49:30.637431 systemd[1]: Started session-2.scope - Session 2 of User core.
Jan 13 20:49:31.290899 kubelet[1698]: E0113 20:49:31.290846 1698 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:49:31.292585 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:49:31.292680 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:49:41.543109 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Jan 13 20:49:41.552924 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:49:41.708455 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:49:41.718084 (kubelet)[1747]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:49:41.814520 kubelet[1747]: E0113 20:49:41.814409 1747 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:49:41.817332 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:49:41.817437 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:49:52.067986 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Jan 13 20:49:52.074938 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:49:52.306295 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:49:52.309175 (kubelet)[1763]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:49:52.338384 kubelet[1763]: E0113 20:49:52.338308 1763 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:49:52.339583 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:49:52.339661 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:51:12.779685 systemd-timesyncd[1455]: Contacted time server 172.232.28.194:123 (2.flatcar.pool.ntp.org).
Jan 13 20:51:12.779687 systemd-resolved[1409]: Clock change detected. Flushing caches.
Jan 13 20:51:12.779733 systemd-timesyncd[1455]: Initial clock synchronization to Mon 2025-01-13 20:51:12.779547 UTC.
Jan 13 20:51:15.356649 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Jan 13 20:51:15.364467 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:51:15.603496 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:51:15.606310 (kubelet)[1780]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:51:15.649096 kubelet[1780]: E0113 20:51:15.647295 1780 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:51:15.648451 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:51:15.648526 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:51:21.935421 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Jan 13 20:51:21.936731 systemd[1]: Started sshd@0-139.178.70.104:22-147.75.109.163:44068.service - OpenSSH per-connection server daemon (147.75.109.163:44068).
Jan 13 20:51:21.978400 sshd[1788]: Accepted publickey for core from 147.75.109.163 port 44068 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:51:21.979485 sshd-session[1788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:51:21.983578 systemd-logind[1528]: New session 3 of user core.
Jan 13 20:51:21.989632 systemd[1]: Started session-3.scope - Session 3 of User core.
Jan 13 20:51:22.045414 systemd[1]: Started sshd@1-139.178.70.104:22-147.75.109.163:44076.service - OpenSSH per-connection server daemon (147.75.109.163:44076).
Jan 13 20:51:22.080557 sshd[1793]: Accepted publickey for core from 147.75.109.163 port 44076 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:51:22.081568 sshd-session[1793]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:51:22.084334 systemd-logind[1528]: New session 4 of user core.
Jan 13 20:51:22.090427 systemd[1]: Started session-4.scope - Session 4 of User core.
Jan 13 20:51:22.138593 sshd[1795]: Connection closed by 147.75.109.163 port 44076
Jan 13 20:51:22.138979 sshd-session[1793]: pam_unix(sshd:session): session closed for user core
Jan 13 20:51:22.149047 systemd[1]: sshd@1-139.178.70.104:22-147.75.109.163:44076.service: Deactivated successfully.
Jan 13 20:51:22.149925 systemd[1]: session-4.scope: Deactivated successfully.
Jan 13 20:51:22.150981 systemd-logind[1528]: Session 4 logged out. Waiting for processes to exit.
Jan 13 20:51:22.151829 systemd[1]: Started sshd@2-139.178.70.104:22-147.75.109.163:44082.service - OpenSSH per-connection server daemon (147.75.109.163:44082).
Jan 13 20:51:22.153593 systemd-logind[1528]: Removed session 4.
Jan 13 20:51:22.190582 sshd[1800]: Accepted publickey for core from 147.75.109.163 port 44082 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:51:22.191394 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:51:22.195749 systemd-logind[1528]: New session 5 of user core.
Jan 13 20:51:22.201461 systemd[1]: Started session-5.scope - Session 5 of User core.
Jan 13 20:51:22.248165 sshd[1802]: Connection closed by 147.75.109.163 port 44082
Jan 13 20:51:22.248582 sshd-session[1800]: pam_unix(sshd:session): session closed for user core
Jan 13 20:51:22.257067 systemd[1]: sshd@2-139.178.70.104:22-147.75.109.163:44082.service: Deactivated successfully.
Jan 13 20:51:22.258121 systemd[1]: session-5.scope: Deactivated successfully.
Jan 13 20:51:22.259159 systemd-logind[1528]: Session 5 logged out. Waiting for processes to exit.
Jan 13 20:51:22.261618 systemd[1]: Started sshd@3-139.178.70.104:22-147.75.109.163:44088.service - OpenSSH per-connection server daemon (147.75.109.163:44088).
Jan 13 20:51:22.262630 systemd-logind[1528]: Removed session 5.
Jan 13 20:51:22.296030 sshd[1807]: Accepted publickey for core from 147.75.109.163 port 44088 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:51:22.297026 sshd-session[1807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:51:22.300225 systemd-logind[1528]: New session 6 of user core.
Jan 13 20:51:22.306464 systemd[1]: Started session-6.scope - Session 6 of User core.
Jan 13 20:51:22.355752 sshd[1809]: Connection closed by 147.75.109.163 port 44088
Jan 13 20:51:22.355254 sshd-session[1807]: pam_unix(sshd:session): session closed for user core
Jan 13 20:51:22.363626 systemd[1]: sshd@3-139.178.70.104:22-147.75.109.163:44088.service: Deactivated successfully.
Jan 13 20:51:22.364345 systemd[1]: session-6.scope: Deactivated successfully.
Jan 13 20:51:22.364668 systemd-logind[1528]: Session 6 logged out. Waiting for processes to exit.
Jan 13 20:51:22.365586 systemd[1]: Started sshd@4-139.178.70.104:22-147.75.109.163:44092.service - OpenSSH per-connection server daemon (147.75.109.163:44092).
Jan 13 20:51:22.366811 systemd-logind[1528]: Removed session 6.
Jan 13 20:51:22.400033 sshd[1814]: Accepted publickey for core from 147.75.109.163 port 44092 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:51:22.400695 sshd-session[1814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:51:22.403204 systemd-logind[1528]: New session 7 of user core.
Jan 13 20:51:22.409444 systemd[1]: Started session-7.scope - Session 7 of User core.
Jan 13 20:51:22.466605 sudo[1817]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Jan 13 20:51:22.466825 sudo[1817]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 20:51:22.475963 sudo[1817]: pam_unix(sudo:session): session closed for user root
Jan 13 20:51:22.476693 sshd[1816]: Connection closed by 147.75.109.163 port 44092
Jan 13 20:51:22.477696 sshd-session[1814]: pam_unix(sshd:session): session closed for user core
Jan 13 20:51:22.482014 systemd[1]: sshd@4-139.178.70.104:22-147.75.109.163:44092.service: Deactivated successfully.
Jan 13 20:51:22.482996 systemd[1]: session-7.scope: Deactivated successfully.
Jan 13 20:51:22.484034 systemd-logind[1528]: Session 7 logged out. Waiting for processes to exit.
Jan 13 20:51:22.488544 systemd[1]: Started sshd@5-139.178.70.104:22-147.75.109.163:44102.service - OpenSSH per-connection server daemon (147.75.109.163:44102).
Jan 13 20:51:22.489451 systemd-logind[1528]: Removed session 7.
Jan 13 20:51:22.524591 sshd[1822]: Accepted publickey for core from 147.75.109.163 port 44102 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:51:22.525790 sshd-session[1822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:51:22.529405 systemd-logind[1528]: New session 8 of user core.
Jan 13 20:51:22.538450 systemd[1]: Started session-8.scope - Session 8 of User core.
Jan 13 20:51:22.588130 sudo[1826]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Jan 13 20:51:22.588570 sudo[1826]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 20:51:22.590982 sudo[1826]: pam_unix(sudo:session): session closed for user root
Jan 13 20:51:22.594268 sudo[1825]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Jan 13 20:51:22.594463 sudo[1825]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 20:51:22.609759 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 13 20:51:22.626148 augenrules[1848]: No rules
Jan 13 20:51:22.626981 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 13 20:51:22.627142 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 13 20:51:22.627931 sudo[1825]: pam_unix(sudo:session): session closed for user root
Jan 13 20:51:22.628621 sshd[1824]: Connection closed by 147.75.109.163 port 44102
Jan 13 20:51:22.629537 sshd-session[1822]: pam_unix(sshd:session): session closed for user core
Jan 13 20:51:22.633501 systemd[1]: sshd@5-139.178.70.104:22-147.75.109.163:44102.service: Deactivated successfully.
Jan 13 20:51:22.634565 systemd[1]: session-8.scope: Deactivated successfully.
Jan 13 20:51:22.635473 systemd-logind[1528]: Session 8 logged out. Waiting for processes to exit.
Jan 13 20:51:22.636412 systemd[1]: Started sshd@6-139.178.70.104:22-147.75.109.163:44116.service - OpenSSH per-connection server daemon (147.75.109.163:44116).
Jan 13 20:51:22.637736 systemd-logind[1528]: Removed session 8.
Jan 13 20:51:22.683833 sshd[1856]: Accepted publickey for core from 147.75.109.163 port 44116 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 20:51:22.684560 sshd-session[1856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:51:22.687577 systemd-logind[1528]: New session 9 of user core.
Jan 13 20:51:22.695433 systemd[1]: Started session-9.scope - Session 9 of User core.
Jan 13 20:51:22.743548 sudo[1859]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Jan 13 20:51:22.743701 sudo[1859]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 20:51:23.001584 (dockerd)[1877]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Jan 13 20:51:23.001871 systemd[1]: Starting docker.service - Docker Application Container Engine...
Jan 13 20:51:23.258069 dockerd[1877]: time="2025-01-13T20:51:23.257987760Z" level=info msg="Starting up"
Jan 13 20:51:23.320691 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3501435635-merged.mount: Deactivated successfully.
Jan 13 20:51:23.337858 dockerd[1877]: time="2025-01-13T20:51:23.337750996Z" level=info msg="Loading containers: start."
Jan 13 20:51:23.432370 kernel: Initializing XFRM netlink socket
Jan 13 20:51:23.495673 systemd-networkd[1457]: docker0: Link UP
Jan 13 20:51:23.518436 dockerd[1877]: time="2025-01-13T20:51:23.518336114Z" level=info msg="Loading containers: done."
Jan 13 20:51:23.535306 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1425149028-merged.mount: Deactivated successfully.
Jan 13 20:51:23.550379 dockerd[1877]: time="2025-01-13T20:51:23.550206968Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Jan 13 20:51:23.550379 dockerd[1877]: time="2025-01-13T20:51:23.550274240Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1
Jan 13 20:51:23.550379 dockerd[1877]: time="2025-01-13T20:51:23.550337163Z" level=info msg="Daemon has completed initialization"
Jan 13 20:51:23.613153 dockerd[1877]: time="2025-01-13T20:51:23.613106680Z" level=info msg="API listen on /run/docker.sock"
Jan 13 20:51:23.613412 systemd[1]: Started docker.service - Docker Application Container Engine.
Jan 13 20:51:24.595928 containerd[1540]: time="2025-01-13T20:51:24.595897375Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\""
Jan 13 20:51:25.194589 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount903932460.mount: Deactivated successfully.
Jan 13 20:51:25.897758 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Jan 13 20:51:25.912635 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:51:25.976550 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:51:25.978330 (kubelet)[2129]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:51:26.023553 kubelet[2129]: E0113 20:51:26.023519 2129 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:51:26.025235 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:51:26.025314 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:51:26.511978 containerd[1540]: time="2025-01-13T20:51:26.511945785Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:26.514489 containerd[1540]: time="2025-01-13T20:51:26.514467681Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.29.12: active requests=0, bytes read=35139254"
Jan 13 20:51:26.518612 containerd[1540]: time="2025-01-13T20:51:26.518586355Z" level=info msg="ImageCreate event name:\"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:26.523696 containerd[1540]: time="2025-01-13T20:51:26.523675425Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:26.524149 containerd[1540]: time="2025-01-13T20:51:26.524131776Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.29.12\" with image id \"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\", repo tag \"registry.k8s.io/kube-apiserver:v1.29.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\", size \"35136054\" in 1.928203221s"
Jan 13 20:51:26.524177 containerd[1540]: time="2025-01-13T20:51:26.524151875Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\" returns image reference \"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\""
Jan 13 20:51:26.536503 containerd[1540]: time="2025-01-13T20:51:26.536475086Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\""
Jan 13 20:51:27.408699 update_engine[1529]: I20250113 20:51:27.408643 1529 update_attempter.cc:509] Updating boot flags...
Jan 13 20:51:27.438366 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2155)
Jan 13 20:51:28.948376 containerd[1540]: time="2025-01-13T20:51:28.948105757Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:28.957378 containerd[1540]: time="2025-01-13T20:51:28.957362005Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.29.12: active requests=0, bytes read=32217732"
Jan 13 20:51:28.962085 containerd[1540]: time="2025-01-13T20:51:28.962058144Z" level=info msg="ImageCreate event name:\"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:28.971632 containerd[1540]: time="2025-01-13T20:51:28.971602044Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:28.972743 containerd[1540]: time="2025-01-13T20:51:28.972247060Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.29.12\" with image id \"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.29.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\", size \"33662844\" in 2.435749709s"
Jan 13 20:51:28.972743 containerd[1540]: time="2025-01-13T20:51:28.972270512Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\" returns image reference \"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\""
Jan 13 20:51:28.988935 containerd[1540]: time="2025-01-13T20:51:28.988809232Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\""
Jan 13 20:51:30.808400 containerd[1540]: time="2025-01-13T20:51:30.808370624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:30.809260 containerd[1540]: time="2025-01-13T20:51:30.809225498Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.29.12: active requests=0, bytes read=17332822"
Jan 13 20:51:30.809726 containerd[1540]: time="2025-01-13T20:51:30.809706836Z" level=info msg="ImageCreate event name:\"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:30.811376 containerd[1540]: time="2025-01-13T20:51:30.811346033Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:30.812531 containerd[1540]: time="2025-01-13T20:51:30.812510401Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.29.12\" with image id \"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\", repo tag \"registry.k8s.io/kube-scheduler:v1.29.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\", size \"18777952\" in 1.823658655s"
Jan 13 20:51:30.812571 containerd[1540]: time="2025-01-13T20:51:30.812530348Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\" returns image reference \"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\""
Jan 13 20:51:30.826323 containerd[1540]: time="2025-01-13T20:51:30.826294006Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\""
Jan 13 20:51:32.400740 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount478070456.mount: Deactivated successfully.
Jan 13 20:51:33.172765 containerd[1540]: time="2025-01-13T20:51:33.172583152Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:33.177376 containerd[1540]: time="2025-01-13T20:51:33.177307412Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.29.12: active requests=0, bytes read=28619958"
Jan 13 20:51:33.183114 containerd[1540]: time="2025-01-13T20:51:33.183080426Z" level=info msg="ImageCreate event name:\"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:33.188134 containerd[1540]: time="2025-01-13T20:51:33.188084479Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:33.188647 containerd[1540]: time="2025-01-13T20:51:33.188409182Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.29.12\" with image id \"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\", repo tag \"registry.k8s.io/kube-proxy:v1.29.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\", size \"28618977\" in 2.36208888s"
Jan 13 20:51:33.188647 containerd[1540]: time="2025-01-13T20:51:33.188429243Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\" returns image reference \"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\""
Jan 13 20:51:33.203288 containerd[1540]: time="2025-01-13T20:51:33.203266248Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Jan 13 20:51:33.774773 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount546575183.mount: Deactivated successfully.
Jan 13 20:51:34.430365 containerd[1540]: time="2025-01-13T20:51:34.429755588Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:34.430912 containerd[1540]: time="2025-01-13T20:51:34.430882984Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761"
Jan 13 20:51:34.431168 containerd[1540]: time="2025-01-13T20:51:34.431156919Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:34.433023 containerd[1540]: time="2025-01-13T20:51:34.433011118Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:34.433695 containerd[1540]: time="2025-01-13T20:51:34.433681331Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.23039373s"
Jan 13 20:51:34.433745 containerd[1540]: time="2025-01-13T20:51:34.433737296Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Jan 13 20:51:34.447202 containerd[1540]: time="2025-01-13T20:51:34.447174061Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Jan 13 20:51:35.424956 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1063123030.mount: Deactivated successfully.
Jan 13 20:51:35.427823 containerd[1540]: time="2025-01-13T20:51:35.427751761Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:35.428320 containerd[1540]: time="2025-01-13T20:51:35.428299722Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290"
Jan 13 20:51:35.429239 containerd[1540]: time="2025-01-13T20:51:35.428478933Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:35.430167 containerd[1540]: time="2025-01-13T20:51:35.430142480Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:51:35.431093 containerd[1540]: time="2025-01-13T20:51:35.430881240Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 983.677288ms"
Jan 13 20:51:35.431093 containerd[1540]: time="2025-01-13T20:51:35.430904261Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Jan 13 20:51:35.447711 containerd[1540]: time="2025-01-13T20:51:35.447685358Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\""
Jan 13 20:51:35.978413 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3817611666.mount: Deactivated successfully.
Jan 13 20:51:36.147832 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Jan 13 20:51:36.154323 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:51:36.547526 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:51:36.551036 (kubelet)[2292]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:51:36.628277 kubelet[2292]: E0113 20:51:36.628219 2292 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:51:36.629891 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:51:36.629976 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:51:38.749541 containerd[1540]: time="2025-01-13T20:51:38.749494154Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.10-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:51:38.750146 containerd[1540]: time="2025-01-13T20:51:38.750121290Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.10-0: active requests=0, bytes read=56651625" Jan 13 20:51:38.750559 containerd[1540]: time="2025-01-13T20:51:38.750229821Z" level=info msg="ImageCreate event name:\"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:51:38.751891 containerd[1540]: time="2025-01-13T20:51:38.751877659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:51:38.752777 containerd[1540]: time="2025-01-13T20:51:38.752650566Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.10-0\" with image id \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\", repo tag \"registry.k8s.io/etcd:3.5.10-0\", repo digest \"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\", size \"56649232\" in 3.304940232s" Jan 13 20:51:38.752777 containerd[1540]: time="2025-01-13T20:51:38.752681506Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\" returns image reference \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\"" Jan 13 20:51:40.546510 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:51:40.550473 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:51:40.566006 systemd[1]: Reloading requested from client PID 2376 ('systemctl') (unit session-9.scope)... Jan 13 20:51:40.566017 systemd[1]: Reloading... 
Jan 13 20:51:40.627397 zram_generator::config[2413]: No configuration found. Jan 13 20:51:40.679770 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:51:40.695548 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:51:40.738836 systemd[1]: Reloading finished in 172 ms. Jan 13 20:51:40.766982 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 13 20:51:40.767045 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 13 20:51:40.767202 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:51:40.771536 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:51:41.051953 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:51:41.055093 (kubelet)[2481]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 20:51:41.084287 kubelet[2481]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:51:41.084287 kubelet[2481]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 20:51:41.084287 kubelet[2481]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 13 20:51:41.105973 kubelet[2481]: I0113 20:51:41.105931 2481 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 20:51:41.286976 kubelet[2481]: I0113 20:51:41.286952 2481 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Jan 13 20:51:41.286976 kubelet[2481]: I0113 20:51:41.286974 2481 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 20:51:41.287116 kubelet[2481]: I0113 20:51:41.287105 2481 server.go:919] "Client rotation is on, will bootstrap in background" Jan 13 20:51:41.305258 kubelet[2481]: I0113 20:51:41.304834 2481 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:51:41.305467 kubelet[2481]: E0113 20:51:41.305454 2481 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.104:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.317408 kubelet[2481]: I0113 20:51:41.317077 2481 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 13 20:51:41.317408 kubelet[2481]: I0113 20:51:41.317216 2481 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 20:51:41.318095 kubelet[2481]: I0113 20:51:41.318085 2481 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 13 20:51:41.318625 kubelet[2481]: I0113 20:51:41.318616 2481 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 20:51:41.318665 kubelet[2481]: I0113 20:51:41.318657 2481 container_manager_linux.go:301] "Creating device plugin manager" Jan 13 20:51:41.318754 kubelet[2481]: I0113 
20:51:41.318748 2481 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:51:41.318860 kubelet[2481]: I0113 20:51:41.318853 2481 kubelet.go:396] "Attempting to sync node with API server" Jan 13 20:51:41.318897 kubelet[2481]: I0113 20:51:41.318893 2481 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 20:51:41.318947 kubelet[2481]: I0113 20:51:41.318941 2481 kubelet.go:312] "Adding apiserver pod source" Jan 13 20:51:41.318982 kubelet[2481]: I0113 20:51:41.318977 2481 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 20:51:41.322705 kubelet[2481]: I0113 20:51:41.322695 2481 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 20:51:41.322854 kubelet[2481]: W0113 20:51:41.322830 2481 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.322886 kubelet[2481]: E0113 20:51:41.322864 2481 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.322912 kubelet[2481]: W0113 20:51:41.322896 2481 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.323036 kubelet[2481]: E0113 20:51:41.322913 2481 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get 
"https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.325957 kubelet[2481]: I0113 20:51:41.325898 2481 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 20:51:41.326894 kubelet[2481]: W0113 20:51:41.326773 2481 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 13 20:51:41.327182 kubelet[2481]: I0113 20:51:41.327091 2481 server.go:1256] "Started kubelet" Jan 13 20:51:41.328266 kubelet[2481]: I0113 20:51:41.328251 2481 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 20:51:41.332333 kubelet[2481]: E0113 20:51:41.332310 2481 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.104:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.104:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181a5bc373f73719 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 20:51:41.327079193 +0000 UTC m=+0.269414470,LastTimestamp:2025-01-13 20:51:41.327079193 +0000 UTC m=+0.269414470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 20:51:41.332713 kubelet[2481]: I0113 20:51:41.332704 2481 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 20:51:41.334110 kubelet[2481]: I0113 20:51:41.334101 2481 server.go:461] "Adding debug handlers to kubelet server" Jan 13 20:51:41.336061 kubelet[2481]: I0113 20:51:41.334213 2481 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 
burstTokens=10 Jan 13 20:51:41.336711 kubelet[2481]: I0113 20:51:41.336332 2481 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 20:51:41.336711 kubelet[2481]: E0113 20:51:41.336383 2481 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 20:51:41.336711 kubelet[2481]: I0113 20:51:41.336398 2481 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 13 20:51:41.336711 kubelet[2481]: I0113 20:51:41.336439 2481 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Jan 13 20:51:41.336711 kubelet[2481]: I0113 20:51:41.336464 2481 reconciler_new.go:29] "Reconciler: start to sync state" Jan 13 20:51:41.336711 kubelet[2481]: W0113 20:51:41.336632 2481 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.336711 kubelet[2481]: E0113 20:51:41.336652 2481 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.336711 kubelet[2481]: E0113 20:51:41.336684 2481 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="200ms" Jan 13 20:51:41.337251 kubelet[2481]: I0113 20:51:41.336908 2481 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or 
directory Jan 13 20:51:41.339544 kubelet[2481]: E0113 20:51:41.338766 2481 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 20:51:41.339544 kubelet[2481]: I0113 20:51:41.338853 2481 factory.go:221] Registration of the containerd container factory successfully Jan 13 20:51:41.339544 kubelet[2481]: I0113 20:51:41.338859 2481 factory.go:221] Registration of the systemd container factory successfully Jan 13 20:51:41.342676 kubelet[2481]: I0113 20:51:41.342667 2481 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 20:51:41.344038 kubelet[2481]: I0113 20:51:41.343914 2481 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 13 20:51:41.344038 kubelet[2481]: I0113 20:51:41.343928 2481 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 20:51:41.344038 kubelet[2481]: I0113 20:51:41.343937 2481 kubelet.go:2329] "Starting kubelet main sync loop" Jan 13 20:51:41.344038 kubelet[2481]: E0113 20:51:41.343959 2481 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 20:51:41.348129 kubelet[2481]: W0113 20:51:41.347983 2481 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.348129 kubelet[2481]: E0113 20:51:41.348009 2481 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:41.359963 kubelet[2481]: I0113 
20:51:41.359949 2481 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 20:51:41.359994 kubelet[2481]: I0113 20:51:41.359966 2481 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 20:51:41.359994 kubelet[2481]: I0113 20:51:41.359976 2481 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:51:41.360924 kubelet[2481]: I0113 20:51:41.360912 2481 policy_none.go:49] "None policy: Start" Jan 13 20:51:41.361137 kubelet[2481]: I0113 20:51:41.361120 2481 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 20:51:41.361171 kubelet[2481]: I0113 20:51:41.361156 2481 state_mem.go:35] "Initializing new in-memory state store" Jan 13 20:51:41.367673 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 13 20:51:41.376139 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 13 20:51:41.378066 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 13 20:51:41.394124 kubelet[2481]: I0113 20:51:41.393895 2481 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 20:51:41.394124 kubelet[2481]: I0113 20:51:41.394030 2481 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 20:51:41.395500 kubelet[2481]: E0113 20:51:41.395436 2481 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jan 13 20:51:41.437965 kubelet[2481]: I0113 20:51:41.437948 2481 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:51:41.438178 kubelet[2481]: E0113 20:51:41.438168 2481 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Jan 13 20:51:41.444296 kubelet[2481]: I0113 20:51:41.444282 2481 topology_manager.go:215] "Topology Admit Handler" podUID="5e0e38bde411af31dd63872fd2cf4f42" podNamespace="kube-system" podName="kube-apiserver-localhost" Jan 13 20:51:41.444721 kubelet[2481]: I0113 20:51:41.444712 2481 topology_manager.go:215] "Topology Admit Handler" podUID="4f8e0d694c07e04969646aa3c152c34a" podNamespace="kube-system" podName="kube-controller-manager-localhost" Jan 13 20:51:41.445529 kubelet[2481]: I0113 20:51:41.445459 2481 topology_manager.go:215] "Topology Admit Handler" podUID="c4144e8f85b2123a6afada0c1705bbba" podNamespace="kube-system" podName="kube-scheduler-localhost" Jan 13 20:51:41.449279 systemd[1]: Created slice kubepods-burstable-pod5e0e38bde411af31dd63872fd2cf4f42.slice - libcontainer container kubepods-burstable-pod5e0e38bde411af31dd63872fd2cf4f42.slice. Jan 13 20:51:41.461257 systemd[1]: Created slice kubepods-burstable-pod4f8e0d694c07e04969646aa3c152c34a.slice - libcontainer container kubepods-burstable-pod4f8e0d694c07e04969646aa3c152c34a.slice. 
Jan 13 20:51:41.470235 systemd[1]: Created slice kubepods-burstable-podc4144e8f85b2123a6afada0c1705bbba.slice - libcontainer container kubepods-burstable-podc4144e8f85b2123a6afada0c1705bbba.slice. Jan 13 20:51:41.537408 kubelet[2481]: E0113 20:51:41.537390 2481 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="400ms" Jan 13 20:51:41.537466 kubelet[2481]: I0113 20:51:41.537443 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5e0e38bde411af31dd63872fd2cf4f42-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"5e0e38bde411af31dd63872fd2cf4f42\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:51:41.537466 kubelet[2481]: I0113 20:51:41.537456 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5e0e38bde411af31dd63872fd2cf4f42-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"5e0e38bde411af31dd63872fd2cf4f42\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:51:41.537504 kubelet[2481]: I0113 20:51:41.537470 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:41.537504 kubelet[2481]: I0113 20:51:41.537481 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-flexvolume-dir\") pod \"kube-controller-manager-localhost\" 
(UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:41.537504 kubelet[2481]: I0113 20:51:41.537492 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:41.537504 kubelet[2481]: I0113 20:51:41.537505 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5e0e38bde411af31dd63872fd2cf4f42-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"5e0e38bde411af31dd63872fd2cf4f42\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:51:41.537568 kubelet[2481]: I0113 20:51:41.537515 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:41.537568 kubelet[2481]: I0113 20:51:41.537528 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:41.537568 kubelet[2481]: I0113 20:51:41.537540 2481 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/c4144e8f85b2123a6afada0c1705bbba-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"c4144e8f85b2123a6afada0c1705bbba\") " pod="kube-system/kube-scheduler-localhost" Jan 13 20:51:41.639896 kubelet[2481]: I0113 20:51:41.639789 2481 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:51:41.639986 kubelet[2481]: E0113 20:51:41.639975 2481 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Jan 13 20:51:41.761073 containerd[1540]: time="2025-01-13T20:51:41.761020191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:5e0e38bde411af31dd63872fd2cf4f42,Namespace:kube-system,Attempt:0,}" Jan 13 20:51:41.763369 containerd[1540]: time="2025-01-13T20:51:41.763294992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:4f8e0d694c07e04969646aa3c152c34a,Namespace:kube-system,Attempt:0,}" Jan 13 20:51:41.771854 containerd[1540]: time="2025-01-13T20:51:41.771837260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:c4144e8f85b2123a6afada0c1705bbba,Namespace:kube-system,Attempt:0,}" Jan 13 20:51:41.937743 kubelet[2481]: E0113 20:51:41.937715 2481 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="800ms" Jan 13 20:51:42.041376 kubelet[2481]: I0113 20:51:42.041129 2481 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:51:42.041376 kubelet[2481]: E0113 20:51:42.041326 2481 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 
139.178.70.104:6443: connect: connection refused" node="localhost" Jan 13 20:51:42.270652 kubelet[2481]: W0113 20:51:42.270563 2481 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:42.270652 kubelet[2481]: E0113 20:51:42.270604 2481 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:42.276936 kubelet[2481]: W0113 20:51:42.276906 2481 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:42.276967 kubelet[2481]: E0113 20:51:42.276942 2481 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:42.342998 kubelet[2481]: W0113 20:51:42.342947 2481 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:42.342998 kubelet[2481]: E0113 20:51:42.342985 2481 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get 
"https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:42.454270 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1159513433.mount: Deactivated successfully. Jan 13 20:51:42.456116 containerd[1540]: time="2025-01-13T20:51:42.456097398Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:51:42.456698 containerd[1540]: time="2025-01-13T20:51:42.456681969Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:51:42.457257 containerd[1540]: time="2025-01-13T20:51:42.457203251Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:51:42.458034 containerd[1540]: time="2025-01-13T20:51:42.457969072Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:51:42.458141 containerd[1540]: time="2025-01-13T20:51:42.458076996Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:51:42.458567 containerd[1540]: time="2025-01-13T20:51:42.458472345Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:51:42.458912 containerd[1540]: time="2025-01-13T20:51:42.458895892Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Jan 13 20:51:42.459777 containerd[1540]: time="2025-01-13T20:51:42.459499780Z" level=info 
msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:51:42.460393 containerd[1540]: time="2025-01-13T20:51:42.460257067Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 688.377996ms" Jan 13 20:51:42.461916 containerd[1540]: time="2025-01-13T20:51:42.461877036Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 698.539337ms" Jan 13 20:51:42.463177 containerd[1540]: time="2025-01-13T20:51:42.463013598Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 701.92916ms" Jan 13 20:51:42.573342 containerd[1540]: time="2025-01-13T20:51:42.570464910Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:51:42.573342 containerd[1540]: time="2025-01-13T20:51:42.572841645Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:51:42.573642 containerd[1540]: time="2025-01-13T20:51:42.572853682Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:51:42.573642 containerd[1540]: time="2025-01-13T20:51:42.572916971Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:51:42.575238 containerd[1540]: time="2025-01-13T20:51:42.575115982Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:51:42.575238 containerd[1540]: time="2025-01-13T20:51:42.575146609Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:51:42.575238 containerd[1540]: time="2025-01-13T20:51:42.575156411Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:51:42.575238 containerd[1540]: time="2025-01-13T20:51:42.575200199Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:51:42.577475 containerd[1540]: time="2025-01-13T20:51:42.577429270Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:51:42.577475 containerd[1540]: time="2025-01-13T20:51:42.577456120Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:51:42.577626 containerd[1540]: time="2025-01-13T20:51:42.577471626Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:51:42.577626 containerd[1540]: time="2025-01-13T20:51:42.577516937Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:51:42.592489 systemd[1]: Started cri-containerd-949f51cd1f5b7ee51e689752e73716311dc535034b27341f5ce20e62991dd323.scope - libcontainer container 949f51cd1f5b7ee51e689752e73716311dc535034b27341f5ce20e62991dd323. Jan 13 20:51:42.596190 systemd[1]: Started cri-containerd-a23085bb8c67faa591a31d3b8b10fa44cd13d695cf2806321329db45aff34061.scope - libcontainer container a23085bb8c67faa591a31d3b8b10fa44cd13d695cf2806321329db45aff34061. Jan 13 20:51:42.598012 systemd[1]: Started cri-containerd-dbfc70ff1da4b2c77aedcb0a15b443838ad08fdaccf67700f299c3f882e3179d.scope - libcontainer container dbfc70ff1da4b2c77aedcb0a15b443838ad08fdaccf67700f299c3f882e3179d. Jan 13 20:51:42.633277 containerd[1540]: time="2025-01-13T20:51:42.633198332Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:5e0e38bde411af31dd63872fd2cf4f42,Namespace:kube-system,Attempt:0,} returns sandbox id \"a23085bb8c67faa591a31d3b8b10fa44cd13d695cf2806321329db45aff34061\"" Jan 13 20:51:42.639421 containerd[1540]: time="2025-01-13T20:51:42.639397603Z" level=info msg="CreateContainer within sandbox \"a23085bb8c67faa591a31d3b8b10fa44cd13d695cf2806321329db45aff34061\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 13 20:51:42.649084 containerd[1540]: time="2025-01-13T20:51:42.649063424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:4f8e0d694c07e04969646aa3c152c34a,Namespace:kube-system,Attempt:0,} returns sandbox id \"949f51cd1f5b7ee51e689752e73716311dc535034b27341f5ce20e62991dd323\"" Jan 13 20:51:42.650986 containerd[1540]: time="2025-01-13T20:51:42.650970131Z" level=info msg="CreateContainer within sandbox 
\"949f51cd1f5b7ee51e689752e73716311dc535034b27341f5ce20e62991dd323\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 13 20:51:42.653848 containerd[1540]: time="2025-01-13T20:51:42.653828040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:c4144e8f85b2123a6afada0c1705bbba,Namespace:kube-system,Attempt:0,} returns sandbox id \"dbfc70ff1da4b2c77aedcb0a15b443838ad08fdaccf67700f299c3f882e3179d\"" Jan 13 20:51:42.658582 containerd[1540]: time="2025-01-13T20:51:42.658559843Z" level=info msg="CreateContainer within sandbox \"dbfc70ff1da4b2c77aedcb0a15b443838ad08fdaccf67700f299c3f882e3179d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 13 20:51:42.664615 containerd[1540]: time="2025-01-13T20:51:42.664439607Z" level=info msg="CreateContainer within sandbox \"a23085bb8c67faa591a31d3b8b10fa44cd13d695cf2806321329db45aff34061\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"8f457c0a2a9f20ce0d1cb348dd9b60c1daa39388eb31e282cb0c47cce5618a01\"" Jan 13 20:51:42.664820 containerd[1540]: time="2025-01-13T20:51:42.664802566Z" level=info msg="StartContainer for \"8f457c0a2a9f20ce0d1cb348dd9b60c1daa39388eb31e282cb0c47cce5618a01\"" Jan 13 20:51:42.666108 containerd[1540]: time="2025-01-13T20:51:42.666084737Z" level=info msg="CreateContainer within sandbox \"949f51cd1f5b7ee51e689752e73716311dc535034b27341f5ce20e62991dd323\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"d0465360bf0256d3d16b240756d97fa3bb3ea1f9a62f264b6c37e492ccee0610\"" Jan 13 20:51:42.666633 containerd[1540]: time="2025-01-13T20:51:42.666584708Z" level=info msg="StartContainer for \"d0465360bf0256d3d16b240756d97fa3bb3ea1f9a62f264b6c37e492ccee0610\"" Jan 13 20:51:42.681481 containerd[1540]: time="2025-01-13T20:51:42.681393476Z" level=info msg="CreateContainer within sandbox \"dbfc70ff1da4b2c77aedcb0a15b443838ad08fdaccf67700f299c3f882e3179d\" for 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3f245c2a922dd8f8d8829b275b28d9e904eaf5eecb4771638c5c796634ccd831\"" Jan 13 20:51:42.683049 containerd[1540]: time="2025-01-13T20:51:42.682927484Z" level=info msg="StartContainer for \"3f245c2a922dd8f8d8829b275b28d9e904eaf5eecb4771638c5c796634ccd831\"" Jan 13 20:51:42.684862 systemd[1]: Started cri-containerd-8f457c0a2a9f20ce0d1cb348dd9b60c1daa39388eb31e282cb0c47cce5618a01.scope - libcontainer container 8f457c0a2a9f20ce0d1cb348dd9b60c1daa39388eb31e282cb0c47cce5618a01. Jan 13 20:51:42.703543 systemd[1]: Started cri-containerd-d0465360bf0256d3d16b240756d97fa3bb3ea1f9a62f264b6c37e492ccee0610.scope - libcontainer container d0465360bf0256d3d16b240756d97fa3bb3ea1f9a62f264b6c37e492ccee0610. Jan 13 20:51:42.706465 systemd[1]: Started cri-containerd-3f245c2a922dd8f8d8829b275b28d9e904eaf5eecb4771638c5c796634ccd831.scope - libcontainer container 3f245c2a922dd8f8d8829b275b28d9e904eaf5eecb4771638c5c796634ccd831. Jan 13 20:51:42.737258 containerd[1540]: time="2025-01-13T20:51:42.737230154Z" level=info msg="StartContainer for \"8f457c0a2a9f20ce0d1cb348dd9b60c1daa39388eb31e282cb0c47cce5618a01\" returns successfully" Jan 13 20:51:42.738777 kubelet[2481]: E0113 20:51:42.738758 2481 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="1.6s" Jan 13 20:51:42.747532 containerd[1540]: time="2025-01-13T20:51:42.747469257Z" level=info msg="StartContainer for \"d0465360bf0256d3d16b240756d97fa3bb3ea1f9a62f264b6c37e492ccee0610\" returns successfully" Jan 13 20:51:42.756906 containerd[1540]: time="2025-01-13T20:51:42.756878707Z" level=info msg="StartContainer for \"3f245c2a922dd8f8d8829b275b28d9e904eaf5eecb4771638c5c796634ccd831\" returns successfully" Jan 13 20:51:42.806652 kubelet[2481]: W0113 20:51:42.806522 2481 
reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:42.806652 kubelet[2481]: E0113 20:51:42.806561 2481 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:42.842513 kubelet[2481]: I0113 20:51:42.842455 2481 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:51:42.842651 kubelet[2481]: E0113 20:51:42.842637 2481 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Jan 13 20:51:42.994534 kubelet[2481]: E0113 20:51:42.994515 2481 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.104:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.104:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181a5bc373f73719 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 20:51:41.327079193 +0000 UTC m=+0.269414470,LastTimestamp:2025-01-13 20:51:41.327079193 +0000 UTC m=+0.269414470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 20:51:43.347681 kubelet[2481]: E0113 20:51:43.347603 2481 certificate_manager.go:562] 
kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.104:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.104:6443: connect: connection refused Jan 13 20:51:44.445144 kubelet[2481]: I0113 20:51:44.445114 2481 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 20:51:44.741622 kubelet[2481]: E0113 20:51:44.741524 2481 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Jan 13 20:51:44.744720 kubelet[2481]: I0113 20:51:44.744547 2481 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Jan 13 20:51:45.326511 kubelet[2481]: I0113 20:51:45.326473 2481 apiserver.go:52] "Watching apiserver" Jan 13 20:51:45.337392 kubelet[2481]: I0113 20:51:45.337366 2481 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Jan 13 20:51:47.225998 systemd[1]: Reloading requested from client PID 2756 ('systemctl') (unit session-9.scope)... Jan 13 20:51:47.226014 systemd[1]: Reloading... Jan 13 20:51:47.284397 zram_generator::config[2797]: No configuration found. Jan 13 20:51:47.352901 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 20:51:47.370198 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:51:47.422892 systemd[1]: Reloading finished in 196 ms. 
Jan 13 20:51:47.448150 kubelet[2481]: I0113 20:51:47.448087 2481 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:51:47.448420 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:51:47.455058 systemd[1]: kubelet.service: Deactivated successfully. Jan 13 20:51:47.455413 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:51:47.459536 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:51:47.849304 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:51:47.854400 (kubelet)[2861]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 20:51:47.945981 kubelet[2861]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 20:51:47.945981 kubelet[2861]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 20:51:47.945981 kubelet[2861]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 13 20:51:47.945981 kubelet[2861]: I0113 20:51:47.945676 2861 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 20:51:47.949492 kubelet[2861]: I0113 20:51:47.949471 2861 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Jan 13 20:51:47.949492 kubelet[2861]: I0113 20:51:47.949490 2861 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 20:51:47.949629 kubelet[2861]: I0113 20:51:47.949618 2861 server.go:919] "Client rotation is on, will bootstrap in background" Jan 13 20:51:47.950786 kubelet[2861]: I0113 20:51:47.950582 2861 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 13 20:51:47.951769 kubelet[2861]: I0113 20:51:47.951689 2861 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 20:51:47.958690 kubelet[2861]: I0113 20:51:47.958428 2861 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 13 20:51:47.960077 kubelet[2861]: I0113 20:51:47.959739 2861 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 20:51:47.960077 kubelet[2861]: I0113 20:51:47.959855 2861 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 13 20:51:47.960077 kubelet[2861]: I0113 20:51:47.959868 2861 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 20:51:47.960077 kubelet[2861]: I0113 20:51:47.959875 2861 container_manager_linux.go:301] "Creating device plugin manager" Jan 13 20:51:47.960077 kubelet[2861]: I0113 
20:51:47.959899 2861 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:51:47.960077 kubelet[2861]: I0113 20:51:47.959965 2861 kubelet.go:396] "Attempting to sync node with API server" Jan 13 20:51:47.960239 kubelet[2861]: I0113 20:51:47.959974 2861 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 20:51:47.960239 kubelet[2861]: I0113 20:51:47.959990 2861 kubelet.go:312] "Adding apiserver pod source" Jan 13 20:51:47.960239 kubelet[2861]: I0113 20:51:47.959999 2861 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 20:51:47.961495 kubelet[2861]: I0113 20:51:47.961030 2861 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 20:51:47.961495 kubelet[2861]: I0113 20:51:47.961462 2861 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 20:51:47.964585 kubelet[2861]: I0113 20:51:47.964248 2861 server.go:1256] "Started kubelet" Jan 13 20:51:47.967915 kubelet[2861]: I0113 20:51:47.967437 2861 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 20:51:47.975410 kubelet[2861]: I0113 20:51:47.974088 2861 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 20:51:47.975410 kubelet[2861]: I0113 20:51:47.974886 2861 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 20:51:47.975410 kubelet[2861]: I0113 20:51:47.975101 2861 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 20:51:47.977010 kubelet[2861]: I0113 20:51:47.976573 2861 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 13 20:51:47.979606 kubelet[2861]: I0113 20:51:47.979591 2861 server.go:461] "Adding debug handlers to kubelet server" Jan 13 20:51:47.979698 kubelet[2861]: I0113 20:51:47.979686 2861 factory.go:221] Registration of the systemd 
container factory successfully Jan 13 20:51:47.979760 kubelet[2861]: I0113 20:51:47.979747 2861 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 20:51:47.980688 kubelet[2861]: E0113 20:51:47.979620 2861 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 20:51:47.980742 kubelet[2861]: I0113 20:51:47.980732 2861 factory.go:221] Registration of the containerd container factory successfully Jan 13 20:51:47.980792 kubelet[2861]: I0113 20:51:47.980785 2861 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Jan 13 20:51:47.980892 kubelet[2861]: I0113 20:51:47.980886 2861 reconciler_new.go:29] "Reconciler: start to sync state" Jan 13 20:51:48.004911 kubelet[2861]: I0113 20:51:48.004896 2861 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 20:51:48.006438 kubelet[2861]: I0113 20:51:48.006426 2861 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 13 20:51:48.006505 kubelet[2861]: I0113 20:51:48.006500 2861 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 20:51:48.006559 kubelet[2861]: I0113 20:51:48.006553 2861 kubelet.go:2329] "Starting kubelet main sync loop" Jan 13 20:51:48.006629 kubelet[2861]: E0113 20:51:48.006622 2861 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 20:51:48.016630 kubelet[2861]: I0113 20:51:48.016609 2861 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 20:51:48.016630 kubelet[2861]: I0113 20:51:48.016624 2861 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 20:51:48.016630 kubelet[2861]: I0113 20:51:48.016635 2861 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:51:48.016764 kubelet[2861]: I0113 20:51:48.016733 2861 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 13 20:51:48.016764 kubelet[2861]: I0113 20:51:48.016746 2861 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 13 20:51:48.016764 kubelet[2861]: I0113 20:51:48.016751 2861 policy_none.go:49] "None policy: Start" Jan 13 20:51:48.017025 kubelet[2861]: I0113 20:51:48.017013 2861 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 20:51:48.017051 kubelet[2861]: I0113 20:51:48.017028 2861 state_mem.go:35] "Initializing new in-memory state store" Jan 13 20:51:48.017116 kubelet[2861]: I0113 20:51:48.017106 2861 state_mem.go:75] "Updated machine memory state" Jan 13 20:51:48.019598 kubelet[2861]: I0113 20:51:48.019586 2861 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 20:51:48.019855 kubelet[2861]: I0113 20:51:48.019710 2861 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 20:51:48.078377 kubelet[2861]: I0113 20:51:48.078332 2861 kubelet_node_status.go:73] "Attempting to register 
node" node="localhost" Jan 13 20:51:48.107851 kubelet[2861]: I0113 20:51:48.107753 2861 topology_manager.go:215] "Topology Admit Handler" podUID="5e0e38bde411af31dd63872fd2cf4f42" podNamespace="kube-system" podName="kube-apiserver-localhost" Jan 13 20:51:48.112036 kubelet[2861]: I0113 20:51:48.108814 2861 topology_manager.go:215] "Topology Admit Handler" podUID="4f8e0d694c07e04969646aa3c152c34a" podNamespace="kube-system" podName="kube-controller-manager-localhost" Jan 13 20:51:48.112036 kubelet[2861]: I0113 20:51:48.108862 2861 topology_manager.go:215] "Topology Admit Handler" podUID="c4144e8f85b2123a6afada0c1705bbba" podNamespace="kube-system" podName="kube-scheduler-localhost" Jan 13 20:51:48.119910 kubelet[2861]: I0113 20:51:48.119893 2861 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Jan 13 20:51:48.120057 kubelet[2861]: I0113 20:51:48.120047 2861 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Jan 13 20:51:48.181782 kubelet[2861]: I0113 20:51:48.181764 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c4144e8f85b2123a6afada0c1705bbba-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"c4144e8f85b2123a6afada0c1705bbba\") " pod="kube-system/kube-scheduler-localhost" Jan 13 20:51:48.181919 kubelet[2861]: I0113 20:51:48.181912 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5e0e38bde411af31dd63872fd2cf4f42-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"5e0e38bde411af31dd63872fd2cf4f42\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:51:48.181985 kubelet[2861]: I0113 20:51:48.181980 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-ca-certs\") pod 
\"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:48.182046 kubelet[2861]: I0113 20:51:48.182023 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:48.182114 kubelet[2861]: I0113 20:51:48.182108 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:48.182196 kubelet[2861]: I0113 20:51:48.182189 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:48.182256 kubelet[2861]: I0113 20:51:48.182251 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5e0e38bde411af31dd63872fd2cf4f42-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"5e0e38bde411af31dd63872fd2cf4f42\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:51:48.182308 kubelet[2861]: I0113 20:51:48.182296 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/5e0e38bde411af31dd63872fd2cf4f42-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"5e0e38bde411af31dd63872fd2cf4f42\") " pod="kube-system/kube-apiserver-localhost" Jan 13 20:51:48.182380 kubelet[2861]: I0113 20:51:48.182374 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 20:51:48.961546 kubelet[2861]: I0113 20:51:48.961392 2861 apiserver.go:52] "Watching apiserver" Jan 13 20:51:48.981055 kubelet[2861]: I0113 20:51:48.981013 2861 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Jan 13 20:51:49.027382 kubelet[2861]: E0113 20:51:49.027064 2861 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Jan 13 20:51:49.055783 kubelet[2861]: I0113 20:51:49.055755 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.055721249 podStartE2EDuration="1.055721249s" podCreationTimestamp="2025-01-13 20:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:51:49.040869779 +0000 UTC m=+1.147154520" watchObservedRunningTime="2025-01-13 20:51:49.055721249 +0000 UTC m=+1.162005980" Jan 13 20:51:49.061067 kubelet[2861]: I0113 20:51:49.061044 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.0610206739999999 podStartE2EDuration="1.061020674s" podCreationTimestamp="2025-01-13 20:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:51:49.056317148 +0000 UTC m=+1.162601887" watchObservedRunningTime="2025-01-13 20:51:49.061020674 +0000 UTC m=+1.167305413" Jan 13 20:51:49.066528 kubelet[2861]: I0113 20:51:49.066453 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.066425966 podStartE2EDuration="1.066425966s" podCreationTimestamp="2025-01-13 20:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:51:49.061160149 +0000 UTC m=+1.167444888" watchObservedRunningTime="2025-01-13 20:51:49.066425966 +0000 UTC m=+1.172710705" Jan 13 20:51:51.961823 sudo[1859]: pam_unix(sudo:session): session closed for user root Jan 13 20:51:51.962955 sshd[1858]: Connection closed by 147.75.109.163 port 44116 Jan 13 20:51:51.963660 sshd-session[1856]: pam_unix(sshd:session): session closed for user core Jan 13 20:51:51.965301 systemd[1]: sshd@6-139.178.70.104:22-147.75.109.163:44116.service: Deactivated successfully. Jan 13 20:51:51.966649 systemd[1]: session-9.scope: Deactivated successfully. Jan 13 20:51:51.966831 systemd[1]: session-9.scope: Consumed 2.738s CPU time, 185.4M memory peak, 0B memory swap peak. Jan 13 20:51:51.967699 systemd-logind[1528]: Session 9 logged out. Waiting for processes to exit. Jan 13 20:51:51.968331 systemd-logind[1528]: Removed session 9. Jan 13 20:52:02.612043 kubelet[2861]: I0113 20:52:02.612027 2861 kuberuntime_manager.go:1529] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 13 20:52:02.612685 containerd[1540]: time="2025-01-13T20:52:02.612561726Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Jan 13 20:52:02.612872 kubelet[2861]: I0113 20:52:02.612678 2861 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 13 20:52:03.238885 kubelet[2861]: I0113 20:52:03.238855 2861 topology_manager.go:215] "Topology Admit Handler" podUID="487f5dfb-18d3-4731-b2ef-d6f4ce50aca7" podNamespace="kube-system" podName="kube-proxy-qzbkm" Jan 13 20:52:03.245345 systemd[1]: Created slice kubepods-besteffort-pod487f5dfb_18d3_4731_b2ef_d6f4ce50aca7.slice - libcontainer container kubepods-besteffort-pod487f5dfb_18d3_4731_b2ef_d6f4ce50aca7.slice. Jan 13 20:52:03.388430 kubelet[2861]: I0113 20:52:03.388411 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/487f5dfb-18d3-4731-b2ef-d6f4ce50aca7-xtables-lock\") pod \"kube-proxy-qzbkm\" (UID: \"487f5dfb-18d3-4731-b2ef-d6f4ce50aca7\") " pod="kube-system/kube-proxy-qzbkm" Jan 13 20:52:03.388997 kubelet[2861]: I0113 20:52:03.388885 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/487f5dfb-18d3-4731-b2ef-d6f4ce50aca7-lib-modules\") pod \"kube-proxy-qzbkm\" (UID: \"487f5dfb-18d3-4731-b2ef-d6f4ce50aca7\") " pod="kube-system/kube-proxy-qzbkm" Jan 13 20:52:03.388997 kubelet[2861]: I0113 20:52:03.388910 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/487f5dfb-18d3-4731-b2ef-d6f4ce50aca7-kube-proxy\") pod \"kube-proxy-qzbkm\" (UID: \"487f5dfb-18d3-4731-b2ef-d6f4ce50aca7\") " pod="kube-system/kube-proxy-qzbkm" Jan 13 20:52:03.388997 kubelet[2861]: I0113 20:52:03.388929 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76m76\" (UniqueName: \"kubernetes.io/projected/487f5dfb-18d3-4731-b2ef-d6f4ce50aca7-kube-api-access-76m76\") pod 
\"kube-proxy-qzbkm\" (UID: \"487f5dfb-18d3-4731-b2ef-d6f4ce50aca7\") " pod="kube-system/kube-proxy-qzbkm" Jan 13 20:52:03.495932 kubelet[2861]: I0113 20:52:03.494219 2861 topology_manager.go:215] "Topology Admit Handler" podUID="41a87b64-71f0-4f75-8a70-a695ab348265" podNamespace="tigera-operator" podName="tigera-operator-c7ccbd65-6xl2t" Jan 13 20:52:03.499249 systemd[1]: Created slice kubepods-besteffort-pod41a87b64_71f0_4f75_8a70_a695ab348265.slice - libcontainer container kubepods-besteffort-pod41a87b64_71f0_4f75_8a70_a695ab348265.slice. Jan 13 20:52:03.552068 containerd[1540]: time="2025-01-13T20:52:03.552042667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qzbkm,Uid:487f5dfb-18d3-4731-b2ef-d6f4ce50aca7,Namespace:kube-system,Attempt:0,}" Jan 13 20:52:03.564716 containerd[1540]: time="2025-01-13T20:52:03.564665878Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:03.564716 containerd[1540]: time="2025-01-13T20:52:03.564696585Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:03.564908 containerd[1540]: time="2025-01-13T20:52:03.564703761Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:03.564908 containerd[1540]: time="2025-01-13T20:52:03.564755391Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:03.576472 systemd[1]: Started cri-containerd-a6b48ee20142512ea84c444f12fbed1a785141b19d838995bdec55e0868339e4.scope - libcontainer container a6b48ee20142512ea84c444f12fbed1a785141b19d838995bdec55e0868339e4. 
Jan 13 20:52:03.589968 kubelet[2861]: I0113 20:52:03.589895 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dxx6\" (UniqueName: \"kubernetes.io/projected/41a87b64-71f0-4f75-8a70-a695ab348265-kube-api-access-2dxx6\") pod \"tigera-operator-c7ccbd65-6xl2t\" (UID: \"41a87b64-71f0-4f75-8a70-a695ab348265\") " pod="tigera-operator/tigera-operator-c7ccbd65-6xl2t" Jan 13 20:52:03.589968 kubelet[2861]: I0113 20:52:03.589932 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/41a87b64-71f0-4f75-8a70-a695ab348265-var-lib-calico\") pod \"tigera-operator-c7ccbd65-6xl2t\" (UID: \"41a87b64-71f0-4f75-8a70-a695ab348265\") " pod="tigera-operator/tigera-operator-c7ccbd65-6xl2t" Jan 13 20:52:03.592022 containerd[1540]: time="2025-01-13T20:52:03.591986090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qzbkm,Uid:487f5dfb-18d3-4731-b2ef-d6f4ce50aca7,Namespace:kube-system,Attempt:0,} returns sandbox id \"a6b48ee20142512ea84c444f12fbed1a785141b19d838995bdec55e0868339e4\"" Jan 13 20:52:03.594509 containerd[1540]: time="2025-01-13T20:52:03.594320285Z" level=info msg="CreateContainer within sandbox \"a6b48ee20142512ea84c444f12fbed1a785141b19d838995bdec55e0868339e4\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 13 20:52:03.603003 containerd[1540]: time="2025-01-13T20:52:03.601543655Z" level=info msg="CreateContainer within sandbox \"a6b48ee20142512ea84c444f12fbed1a785141b19d838995bdec55e0868339e4\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b2a85c132e91f2ddef20416dbf952363cd2efce0e9b550835ca22dd43c9d5903\"" Jan 13 20:52:03.603409 containerd[1540]: time="2025-01-13T20:52:03.603361206Z" level=info msg="StartContainer for \"b2a85c132e91f2ddef20416dbf952363cd2efce0e9b550835ca22dd43c9d5903\"" Jan 13 20:52:03.619509 systemd[1]: Started 
cri-containerd-b2a85c132e91f2ddef20416dbf952363cd2efce0e9b550835ca22dd43c9d5903.scope - libcontainer container b2a85c132e91f2ddef20416dbf952363cd2efce0e9b550835ca22dd43c9d5903. Jan 13 20:52:03.637187 containerd[1540]: time="2025-01-13T20:52:03.637158042Z" level=info msg="StartContainer for \"b2a85c132e91f2ddef20416dbf952363cd2efce0e9b550835ca22dd43c9d5903\" returns successfully" Jan 13 20:52:03.803684 containerd[1540]: time="2025-01-13T20:52:03.803603987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-6xl2t,Uid:41a87b64-71f0-4f75-8a70-a695ab348265,Namespace:tigera-operator,Attempt:0,}" Jan 13 20:52:03.816645 containerd[1540]: time="2025-01-13T20:52:03.816398211Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:03.816645 containerd[1540]: time="2025-01-13T20:52:03.816456041Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:03.816645 containerd[1540]: time="2025-01-13T20:52:03.816468366Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:03.816645 containerd[1540]: time="2025-01-13T20:52:03.816521285Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:03.829446 systemd[1]: Started cri-containerd-9b94dfa95aa45776a3e7fca294a2cc3f5af51999de3fc617c8fe5637f88d7f22.scope - libcontainer container 9b94dfa95aa45776a3e7fca294a2cc3f5af51999de3fc617c8fe5637f88d7f22. 
Jan 13 20:52:03.856637 containerd[1540]: time="2025-01-13T20:52:03.856617430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-6xl2t,Uid:41a87b64-71f0-4f75-8a70-a695ab348265,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"9b94dfa95aa45776a3e7fca294a2cc3f5af51999de3fc617c8fe5637f88d7f22\"" Jan 13 20:52:03.857702 containerd[1540]: time="2025-01-13T20:52:03.857524310Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Jan 13 20:52:05.338973 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount835576266.mount: Deactivated successfully. Jan 13 20:52:05.655712 containerd[1540]: time="2025-01-13T20:52:05.655664632Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:05.656121 containerd[1540]: time="2025-01-13T20:52:05.656098334Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21764321" Jan 13 20:52:05.656727 containerd[1540]: time="2025-01-13T20:52:05.656393869Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:05.657469 containerd[1540]: time="2025-01-13T20:52:05.657447503Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:05.658178 containerd[1540]: time="2025-01-13T20:52:05.657899315Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 1.80035834s" Jan 13 20:52:05.658178 
containerd[1540]: time="2025-01-13T20:52:05.657916780Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Jan 13 20:52:05.658944 containerd[1540]: time="2025-01-13T20:52:05.658871860Z" level=info msg="CreateContainer within sandbox \"9b94dfa95aa45776a3e7fca294a2cc3f5af51999de3fc617c8fe5637f88d7f22\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 13 20:52:05.666505 containerd[1540]: time="2025-01-13T20:52:05.666456790Z" level=info msg="CreateContainer within sandbox \"9b94dfa95aa45776a3e7fca294a2cc3f5af51999de3fc617c8fe5637f88d7f22\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"05d1f01765aedc292d0bf50c61e000411af046f43256a76ccdd0382f0e4fb6ae\"" Jan 13 20:52:05.669953 containerd[1540]: time="2025-01-13T20:52:05.666754673Z" level=info msg="StartContainer for \"05d1f01765aedc292d0bf50c61e000411af046f43256a76ccdd0382f0e4fb6ae\"" Jan 13 20:52:05.687467 systemd[1]: Started cri-containerd-05d1f01765aedc292d0bf50c61e000411af046f43256a76ccdd0382f0e4fb6ae.scope - libcontainer container 05d1f01765aedc292d0bf50c61e000411af046f43256a76ccdd0382f0e4fb6ae. 
Jan 13 20:52:05.703376 containerd[1540]: time="2025-01-13T20:52:05.703338227Z" level=info msg="StartContainer for \"05d1f01765aedc292d0bf50c61e000411af046f43256a76ccdd0382f0e4fb6ae\" returns successfully" Jan 13 20:52:06.071877 kubelet[2861]: I0113 20:52:06.070930 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-qzbkm" podStartSLOduration=3.07089129 podStartE2EDuration="3.07089129s" podCreationTimestamp="2025-01-13 20:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:52:04.050468047 +0000 UTC m=+16.156752786" watchObservedRunningTime="2025-01-13 20:52:06.07089129 +0000 UTC m=+18.177176030" Jan 13 20:52:08.019173 kubelet[2861]: I0113 20:52:08.019140 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-c7ccbd65-6xl2t" podStartSLOduration=3.218300377 podStartE2EDuration="5.019109853s" podCreationTimestamp="2025-01-13 20:52:03 +0000 UTC" firstStartedPulling="2025-01-13 20:52:03.85728699 +0000 UTC m=+15.963571725" lastFinishedPulling="2025-01-13 20:52:05.658096471 +0000 UTC m=+17.764381201" observedRunningTime="2025-01-13 20:52:06.071531025 +0000 UTC m=+18.177815755" watchObservedRunningTime="2025-01-13 20:52:08.019109853 +0000 UTC m=+20.125394609" Jan 13 20:52:08.518767 kubelet[2861]: I0113 20:52:08.518658 2861 topology_manager.go:215] "Topology Admit Handler" podUID="aeba9dfe-23e2-4693-9190-c72e73d772ae" podNamespace="calico-system" podName="calico-typha-599994c46d-8kdkq" Jan 13 20:52:08.526903 systemd[1]: Created slice kubepods-besteffort-podaeba9dfe_23e2_4693_9190_c72e73d772ae.slice - libcontainer container kubepods-besteffort-podaeba9dfe_23e2_4693_9190_c72e73d772ae.slice. 
Jan 13 20:52:08.621925 kubelet[2861]: I0113 20:52:08.621827 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzp46\" (UniqueName: \"kubernetes.io/projected/aeba9dfe-23e2-4693-9190-c72e73d772ae-kube-api-access-wzp46\") pod \"calico-typha-599994c46d-8kdkq\" (UID: \"aeba9dfe-23e2-4693-9190-c72e73d772ae\") " pod="calico-system/calico-typha-599994c46d-8kdkq" Jan 13 20:52:08.621925 kubelet[2861]: I0113 20:52:08.621856 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/aeba9dfe-23e2-4693-9190-c72e73d772ae-typha-certs\") pod \"calico-typha-599994c46d-8kdkq\" (UID: \"aeba9dfe-23e2-4693-9190-c72e73d772ae\") " pod="calico-system/calico-typha-599994c46d-8kdkq" Jan 13 20:52:08.621925 kubelet[2861]: I0113 20:52:08.621872 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeba9dfe-23e2-4693-9190-c72e73d772ae-tigera-ca-bundle\") pod \"calico-typha-599994c46d-8kdkq\" (UID: \"aeba9dfe-23e2-4693-9190-c72e73d772ae\") " pod="calico-system/calico-typha-599994c46d-8kdkq" Jan 13 20:52:08.641411 kubelet[2861]: I0113 20:52:08.641389 2861 topology_manager.go:215] "Topology Admit Handler" podUID="5ae12788-a88d-4b93-b16c-86265eaf0a93" podNamespace="calico-system" podName="calico-node-kwhkn" Jan 13 20:52:08.646527 systemd[1]: Created slice kubepods-besteffort-pod5ae12788_a88d_4b93_b16c_86265eaf0a93.slice - libcontainer container kubepods-besteffort-pod5ae12788_a88d_4b93_b16c_86265eaf0a93.slice. 
Jan 13 20:52:08.722299 kubelet[2861]: I0113 20:52:08.722112 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-policysync\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722299 kubelet[2861]: I0113 20:52:08.722142 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrd4\" (UniqueName: \"kubernetes.io/projected/5ae12788-a88d-4b93-b16c-86265eaf0a93-kube-api-access-2mrd4\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722299 kubelet[2861]: I0113 20:52:08.722159 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ae12788-a88d-4b93-b16c-86265eaf0a93-tigera-ca-bundle\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722299 kubelet[2861]: I0113 20:52:08.722182 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5ae12788-a88d-4b93-b16c-86265eaf0a93-node-certs\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722299 kubelet[2861]: I0113 20:52:08.722197 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-var-run-calico\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722518 kubelet[2861]: I0113 20:52:08.722212 2861 
reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-bin-dir\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722518 kubelet[2861]: I0113 20:52:08.722236 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-var-lib-calico\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722518 kubelet[2861]: I0113 20:52:08.722252 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-xtables-lock\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722518 kubelet[2861]: I0113 20:52:08.722276 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-lib-modules\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.722518 kubelet[2861]: I0113 20:52:08.722290 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-log-dir\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.723073 kubelet[2861]: I0113 20:52:08.722304 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-flexvol-driver-host\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.723073 kubelet[2861]: I0113 20:52:08.722320 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-net-dir\") pod \"calico-node-kwhkn\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") " pod="calico-system/calico-node-kwhkn" Jan 13 20:52:08.781557 kubelet[2861]: I0113 20:52:08.781495 2861 topology_manager.go:215] "Topology Admit Handler" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" podNamespace="calico-system" podName="csi-node-driver-7qmqc" Jan 13 20:52:08.812379 kubelet[2861]: E0113 20:52:08.812182 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:08.855230 kubelet[2861]: E0113 20:52:08.855085 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.855230 kubelet[2861]: W0113 20:52:08.855104 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.855230 kubelet[2861]: E0113 20:52:08.855129 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.855503 kubelet[2861]: E0113 20:52:08.855439 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.855503 kubelet[2861]: W0113 20:52:08.855446 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.855503 kubelet[2861]: E0113 20:52:08.855453 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.855660 kubelet[2861]: E0113 20:52:08.855617 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.855660 kubelet[2861]: W0113 20:52:08.855623 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.855660 kubelet[2861]: E0113 20:52:08.855630 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.856445 kubelet[2861]: E0113 20:52:08.856389 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.856445 kubelet[2861]: W0113 20:52:08.856396 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.856445 kubelet[2861]: E0113 20:52:08.856407 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.856649 kubelet[2861]: E0113 20:52:08.856567 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.856649 kubelet[2861]: W0113 20:52:08.856573 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.856649 kubelet[2861]: E0113 20:52:08.856579 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.856802 kubelet[2861]: E0113 20:52:08.856727 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.856802 kubelet[2861]: W0113 20:52:08.856733 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.856802 kubelet[2861]: E0113 20:52:08.856740 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.856921 kubelet[2861]: E0113 20:52:08.856878 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.856921 kubelet[2861]: W0113 20:52:08.856885 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.856921 kubelet[2861]: E0113 20:52:08.856891 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.857097 kubelet[2861]: E0113 20:52:08.857025 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.857097 kubelet[2861]: W0113 20:52:08.857031 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.857097 kubelet[2861]: E0113 20:52:08.857037 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.861126 kubelet[2861]: E0113 20:52:08.857233 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861126 kubelet[2861]: W0113 20:52:08.857237 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861126 kubelet[2861]: E0113 20:52:08.857244 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.861126 kubelet[2861]: E0113 20:52:08.857327 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861126 kubelet[2861]: W0113 20:52:08.857332 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861126 kubelet[2861]: E0113 20:52:08.857338 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.861126 kubelet[2861]: E0113 20:52:08.857429 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861126 kubelet[2861]: W0113 20:52:08.857433 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861126 kubelet[2861]: E0113 20:52:08.857439 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.861126 kubelet[2861]: E0113 20:52:08.857692 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861298 kubelet[2861]: W0113 20:52:08.857699 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861298 kubelet[2861]: E0113 20:52:08.857705 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.861298 kubelet[2861]: E0113 20:52:08.857790 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861298 kubelet[2861]: W0113 20:52:08.857795 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861298 kubelet[2861]: E0113 20:52:08.857801 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.861298 kubelet[2861]: E0113 20:52:08.857881 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861298 kubelet[2861]: W0113 20:52:08.857885 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861298 kubelet[2861]: E0113 20:52:08.857892 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.861298 kubelet[2861]: E0113 20:52:08.857997 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861298 kubelet[2861]: W0113 20:52:08.858002 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861490 kubelet[2861]: E0113 20:52:08.858008 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.861490 kubelet[2861]: E0113 20:52:08.858271 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861490 kubelet[2861]: W0113 20:52:08.858276 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861490 kubelet[2861]: E0113 20:52:08.858283 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.861490 kubelet[2861]: E0113 20:52:08.858369 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861490 kubelet[2861]: W0113 20:52:08.858374 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861490 kubelet[2861]: E0113 20:52:08.858380 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.861490 kubelet[2861]: E0113 20:52:08.858461 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.861490 kubelet[2861]: W0113 20:52:08.858466 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.861490 kubelet[2861]: E0113 20:52:08.858471 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.870171 kubelet[2861]: E0113 20:52:08.858541 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.870171 kubelet[2861]: W0113 20:52:08.858548 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.870171 kubelet[2861]: E0113 20:52:08.858554 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.870171 kubelet[2861]: E0113 20:52:08.858641 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.870171 kubelet[2861]: W0113 20:52:08.858645 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.870171 kubelet[2861]: E0113 20:52:08.858651 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.870171 kubelet[2861]: E0113 20:52:08.858740 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.870171 kubelet[2861]: W0113 20:52:08.858745 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.870171 kubelet[2861]: E0113 20:52:08.858751 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.870998 containerd[1540]: time="2025-01-13T20:52:08.870911391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-599994c46d-8kdkq,Uid:aeba9dfe-23e2-4693-9190-c72e73d772ae,Namespace:calico-system,Attempt:0,}" Jan 13 20:52:08.922792 kubelet[2861]: E0113 20:52:08.922772 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.922792 kubelet[2861]: W0113 20:52:08.922783 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.922792 kubelet[2861]: E0113 20:52:08.922795 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.923304 kubelet[2861]: I0113 20:52:08.922841 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b0be92d-fb03-4015-90fc-415d37c2d78b-registration-dir\") pod \"csi-node-driver-7qmqc\" (UID: \"6b0be92d-fb03-4015-90fc-415d37c2d78b\") " pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:08.923304 kubelet[2861]: E0113 20:52:08.923028 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.923304 kubelet[2861]: W0113 20:52:08.923034 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.923304 kubelet[2861]: E0113 20:52:08.923054 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, 
skipping. Error: unexpected end of JSON input" Jan 13 20:52:08.923304 kubelet[2861]: I0113 20:52:08.923066 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b0be92d-fb03-4015-90fc-415d37c2d78b-kubelet-dir\") pod \"csi-node-driver-7qmqc\" (UID: \"6b0be92d-fb03-4015-90fc-415d37c2d78b\") " pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:08.923304 kubelet[2861]: E0113 20:52:08.923163 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.923304 kubelet[2861]: W0113 20:52:08.923168 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.923304 kubelet[2861]: E0113 20:52:08.923175 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.923455 kubelet[2861]: I0113 20:52:08.923186 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6b0be92d-fb03-4015-90fc-415d37c2d78b-varrun\") pod \"csi-node-driver-7qmqc\" (UID: \"6b0be92d-fb03-4015-90fc-415d37c2d78b\") " pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:08.923455 kubelet[2861]: E0113 20:52:08.923287 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.923455 kubelet[2861]: W0113 20:52:08.923292 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.923455 kubelet[2861]: E0113 20:52:08.923308 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.923455 kubelet[2861]: I0113 20:52:08.923326 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b0be92d-fb03-4015-90fc-415d37c2d78b-socket-dir\") pod \"csi-node-driver-7qmqc\" (UID: \"6b0be92d-fb03-4015-90fc-415d37c2d78b\") " pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:08.923455 kubelet[2861]: E0113 20:52:08.923443 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:08.923455 kubelet[2861]: W0113 20:52:08.923449 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:08.923455 kubelet[2861]: E0113 20:52:08.923456 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:08.923600 kubelet[2861]: I0113 20:52:08.923466 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxhlw\" (UniqueName: \"kubernetes.io/projected/6b0be92d-fb03-4015-90fc-415d37c2d78b-kube-api-access-qxhlw\") pod \"csi-node-driver-7qmqc\" (UID: \"6b0be92d-fb03-4015-90fc-415d37c2d78b\") " pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:08.960548 containerd[1540]: time="2025-01-13T20:52:08.960137862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kwhkn,Uid:5ae12788-a88d-4b93-b16c-86265eaf0a93,Namespace:calico-system,Attempt:0,}" Jan 13 20:52:09.059257 containerd[1540]: time="2025-01-13T20:52:09.059202260Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:09.060424 containerd[1540]: time="2025-01-13T20:52:09.059266143Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:09.060424 containerd[1540]: time="2025-01-13T20:52:09.059277517Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:09.060424 containerd[1540]: time="2025-01-13T20:52:09.059981983Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:09.076173 systemd[1]: Started cri-containerd-520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196.scope - libcontainer container 520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196. 
Jan 13 20:52:09.102910 containerd[1540]: time="2025-01-13T20:52:09.102890710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-599994c46d-8kdkq,Uid:aeba9dfe-23e2-4693-9190-c72e73d772ae,Namespace:calico-system,Attempt:0,} returns sandbox id \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\"" Jan 13 20:52:09.104556 containerd[1540]: time="2025-01-13T20:52:09.104439234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Jan 13 20:52:09.115419 containerd[1540]: time="2025-01-13T20:52:09.115250777Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:09.115419 containerd[1540]: time="2025-01-13T20:52:09.115291857Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:09.115419 containerd[1540]: time="2025-01-13T20:52:09.115299908Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:09.115419 containerd[1540]: time="2025-01-13T20:52:09.115364821Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:09.131592 systemd[1]: Started cri-containerd-3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78.scope - libcontainer container 3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78. 
Jan 13 20:52:09.147934 containerd[1540]: time="2025-01-13T20:52:09.147911425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kwhkn,Uid:5ae12788-a88d-4b93-b16c-86265eaf0a93,Namespace:calico-system,Attempt:0,} returns sandbox id \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\"" Jan 13 20:52:10.007242 kubelet[2861]: E0113 20:52:10.007112 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:10.745909 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount281544537.mount: Deactivated successfully. Jan 13 20:52:11.490875 containerd[1540]: time="2025-01-13T20:52:11.490401586Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:11.490875 containerd[1540]: time="2025-01-13T20:52:11.490764163Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Jan 13 20:52:11.490875 containerd[1540]: time="2025-01-13T20:52:11.490849246Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:11.492013 containerd[1540]: time="2025-01-13T20:52:11.492001219Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:11.492429 containerd[1540]: time="2025-01-13T20:52:11.492413738Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id 
\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.387956435s" Jan 13 20:52:11.492464 containerd[1540]: time="2025-01-13T20:52:11.492429931Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Jan 13 20:52:11.492896 containerd[1540]: time="2025-01-13T20:52:11.492876369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 13 20:52:11.501193 containerd[1540]: time="2025-01-13T20:52:11.501175109Z" level=info msg="CreateContainer within sandbox \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 13 20:52:11.506536 containerd[1540]: time="2025-01-13T20:52:11.506517368Z" level=info msg="CreateContainer within sandbox \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\"" Jan 13 20:52:11.506889 containerd[1540]: time="2025-01-13T20:52:11.506863200Z" level=info msg="StartContainer for \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\"" Jan 13 20:52:11.545578 systemd[1]: Started cri-containerd-0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6.scope - libcontainer container 0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6. 
Jan 13 20:52:11.575090 containerd[1540]: time="2025-01-13T20:52:11.575030764Z" level=info msg="StartContainer for \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\" returns successfully" Jan 13 20:52:12.007593 kubelet[2861]: E0113 20:52:12.007398 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:12.124235 kubelet[2861]: I0113 20:52:12.124208 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-599994c46d-8kdkq" podStartSLOduration=1.735808709 podStartE2EDuration="4.124185283s" podCreationTimestamp="2025-01-13 20:52:08 +0000 UTC" firstStartedPulling="2025-01-13 20:52:09.104222036 +0000 UTC m=+21.210506765" lastFinishedPulling="2025-01-13 20:52:11.492598609 +0000 UTC m=+23.598883339" observedRunningTime="2025-01-13 20:52:12.124082737 +0000 UTC m=+24.230367476" watchObservedRunningTime="2025-01-13 20:52:12.124185283 +0000 UTC m=+24.230470022" Jan 13 20:52:12.180957 kubelet[2861]: E0113 20:52:12.180893 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.180957 kubelet[2861]: W0113 20:52:12.180907 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.180957 kubelet[2861]: E0113 20:52:12.180920 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.181751 kubelet[2861]: E0113 20:52:12.181571 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.181751 kubelet[2861]: E0113 20:52:12.181669 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.181751 kubelet[2861]: W0113 20:52:12.181674 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.181913 kubelet[2861]: E0113 20:52:12.181679 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.181913 kubelet[2861]: E0113 20:52:12.181772 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.181913 kubelet[2861]: W0113 20:52:12.181776 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.181913 kubelet[2861]: E0113 20:52:12.181783 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.181913 kubelet[2861]: E0113 20:52:12.181852 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.181913 kubelet[2861]: W0113 20:52:12.181856 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.181913 kubelet[2861]: E0113 20:52:12.181862 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.182023 kubelet[2861]: E0113 20:52:12.181931 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.182023 kubelet[2861]: W0113 20:52:12.181936 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.182023 kubelet[2861]: E0113 20:52:12.181942 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.182023 kubelet[2861]: E0113 20:52:12.182011 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.182023 kubelet[2861]: W0113 20:52:12.182015 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.182023 kubelet[2861]: E0113 20:52:12.182020 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.182128 kubelet[2861]: E0113 20:52:12.182090 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.182128 kubelet[2861]: W0113 20:52:12.182094 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.182128 kubelet[2861]: E0113 20:52:12.182100 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.182178 kubelet[2861]: E0113 20:52:12.182172 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.182178 kubelet[2861]: W0113 20:52:12.182176 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.182212 kubelet[2861]: E0113 20:52:12.182181 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.182463 kubelet[2861]: E0113 20:52:12.182251 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.182463 kubelet[2861]: W0113 20:52:12.182255 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.182463 kubelet[2861]: E0113 20:52:12.182261 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.243185 kubelet[2861]: E0113 20:52:12.243162 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.243185 kubelet[2861]: W0113 20:52:12.243177 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.243185 kubelet[2861]: E0113 20:52:12.243191 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.243528 kubelet[2861]: E0113 20:52:12.243326 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.243528 kubelet[2861]: W0113 20:52:12.243332 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.243528 kubelet[2861]: E0113 20:52:12.243339 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.243528 kubelet[2861]: E0113 20:52:12.243458 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.243528 kubelet[2861]: W0113 20:52:12.243466 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.243528 kubelet[2861]: E0113 20:52:12.243479 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.243883 kubelet[2861]: E0113 20:52:12.243667 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.243883 kubelet[2861]: W0113 20:52:12.243672 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.243883 kubelet[2861]: E0113 20:52:12.243685 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.244149 kubelet[2861]: E0113 20:52:12.244068 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.244149 kubelet[2861]: W0113 20:52:12.244074 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.244149 kubelet[2861]: E0113 20:52:12.244087 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.244399 kubelet[2861]: E0113 20:52:12.244204 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.244399 kubelet[2861]: W0113 20:52:12.244208 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.244399 kubelet[2861]: E0113 20:52:12.244218 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.244399 kubelet[2861]: E0113 20:52:12.244374 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.244399 kubelet[2861]: W0113 20:52:12.244379 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.244399 kubelet[2861]: E0113 20:52:12.244394 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.244541 kubelet[2861]: E0113 20:52:12.244530 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.244541 kubelet[2861]: W0113 20:52:12.244538 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.244587 kubelet[2861]: E0113 20:52:12.244578 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.244648 kubelet[2861]: E0113 20:52:12.244640 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.244648 kubelet[2861]: W0113 20:52:12.244646 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.244694 kubelet[2861]: E0113 20:52:12.244683 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.244753 kubelet[2861]: E0113 20:52:12.244744 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.244753 kubelet[2861]: W0113 20:52:12.244750 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.244800 kubelet[2861]: E0113 20:52:12.244758 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.244856 kubelet[2861]: E0113 20:52:12.244847 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.244856 kubelet[2861]: W0113 20:52:12.244853 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.244899 kubelet[2861]: E0113 20:52:12.244867 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.244965 kubelet[2861]: E0113 20:52:12.244957 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.244965 kubelet[2861]: W0113 20:52:12.244964 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.245003 kubelet[2861]: E0113 20:52:12.244978 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.245088 kubelet[2861]: E0113 20:52:12.245078 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.245088 kubelet[2861]: W0113 20:52:12.245086 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.245130 kubelet[2861]: E0113 20:52:12.245094 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.245270 kubelet[2861]: E0113 20:52:12.245261 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.245270 kubelet[2861]: W0113 20:52:12.245268 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.245318 kubelet[2861]: E0113 20:52:12.245277 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.245387 kubelet[2861]: E0113 20:52:12.245376 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.245387 kubelet[2861]: W0113 20:52:12.245382 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.245715 kubelet[2861]: E0113 20:52:12.245389 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.245715 kubelet[2861]: E0113 20:52:12.245465 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.245715 kubelet[2861]: W0113 20:52:12.245470 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.245715 kubelet[2861]: E0113 20:52:12.245475 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:12.245715 kubelet[2861]: E0113 20:52:12.245577 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.245715 kubelet[2861]: W0113 20:52:12.245583 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.245715 kubelet[2861]: E0113 20:52:12.245588 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:12.245844 kubelet[2861]: E0113 20:52:12.245836 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:12.245844 kubelet[2861]: W0113 20:52:12.245843 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:12.245880 kubelet[2861]: E0113 20:52:12.245849 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:13.083618 kubelet[2861]: I0113 20:52:13.083574 2861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 20:52:13.086927 kubelet[2861]: E0113 20:52:13.086854 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.086927 kubelet[2861]: W0113 20:52:13.086864 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.086927 kubelet[2861]: E0113 20:52:13.086885 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:13.087033 kubelet[2861]: E0113 20:52:13.087013 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.087033 kubelet[2861]: W0113 20:52:13.087020 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.087033 kubelet[2861]: E0113 20:52:13.087031 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:13.087150 kubelet[2861]: E0113 20:52:13.087140 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.087150 kubelet[2861]: W0113 20:52:13.087149 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.087194 kubelet[2861]: E0113 20:52:13.087159 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:13.087280 kubelet[2861]: E0113 20:52:13.087270 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.087280 kubelet[2861]: W0113 20:52:13.087279 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.087328 kubelet[2861]: E0113 20:52:13.087290 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:13.087444 kubelet[2861]: E0113 20:52:13.087434 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.087444 kubelet[2861]: W0113 20:52:13.087443 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.087485 kubelet[2861]: E0113 20:52:13.087453 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:13.087570 kubelet[2861]: E0113 20:52:13.087561 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.087570 kubelet[2861]: W0113 20:52:13.087569 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.087618 kubelet[2861]: E0113 20:52:13.087580 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:13.087733 kubelet[2861]: E0113 20:52:13.087723 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.087759 kubelet[2861]: W0113 20:52:13.087733 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.087759 kubelet[2861]: E0113 20:52:13.087746 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:13.087877 kubelet[2861]: E0113 20:52:13.087867 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.087900 kubelet[2861]: W0113 20:52:13.087877 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.087900 kubelet[2861]: E0113 20:52:13.087887 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:13.088014 kubelet[2861]: E0113 20:52:13.088004 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.088014 kubelet[2861]: W0113 20:52:13.088013 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.088060 kubelet[2861]: E0113 20:52:13.088023 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:13.088141 kubelet[2861]: E0113 20:52:13.088131 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.088141 kubelet[2861]: W0113 20:52:13.088140 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.088187 kubelet[2861]: E0113 20:52:13.088150 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:13.088264 kubelet[2861]: E0113 20:52:13.088254 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.088264 kubelet[2861]: W0113 20:52:13.088263 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.088307 kubelet[2861]: E0113 20:52:13.088273 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:13.088401 kubelet[2861]: E0113 20:52:13.088392 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.088401 kubelet[2861]: W0113 20:52:13.088400 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.088453 kubelet[2861]: E0113 20:52:13.088411 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:52:13.088531 kubelet[2861]: E0113 20:52:13.088523 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.088551 kubelet[2861]: W0113 20:52:13.088531 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.088551 kubelet[2861]: E0113 20:52:13.088545 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:52:13.088663 kubelet[2861]: E0113 20:52:13.088654 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:52:13.088688 kubelet[2861]: W0113 20:52:13.088663 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:52:13.088688 kubelet[2861]: E0113 20:52:13.088675 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 13 20:52:13.088794 kubelet[2861]: E0113 20:52:13.088785 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.088818 kubelet[2861]: W0113 20:52:13.088794 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.088818 kubelet[2861]: E0113 20:52:13.088804 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.149208 kubelet[2861]: E0113 20:52:13.149182 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.149208 kubelet[2861]: W0113 20:52:13.149200 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.149325 kubelet[2861]: E0113 20:52:13.149219 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.149413 kubelet[2861]: E0113 20:52:13.149402 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.149413 kubelet[2861]: W0113 20:52:13.149411 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.149679 kubelet[2861]: E0113 20:52:13.149429 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.149679 kubelet[2861]: E0113 20:52:13.149625 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.149679 kubelet[2861]: W0113 20:52:13.149631 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.149679 kubelet[2861]: E0113 20:52:13.149649 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.149813 kubelet[2861]: E0113 20:52:13.149787 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.149813 kubelet[2861]: W0113 20:52:13.149794 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.149855 kubelet[2861]: E0113 20:52:13.149812 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.149947 kubelet[2861]: E0113 20:52:13.149935 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.149947 kubelet[2861]: W0113 20:52:13.149945 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.149994 kubelet[2861]: E0113 20:52:13.149958 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.150078 kubelet[2861]: E0113 20:52:13.150063 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.150078 kubelet[2861]: W0113 20:52:13.150073 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.150121 kubelet[2861]: E0113 20:52:13.150082 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.150238 kubelet[2861]: E0113 20:52:13.150228 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.150238 kubelet[2861]: W0113 20:52:13.150237 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.150461 kubelet[2861]: E0113 20:52:13.150255 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.150641 kubelet[2861]: E0113 20:52:13.150603 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.150641 kubelet[2861]: W0113 20:52:13.150613 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.150641 kubelet[2861]: E0113 20:52:13.150629 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.150753 kubelet[2861]: E0113 20:52:13.150741 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.150753 kubelet[2861]: W0113 20:52:13.150751 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.150801 kubelet[2861]: E0113 20:52:13.150761 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.150874 kubelet[2861]: E0113 20:52:13.150864 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.150899 kubelet[2861]: W0113 20:52:13.150874 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.150973 kubelet[2861]: E0113 20:52:13.150928 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.151096 kubelet[2861]: E0113 20:52:13.150994 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.151096 kubelet[2861]: W0113 20:52:13.151001 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.151096 kubelet[2861]: E0113 20:52:13.151029 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.151315 kubelet[2861]: E0113 20:52:13.151115 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.151315 kubelet[2861]: W0113 20:52:13.151121 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.151315 kubelet[2861]: E0113 20:52:13.151137 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.151315 kubelet[2861]: E0113 20:52:13.151271 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.151315 kubelet[2861]: W0113 20:52:13.151277 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.151315 kubelet[2861]: E0113 20:52:13.151295 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.151709 kubelet[2861]: E0113 20:52:13.151547 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.151709 kubelet[2861]: W0113 20:52:13.151554 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.151709 kubelet[2861]: E0113 20:52:13.151568 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.151709 kubelet[2861]: E0113 20:52:13.151690 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.151709 kubelet[2861]: W0113 20:52:13.151696 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.151796 kubelet[2861]: E0113 20:52:13.151714 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.151903 kubelet[2861]: E0113 20:52:13.151858 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.151903 kubelet[2861]: W0113 20:52:13.151864 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.151903 kubelet[2861]: E0113 20:52:13.151878 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.152164 kubelet[2861]: E0113 20:52:13.152095 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.152164 kubelet[2861]: W0113 20:52:13.152103 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.152164 kubelet[2861]: E0113 20:52:13.152115 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.152286 kubelet[2861]: E0113 20:52:13.152230 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:52:13.152286 kubelet[2861]: W0113 20:52:13.152255 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:52:13.152286 kubelet[2861]: E0113 20:52:13.152266 2861 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:52:13.550253 containerd[1540]: time="2025-01-13T20:52:13.549660273Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:52:13.550657 containerd[1540]: time="2025-01-13T20:52:13.550630867Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121"
Jan 13 20:52:13.555725 containerd[1540]: time="2025-01-13T20:52:13.555703587Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:52:13.556752 containerd[1540]: time="2025-01-13T20:52:13.556734571Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:52:13.557195 containerd[1540]: time="2025-01-13T20:52:13.557180407Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 2.064243138s"
Jan 13 20:52:13.557253 containerd[1540]: time="2025-01-13T20:52:13.557243648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\""
Jan 13 20:52:13.558324 containerd[1540]: time="2025-01-13T20:52:13.558311311Z" level=info msg="CreateContainer within sandbox \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 13 20:52:13.575216 containerd[1540]: time="2025-01-13T20:52:13.575186806Z" level=info msg="CreateContainer within sandbox \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\""
Jan 13 20:52:13.576191 containerd[1540]: time="2025-01-13T20:52:13.576176908Z" level=info msg="StartContainer for \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\""
Jan 13 20:52:13.601456 systemd[1]: Started cri-containerd-5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9.scope - libcontainer container 5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9.
Jan 13 20:52:13.625668 containerd[1540]: time="2025-01-13T20:52:13.625614881Z" level=info msg="StartContainer for \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\" returns successfully"
Jan 13 20:52:13.634756 systemd[1]: cri-containerd-5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9.scope: Deactivated successfully.
Jan 13 20:52:13.649272 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9-rootfs.mount: Deactivated successfully.
Jan 13 20:52:14.008036 kubelet[2861]: E0113 20:52:14.007272 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b"
Jan 13 20:52:14.127787 containerd[1540]: time="2025-01-13T20:52:14.117891113Z" level=info msg="shim disconnected" id=5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9 namespace=k8s.io
Jan 13 20:52:14.127787 containerd[1540]: time="2025-01-13T20:52:14.127705623Z" level=warning msg="cleaning up after shim disconnected" id=5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9 namespace=k8s.io
Jan 13 20:52:14.127787 containerd[1540]: time="2025-01-13T20:52:14.127715182Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:52:15.089125 containerd[1540]: time="2025-01-13T20:52:15.088874116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\""
Jan 13 20:52:16.008006 kubelet[2861]: E0113 20:52:16.007364 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b"
Jan 13 20:52:17.108282 kubelet[2861]: I0113 20:52:17.108180 2861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 20:52:18.007833 kubelet[2861]: E0113 20:52:18.007652 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b"
Jan 13 20:52:20.007942 kubelet[2861]: E0113 20:52:20.007172 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b"
Jan 13 20:52:20.721457 containerd[1540]: time="2025-01-13T20:52:20.721372927Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:52:20.722086 containerd[1540]: time="2025-01-13T20:52:20.721994867Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154"
Jan 13 20:52:20.722837 containerd[1540]: time="2025-01-13T20:52:20.722358514Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:52:20.724091 containerd[1540]: time="2025-01-13T20:52:20.724053013Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:52:20.724835 containerd[1540]: time="2025-01-13T20:52:20.724601865Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 5.63570527s"
Jan 13 20:52:20.724835 containerd[1540]: time="2025-01-13T20:52:20.724619100Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\""
Jan 13 20:52:20.728745 containerd[1540]: time="2025-01-13T20:52:20.728542892Z" level=info msg="CreateContainer within sandbox \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Jan 13 20:52:20.739041 containerd[1540]: time="2025-01-13T20:52:20.739009088Z" level=info msg="CreateContainer within sandbox \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\""
Jan 13 20:52:20.739556 containerd[1540]: time="2025-01-13T20:52:20.739520984Z" level=info msg="StartContainer for \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\""
Jan 13 20:52:20.810434 systemd[1]: Started cri-containerd-165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b.scope - libcontainer container 165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b.
Jan 13 20:52:20.849856 containerd[1540]: time="2025-01-13T20:52:20.849776691Z" level=info msg="StartContainer for \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\" returns successfully"
Jan 13 20:52:22.007679 kubelet[2861]: E0113 20:52:22.006979 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b"
Jan 13 20:52:24.007616 kubelet[2861]: E0113 20:52:24.007569 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b"
Jan 13 20:52:24.439969 systemd[1]: cri-containerd-165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b.scope: Deactivated successfully.
Jan 13 20:52:24.477869 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b-rootfs.mount: Deactivated successfully.
Jan 13 20:52:24.483570 containerd[1540]: time="2025-01-13T20:52:24.483521491Z" level=info msg="shim disconnected" id=165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b namespace=k8s.io
Jan 13 20:52:24.485877 containerd[1540]: time="2025-01-13T20:52:24.484017556Z" level=warning msg="cleaning up after shim disconnected" id=165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b namespace=k8s.io
Jan 13 20:52:24.485877 containerd[1540]: time="2025-01-13T20:52:24.484030311Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:52:24.494329 containerd[1540]: time="2025-01-13T20:52:24.494293999Z" level=warning msg="cleanup warnings time=\"2025-01-13T20:52:24Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Jan 13 20:52:24.503376 kubelet[2861]: I0113 20:52:24.503224 2861 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Jan 13 20:52:24.738075 kubelet[2861]: I0113 20:52:24.737771 2861 topology_manager.go:215] "Topology Admit Handler" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3" podNamespace="kube-system" podName="coredns-76f75df574-f9v7b"
Jan 13 20:52:24.781383 kubelet[2861]: I0113 20:52:24.781183 2861 topology_manager.go:215] "Topology Admit Handler" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef" podNamespace="calico-apiserver" podName="calico-apiserver-565485b44-5l4bw"
Jan 13 20:52:24.781523 kubelet[2861]: I0113 20:52:24.781468 2861 topology_manager.go:215] "Topology Admit Handler" podUID="b067aee8-97d0-47ca-9359-80c070636930" podNamespace="calico-system" podName="calico-kube-controllers-86544f5f57-nbx6b"
Jan 13 20:52:24.782142 kubelet[2861]: I0113 20:52:24.781560 2861 topology_manager.go:215] "Topology Admit Handler" podUID="42cec424-d73c-431a-b548-ae49975c9420" podNamespace="kube-system" podName="coredns-76f75df574-2wvd8"
Jan 13 20:52:24.782142 kubelet[2861]: I0113 20:52:24.781638 2861 topology_manager.go:215] "Topology Admit Handler" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966" podNamespace="calico-apiserver" podName="calico-apiserver-565485b44-cbpp6"
Jan 13 20:52:24.822616 systemd[1]: Created slice kubepods-besteffort-pod26da34a0_8538_4fcf_9a78_f93cb2d6a0ef.slice - libcontainer container kubepods-besteffort-pod26da34a0_8538_4fcf_9a78_f93cb2d6a0ef.slice.
Jan 13 20:52:24.825497 kubelet[2861]: I0113 20:52:24.825355 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnj9z\" (UniqueName: \"kubernetes.io/projected/dd0ed670-6fae-4b3f-8750-1689ff0c62c3-kube-api-access-bnj9z\") pod \"coredns-76f75df574-f9v7b\" (UID: \"dd0ed670-6fae-4b3f-8750-1689ff0c62c3\") " pod="kube-system/coredns-76f75df574-f9v7b"
Jan 13 20:52:24.825497 kubelet[2861]: I0113 20:52:24.825393 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c9c96bfa-10f6-4dae-9986-cb25e73d9966-calico-apiserver-certs\") pod \"calico-apiserver-565485b44-cbpp6\" (UID: \"c9c96bfa-10f6-4dae-9986-cb25e73d9966\") " pod="calico-apiserver/calico-apiserver-565485b44-cbpp6"
Jan 13 20:52:24.825497 kubelet[2861]: I0113 20:52:24.825412 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mvln\" (UniqueName: \"kubernetes.io/projected/b067aee8-97d0-47ca-9359-80c070636930-kube-api-access-8mvln\") pod \"calico-kube-controllers-86544f5f57-nbx6b\" (UID: \"b067aee8-97d0-47ca-9359-80c070636930\") " pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b"
Jan 13 20:52:24.825497 kubelet[2861]: I0113 20:52:24.825427 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b067aee8-97d0-47ca-9359-80c070636930-tigera-ca-bundle\") pod \"calico-kube-controllers-86544f5f57-nbx6b\" (UID: \"b067aee8-97d0-47ca-9359-80c070636930\") " pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b"
Jan 13 20:52:24.825497 kubelet[2861]: I0113 20:52:24.825441 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5grp5\" (UniqueName: \"kubernetes.io/projected/42cec424-d73c-431a-b548-ae49975c9420-kube-api-access-5grp5\") pod \"coredns-76f75df574-2wvd8\" (UID: \"42cec424-d73c-431a-b548-ae49975c9420\") " pod="kube-system/coredns-76f75df574-2wvd8"
Jan 13 20:52:24.825686 kubelet[2861]: I0113 20:52:24.825455 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd0ed670-6fae-4b3f-8750-1689ff0c62c3-config-volume\") pod \"coredns-76f75df574-f9v7b\" (UID: \"dd0ed670-6fae-4b3f-8750-1689ff0c62c3\") " pod="kube-system/coredns-76f75df574-f9v7b"
Jan 13 20:52:24.825686 kubelet[2861]: I0113 20:52:24.825468 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/26da34a0-8538-4fcf-9a78-f93cb2d6a0ef-calico-apiserver-certs\") pod \"calico-apiserver-565485b44-5l4bw\" (UID: \"26da34a0-8538-4fcf-9a78-f93cb2d6a0ef\") " pod="calico-apiserver/calico-apiserver-565485b44-5l4bw"
Jan 13 20:52:24.825686 kubelet[2861]: I0113 20:52:24.825480 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zjd\" (UniqueName: \"kubernetes.io/projected/26da34a0-8538-4fcf-9a78-f93cb2d6a0ef-kube-api-access-m8zjd\") pod \"calico-apiserver-565485b44-5l4bw\" (UID: \"26da34a0-8538-4fcf-9a78-f93cb2d6a0ef\") " pod="calico-apiserver/calico-apiserver-565485b44-5l4bw"
Jan 13 20:52:24.825686 kubelet[2861]: I0113 20:52:24.825491 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42cec424-d73c-431a-b548-ae49975c9420-config-volume\") pod \"coredns-76f75df574-2wvd8\" (UID: \"42cec424-d73c-431a-b548-ae49975c9420\") " pod="kube-system/coredns-76f75df574-2wvd8"
Jan 13 20:52:24.825686 kubelet[2861]: I0113 20:52:24.825504 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkc7r\" (UniqueName: \"kubernetes.io/projected/c9c96bfa-10f6-4dae-9986-cb25e73d9966-kube-api-access-bkc7r\") pod \"calico-apiserver-565485b44-cbpp6\" (UID: \"c9c96bfa-10f6-4dae-9986-cb25e73d9966\") " pod="calico-apiserver/calico-apiserver-565485b44-cbpp6"
Jan 13 20:52:24.829345 systemd[1]: Created slice kubepods-besteffort-podb067aee8_97d0_47ca_9359_80c070636930.slice - libcontainer container kubepods-besteffort-podb067aee8_97d0_47ca_9359_80c070636930.slice.
Jan 13 20:52:24.835170 systemd[1]: Created slice kubepods-burstable-pod42cec424_d73c_431a_b548_ae49975c9420.slice - libcontainer container kubepods-burstable-pod42cec424_d73c_431a_b548_ae49975c9420.slice.
Jan 13 20:52:24.842252 systemd[1]: Created slice kubepods-burstable-poddd0ed670_6fae_4b3f_8750_1689ff0c62c3.slice - libcontainer container kubepods-burstable-poddd0ed670_6fae_4b3f_8750_1689ff0c62c3.slice.
Jan 13 20:52:24.845743 systemd[1]: Created slice kubepods-besteffort-podc9c96bfa_10f6_4dae_9986_cb25e73d9966.slice - libcontainer container kubepods-besteffort-podc9c96bfa_10f6_4dae_9986_cb25e73d9966.slice.
Jan 13 20:52:25.125301 containerd[1540]: time="2025-01-13T20:52:25.125237011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:0,}"
Jan 13 20:52:25.215049 containerd[1540]: time="2025-01-13T20:52:25.214825073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:0,}"
Jan 13 20:52:25.215240 containerd[1540]: time="2025-01-13T20:52:25.215148835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:0,}"
Jan 13 20:52:25.215323 containerd[1540]: time="2025-01-13T20:52:25.215302747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:0,}"
Jan 13 20:52:25.215548 containerd[1540]: time="2025-01-13T20:52:25.215523341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:0,}"
Jan 13 20:52:25.343416 containerd[1540]: time="2025-01-13T20:52:25.343184656Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\""
Jan 13 20:52:26.022485 systemd[1]: Created slice kubepods-besteffort-pod6b0be92d_fb03_4015_90fc_415d37c2d78b.slice - libcontainer container kubepods-besteffort-pod6b0be92d_fb03_4015_90fc_415d37c2d78b.slice.
Jan 13 20:52:26.024101 containerd[1540]: time="2025-01-13T20:52:26.024078181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:0,}"
Jan 13 20:52:26.223221 containerd[1540]: time="2025-01-13T20:52:26.222784485Z" level=error msg="Failed to destroy network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.226743 containerd[1540]: time="2025-01-13T20:52:26.226545347Z" level=error msg="encountered an error cleaning up failed sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.226743 containerd[1540]: time="2025-01-13T20:52:26.226587108Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.226981 containerd[1540]: time="2025-01-13T20:52:26.226960939Z" level=error msg="Failed to destroy network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.227147 containerd[1540]: time="2025-01-13T20:52:26.227129280Z" level=error msg="encountered an error cleaning up failed sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.227174 containerd[1540]: time="2025-01-13T20:52:26.227159440Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.235870 containerd[1540]: time="2025-01-13T20:52:26.235854153Z" level=error msg="Failed to destroy network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.236153 containerd[1540]: time="2025-01-13T20:52:26.236076035Z" level=error msg="encountered an error cleaning up failed sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.236153 containerd[1540]: time="2025-01-13T20:52:26.236132083Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.246368 containerd[1540]: time="2025-01-13T20:52:26.246314974Z" level=error msg="Failed to destroy network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.246522 containerd[1540]: time="2025-01-13T20:52:26.246501303Z" level=error msg="encountered an error cleaning up failed sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.246556 containerd[1540]: time="2025-01-13T20:52:26.246534701Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.264773 containerd[1540]: time="2025-01-13T20:52:26.264746019Z" level=error msg="Failed to destroy network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.264953 containerd[1540]: time="2025-01-13T20:52:26.264938149Z" level=error msg="encountered an error cleaning up failed sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.264980 containerd[1540]: time="2025-01-13T20:52:26.264970041Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.265443 containerd[1540]: time="2025-01-13T20:52:26.265429428Z" level=error msg="Failed to destroy network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:26.265702 containerd[1540]: time="2025-01-13T20:52:26.265688585Z" level=error msg="encountered an error cleaning up failed sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:26.265806 containerd[1540]: time="2025-01-13T20:52:26.265749530Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:26.329455 kubelet[2861]: E0113 20:52:26.329358 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:26.330086 kubelet[2861]: E0113 20:52:26.329768 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:26.330086 kubelet[2861]: E0113 20:52:26.329900 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:26.330086 kubelet[2861]: E0113 20:52:26.329944 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:26.341811 kubelet[2861]: E0113 20:52:26.329980 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:26.341811 kubelet[2861]: E0113 20:52:26.330124 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:26.341811 kubelet[2861]: E0113 20:52:26.330139 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:26.341950 kubelet[2861]: E0113 20:52:26.330149 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:26.341950 kubelet[2861]: E0113 20:52:26.330169 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-f9v7b" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3" Jan 13 20:52:26.341950 kubelet[2861]: E0113 20:52:26.330185 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Jan 13 20:52:26.342052 kubelet[2861]: E0113 20:52:26.330188 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:26.342052 kubelet[2861]: E0113 20:52:26.330196 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:26.342052 kubelet[2861]: E0113 20:52:26.330203 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:26.342052 kubelet[2861]: E0113 20:52:26.330207 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:26.342138 kubelet[2861]: E0113 20:52:26.330220 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:26.342138 kubelet[2861]: E0113 20:52:26.330229 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podUID="b067aee8-97d0-47ca-9359-80c070636930" Jan 13 20:52:26.342138 kubelet[2861]: E0113 20:52:26.330231 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:26.342262 kubelet[2861]: E0113 20:52:26.330242 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:26.342262 kubelet[2861]: E0113 20:52:26.330250 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966" Jan 13 20:52:26.342262 kubelet[2861]: E0113 20:52:26.330259 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2wvd8" podUID="42cec424-d73c-431a-b548-ae49975c9420" 
Jan 13 20:52:26.342530 kubelet[2861]: E0113 20:52:26.330263 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:26.342530 kubelet[2861]: E0113 20:52:26.330276 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:26.342530 kubelet[2861]: E0113 20:52:26.330286 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:26.342599 kubelet[2861]: E0113 20:52:26.330306 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef" Jan 13 20:52:26.342599 kubelet[2861]: I0113 20:52:26.331804 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede" Jan 13 20:52:26.342599 kubelet[2861]: I0113 20:52:26.332590 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658" Jan 13 20:52:26.478069 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec-shm.mount: Deactivated successfully. Jan 13 20:52:26.478127 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd-shm.mount: Deactivated successfully. Jan 13 20:52:26.478310 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0-shm.mount: Deactivated successfully. Jan 13 20:52:26.478382 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab-shm.mount: Deactivated successfully. Jan 13 20:52:26.478428 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede-shm.mount: Deactivated successfully. 
Jan 13 20:52:26.503178 kubelet[2861]: I0113 20:52:26.503154 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec" Jan 13 20:52:26.569912 kubelet[2861]: I0113 20:52:26.569639 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd" Jan 13 20:52:26.570686 kubelet[2861]: I0113 20:52:26.570482 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab" Jan 13 20:52:26.571177 kubelet[2861]: I0113 20:52:26.571167 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0" Jan 13 20:52:26.871652 containerd[1540]: time="2025-01-13T20:52:26.871566277Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:26.871652 containerd[1540]: time="2025-01-13T20:52:26.871611272Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:26.883969 containerd[1540]: time="2025-01-13T20:52:26.871580273Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:26.883969 containerd[1540]: time="2025-01-13T20:52:26.883626749Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:26.883969 containerd[1540]: time="2025-01-13T20:52:26.883781344Z" level=info msg="Ensure that sandbox bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab in task-service has been cleanup successfully" Jan 13 20:52:26.883969 containerd[1540]: time="2025-01-13T20:52:26.883964351Z" level=info msg="TearDown network for sandbox 
\"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:26.883969 containerd[1540]: time="2025-01-13T20:52:26.883976201Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully" Jan 13 20:52:26.884199 containerd[1540]: time="2025-01-13T20:52:26.883782331Z" level=info msg="Ensure that sandbox 4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede in task-service has been cleanup successfully" Jan 13 20:52:26.885938 systemd[1]: run-netns-cni\x2dc845bd3e\x2d3cee\x2d8a66\x2d4b0d\x2dcdcd5410d683.mount: Deactivated successfully. Jan 13 20:52:26.886714 containerd[1540]: time="2025-01-13T20:52:26.886371672Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:26.886714 containerd[1540]: time="2025-01-13T20:52:26.886391963Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:26.886714 containerd[1540]: time="2025-01-13T20:52:26.886480782Z" level=info msg="Ensure that sandbox 4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd in task-service has been cleanup successfully" Jan 13 20:52:26.886714 containerd[1540]: time="2025-01-13T20:52:26.886667167Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:26.886714 containerd[1540]: time="2025-01-13T20:52:26.886677896Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:26.886012 systemd[1]: run-netns-cni\x2d031e8f67\x2d7242\x2db01c\x2d63c0\x2d321b1d30354e.mount: Deactivated successfully. 
Jan 13 20:52:26.886912 containerd[1540]: time="2025-01-13T20:52:26.886758674Z" level=info msg="Ensure that sandbox 261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658 in task-service has been cleanup successfully" Jan 13 20:52:26.886912 containerd[1540]: time="2025-01-13T20:52:26.886896469Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:26.886912 containerd[1540]: time="2025-01-13T20:52:26.886906479Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:26.888331 containerd[1540]: time="2025-01-13T20:52:26.887151193Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:26.888331 containerd[1540]: time="2025-01-13T20:52:26.887302819Z" level=info msg="Ensure that sandbox ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec in task-service has been cleanup successfully" Jan 13 20:52:26.888331 containerd[1540]: time="2025-01-13T20:52:26.887749126Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully" Jan 13 20:52:26.888331 containerd[1540]: time="2025-01-13T20:52:26.887761153Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully" Jan 13 20:52:26.888331 containerd[1540]: time="2025-01-13T20:52:26.887996120Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:26.888504 containerd[1540]: time="2025-01-13T20:52:26.888460461Z" level=info msg="Ensure that sandbox aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0 in task-service has been cleanup successfully" Jan 13 20:52:26.889547 containerd[1540]: time="2025-01-13T20:52:26.888965083Z" level=info msg="TearDown network for sandbox 
\"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully" Jan 13 20:52:26.889547 containerd[1540]: time="2025-01-13T20:52:26.888979914Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:26.891526 systemd[1]: run-netns-cni\x2d6968b0d8\x2d20a0\x2df4db\x2d6e7c\x2d2afd1a558f0e.mount: Deactivated successfully. Jan 13 20:52:26.891669 systemd[1]: run-netns-cni\x2d81000610\x2d3891\x2d2793\x2db3b1\x2d731b1b7ec520.mount: Deactivated successfully. Jan 13 20:52:26.891838 systemd[1]: run-netns-cni\x2dba4a754c\x2d3882\x2d036b\x2dc341\x2d2ed7d8309e5a.mount: Deactivated successfully. Jan 13 20:52:26.891921 systemd[1]: run-netns-cni\x2d19328478\x2d7209\x2dac59\x2d2546\x2d785247320b78.mount: Deactivated successfully. Jan 13 20:52:26.906013 containerd[1540]: time="2025-01-13T20:52:26.905981817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:1,}" Jan 13 20:52:26.906186 containerd[1540]: time="2025-01-13T20:52:26.906170116Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:1,}" Jan 13 20:52:26.906309 containerd[1540]: time="2025-01-13T20:52:26.906293402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:1,}" Jan 13 20:52:26.908231 containerd[1540]: time="2025-01-13T20:52:26.908201295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:1,}" Jan 13 20:52:26.909057 containerd[1540]: time="2025-01-13T20:52:26.909037517Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:1,}" Jan 13 20:52:26.909448 containerd[1540]: time="2025-01-13T20:52:26.909331322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:1,}" Jan 13 20:52:27.169903 containerd[1540]: time="2025-01-13T20:52:27.169868509Z" level=error msg="Failed to destroy network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.173609 containerd[1540]: time="2025-01-13T20:52:27.173584679Z" level=error msg="encountered an error cleaning up failed sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.173667 containerd[1540]: time="2025-01-13T20:52:27.173632346Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.184008 containerd[1540]: time="2025-01-13T20:52:27.178265729Z" level=error msg="Failed to destroy network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.184008 containerd[1540]: time="2025-01-13T20:52:27.178450367Z" level=error msg="encountered an error cleaning up failed sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.184008 containerd[1540]: time="2025-01-13T20:52:27.178481391Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.184142 kubelet[2861]: E0113 20:52:27.178678 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.184142 kubelet[2861]: E0113 20:52:27.178723 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:27.184142 kubelet[2861]: E0113 20:52:27.178742 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:27.184206 kubelet[2861]: E0113 20:52:27.178774 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podUID="b067aee8-97d0-47ca-9359-80c070636930" Jan 13 20:52:27.188961 containerd[1540]: time="2025-01-13T20:52:27.188901160Z" level=error msg="Failed to destroy network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.189273 containerd[1540]: time="2025-01-13T20:52:27.188902566Z" level=error msg="Failed 
to destroy network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.189273 containerd[1540]: time="2025-01-13T20:52:27.189111326Z" level=error msg="encountered an error cleaning up failed sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.189273 containerd[1540]: time="2025-01-13T20:52:27.189144710Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.189273 containerd[1540]: time="2025-01-13T20:52:27.189229520Z" level=error msg="encountered an error cleaning up failed sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.189273 containerd[1540]: time="2025-01-13T20:52:27.189254655Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed 
to setup network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.189423 kubelet[2861]: E0113 20:52:27.189290 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.195652 containerd[1540]: time="2025-01-13T20:52:27.195614522Z" level=error msg="Failed to destroy network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.196108 kubelet[2861]: E0113 20:52:27.196016 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:27.196108 kubelet[2861]: E0113 20:52:27.196102 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:27.196187 kubelet[2861]: E0113 20:52:27.196148 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2wvd8" podUID="42cec424-d73c-431a-b548-ae49975c9420" Jan 13 20:52:27.196319 containerd[1540]: time="2025-01-13T20:52:27.196049926Z" level=error msg="encountered an error cleaning up failed sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.196319 containerd[1540]: time="2025-01-13T20:52:27.196277313Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.196406 kubelet[2861]: E0113 20:52:27.196330 2861 remote_runtime.go:193] "RunPodSandbox 
from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.196406 kubelet[2861]: E0113 20:52:27.196370 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:27.196406 kubelet[2861]: E0113 20:52:27.196383 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:27.196557 kubelet[2861]: E0113 20:52:27.196405 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef" Jan 13 20:52:27.196914 kubelet[2861]: E0113 20:52:27.196722 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.196914 kubelet[2861]: E0113 20:52:27.196752 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:27.196914 kubelet[2861]: E0113 20:52:27.196763 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.196914 kubelet[2861]: E0113 20:52:27.196780 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:27.197028 kubelet[2861]: E0113 20:52:27.196791 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:27.197028 kubelet[2861]: E0113 20:52:27.196813 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-f9v7b" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3" Jan 13 20:52:27.197028 kubelet[2861]: E0113 20:52:27.196866 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:27.197122 kubelet[2861]: E0113 20:52:27.196900 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:27.201748 containerd[1540]: time="2025-01-13T20:52:27.201692098Z" level=error msg="Failed to destroy network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.202086 containerd[1540]: time="2025-01-13T20:52:27.202073154Z" level=error msg="encountered an error cleaning up failed sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.202226 containerd[1540]: time="2025-01-13T20:52:27.202182971Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.202537 kubelet[2861]: E0113 20:52:27.202433 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:27.202537 kubelet[2861]: E0113 20:52:27.202465 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:27.202537 kubelet[2861]: E0113 20:52:27.202480 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:27.202742 kubelet[2861]: E0113 20:52:27.202516 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966" Jan 13 20:52:27.478832 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b-shm.mount: Deactivated successfully. Jan 13 20:52:27.574642 kubelet[2861]: I0113 20:52:27.574471 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509" Jan 13 20:52:27.575545 containerd[1540]: time="2025-01-13T20:52:27.575161932Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:27.575545 containerd[1540]: time="2025-01-13T20:52:27.575318321Z" level=info msg="Ensure that sandbox 1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509 in task-service has been cleanup successfully" Jan 13 20:52:27.578311 containerd[1540]: time="2025-01-13T20:52:27.577373549Z" level=info msg="TearDown network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully" Jan 13 20:52:27.578311 containerd[1540]: time="2025-01-13T20:52:27.577393370Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully" Jan 13 20:52:27.578390 systemd[1]: run-netns-cni\x2d5a60eed0\x2da4b8\x2dfcad\x2d6ad9\x2dd3741359ac35.mount: Deactivated successfully. 
Jan 13 20:52:27.582269 containerd[1540]: time="2025-01-13T20:52:27.582248503Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:27.582645 containerd[1540]: time="2025-01-13T20:52:27.582515737Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:27.582645 containerd[1540]: time="2025-01-13T20:52:27.582526958Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully" Jan 13 20:52:27.582756 kubelet[2861]: I0113 20:52:27.582531 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b" Jan 13 20:52:27.583142 containerd[1540]: time="2025-01-13T20:52:27.583125459Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:27.583317 containerd[1540]: time="2025-01-13T20:52:27.583262284Z" level=info msg="Ensure that sandbox 96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b in task-service has been cleanup successfully" Jan 13 20:52:27.583786 containerd[1540]: time="2025-01-13T20:52:27.583760311Z" level=info msg="TearDown network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:27.583786 containerd[1540]: time="2025-01-13T20:52:27.583775105Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:27.586195 containerd[1540]: time="2025-01-13T20:52:27.584179656Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:27.586195 containerd[1540]: time="2025-01-13T20:52:27.584232139Z" level=info msg="TearDown network for sandbox 
\"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:27.586195 containerd[1540]: time="2025-01-13T20:52:27.584238783Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:27.586195 containerd[1540]: time="2025-01-13T20:52:27.584344892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:52:27.585620 systemd[1]: run-netns-cni\x2d2153837c\x2dfb7b\x2d0d07\x2d367e\x2d96005e8fa067.mount: Deactivated successfully. Jan 13 20:52:27.590467 systemd[1]: run-netns-cni\x2d62b58a49\x2db546\x2d2aad\x2d2993\x2d4e297dea6828.mount: Deactivated successfully. Jan 13 20:52:27.590870 kubelet[2861]: I0113 20:52:27.587975 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d" Jan 13 20:52:27.590870 kubelet[2861]: I0113 20:52:27.588913 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.587237707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:2,}" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.588416965Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.588521645Z" level=info msg="Ensure that sandbox ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d in task-service has been cleanup successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.588883326Z" level=info msg="TearDown network 
for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.588892464Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" returns successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.589517171Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.589839510Z" level=info msg="Ensure that sandbox 6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a in task-service has been cleanup successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.589981459Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.589993085Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.589983318Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.590056758Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.590064807Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.590933064Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.590985983Z" level=info 
msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:27.591031 containerd[1540]: time="2025-01-13T20:52:27.590993304Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.591043840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:2,}" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.592166226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:2,}" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.593082956Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.593204392Z" level=info msg="Ensure that sandbox 2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6 in task-service has been cleanup successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.593989592Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.593999356Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.594393305Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.594495762Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" 
successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.594503861Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.594705713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:2,}" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.594873096Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.594959577Z" level=info msg="Ensure that sandbox 3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588 in task-service has been cleanup successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.595046075Z" level=info msg="TearDown network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.595053094Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.595328079Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.595422543Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.595429721Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:27.599691 containerd[1540]: time="2025-01-13T20:52:27.595641020Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:52:27.593023 systemd[1]: run-netns-cni\x2dceaf5317\x2d188d\x2dc97f\x2db0ef\x2db06ba2a9ca06.mount: Deactivated successfully. Jan 13 20:52:27.606392 kubelet[2861]: I0113 20:52:27.592834 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6" Jan 13 20:52:27.606392 kubelet[2861]: I0113 20:52:27.594564 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588" Jan 13 20:52:28.044816 containerd[1540]: time="2025-01-13T20:52:28.044677832Z" level=error msg="Failed to destroy network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.045526 containerd[1540]: time="2025-01-13T20:52:28.045054949Z" level=error msg="encountered an error cleaning up failed sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.045526 containerd[1540]: time="2025-01-13T20:52:28.045092182Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.058278 containerd[1540]: time="2025-01-13T20:52:28.058239553Z" level=error msg="Failed to destroy network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.058697 containerd[1540]: time="2025-01-13T20:52:28.058528443Z" level=error msg="encountered an error cleaning up failed sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.058697 containerd[1540]: time="2025-01-13T20:52:28.058590065Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.064300 kubelet[2861]: E0113 20:52:28.064275 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.064411 kubelet[2861]: E0113 20:52:28.064322 
2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:28.064411 kubelet[2861]: E0113 20:52:28.064386 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:28.065763 kubelet[2861]: E0113 20:52:28.065690 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-f9v7b" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3" Jan 13 20:52:28.068426 containerd[1540]: time="2025-01-13T20:52:28.068272460Z" level=error msg="Failed to destroy network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.068863 containerd[1540]: time="2025-01-13T20:52:28.068763091Z" level=error msg="encountered an error cleaning up failed sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.068863 containerd[1540]: time="2025-01-13T20:52:28.068807602Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.068950 containerd[1540]: time="2025-01-13T20:52:28.068929146Z" level=error msg="Failed to destroy network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.069426 containerd[1540]: time="2025-01-13T20:52:28.069402905Z" level=error msg="encountered an error cleaning up failed sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.069466 containerd[1540]: 
time="2025-01-13T20:52:28.069432811Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.070465 containerd[1540]: time="2025-01-13T20:52:28.070444619Z" level=error msg="Failed to destroy network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.071476 containerd[1540]: time="2025-01-13T20:52:28.071458127Z" level=error msg="encountered an error cleaning up failed sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.071510 containerd[1540]: time="2025-01-13T20:52:28.071488137Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.074887 containerd[1540]: time="2025-01-13T20:52:28.074856828Z" 
level=error msg="Failed to destroy network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.075162 containerd[1540]: time="2025-01-13T20:52:28.075135614Z" level=error msg="encountered an error cleaning up failed sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.075198 containerd[1540]: time="2025-01-13T20:52:28.075185035Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.080863 kubelet[2861]: E0113 20:52:28.080517 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.080863 kubelet[2861]: E0113 20:52:28.080553 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:28.080863 kubelet[2861]: E0113 20:52:28.080567 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:28.080981 kubelet[2861]: E0113 20:52:28.080607 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef" Jan 13 20:52:28.080981 kubelet[2861]: E0113 20:52:28.080664 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.080981 kubelet[2861]: E0113 20:52:28.080678 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:28.081073 kubelet[2861]: E0113 20:52:28.080688 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:28.081073 kubelet[2861]: E0113 20:52:28.080707 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2wvd8" podUID="42cec424-d73c-431a-b548-ae49975c9420" Jan 13 20:52:28.081073 kubelet[2861]: E0113 20:52:28.080722 2861 remote_runtime.go:193] "RunPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.081144 kubelet[2861]: E0113 20:52:28.080733 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:28.081144 kubelet[2861]: E0113 20:52:28.080743 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:28.081144 kubelet[2861]: E0113 20:52:28.080777 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podUID="b067aee8-97d0-47ca-9359-80c070636930" Jan 13 20:52:28.081218 kubelet[2861]: E0113 20:52:28.080809 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.081218 kubelet[2861]: E0113 20:52:28.080824 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:28.081218 kubelet[2861]: E0113 20:52:28.080842 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:28.081274 kubelet[2861]: E0113 20:52:28.080866 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966" Jan 13 20:52:28.081312 kubelet[2861]: E0113 20:52:28.081306 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.081336 kubelet[2861]: E0113 20:52:28.081323 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:28.082294 kubelet[2861]: E0113 20:52:28.081363 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:28.082294 kubelet[2861]: 
E0113 20:52:28.081391 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:28.478912 systemd[1]: run-netns-cni\x2d3d057dde\x2d7492\x2d6fd1\x2d1d05\x2d9d6ddf43efbc.mount: Deactivated successfully. Jan 13 20:52:28.479161 systemd[1]: run-netns-cni\x2d7b7a726f\x2d7ddc\x2dc6fd\x2d5721\x2d2b17dbcbb4d2.mount: Deactivated successfully. 
Jan 13 20:52:28.596897 kubelet[2861]: I0113 20:52:28.596867 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7" Jan 13 20:52:28.597567 containerd[1540]: time="2025-01-13T20:52:28.597426478Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" Jan 13 20:52:28.598633 kubelet[2861]: I0113 20:52:28.597969 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3" Jan 13 20:52:28.599119 containerd[1540]: time="2025-01-13T20:52:28.598223432Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" Jan 13 20:52:28.599119 containerd[1540]: time="2025-01-13T20:52:28.598342790Z" level=info msg="Ensure that sandbox f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3 in task-service has been cleanup successfully" Jan 13 20:52:28.599119 containerd[1540]: time="2025-01-13T20:52:28.598491649Z" level=info msg="TearDown network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" successfully" Jan 13 20:52:28.599119 containerd[1540]: time="2025-01-13T20:52:28.598500051Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" returns successfully" Jan 13 20:52:28.599119 containerd[1540]: time="2025-01-13T20:52:28.598592450Z" level=info msg="Ensure that sandbox ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7 in task-service has been cleanup successfully" Jan 13 20:52:28.599119 containerd[1540]: time="2025-01-13T20:52:28.598892105Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:28.599119 containerd[1540]: time="2025-01-13T20:52:28.598952608Z" level=info msg="TearDown network for sandbox 
\"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully" Jan 13 20:52:28.599119 containerd[1540]: time="2025-01-13T20:52:28.598963410Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully" Jan 13 20:52:28.600294 containerd[1540]: time="2025-01-13T20:52:28.599526612Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:28.600294 containerd[1540]: time="2025-01-13T20:52:28.599573934Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:28.600294 containerd[1540]: time="2025-01-13T20:52:28.599580573Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:28.600294 containerd[1540]: time="2025-01-13T20:52:28.599953620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:52:28.601837 containerd[1540]: time="2025-01-13T20:52:28.601814644Z" level=info msg="TearDown network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" successfully" Jan 13 20:52:28.601847 systemd[1]: run-netns-cni\x2d38158153\x2d4332\x2d3712\x2da72b\x2dc094bf86259d.mount: Deactivated successfully. Jan 13 20:52:28.601951 systemd[1]: run-netns-cni\x2d9d771733\x2dc44e\x2de7ef\x2d7d31\x2d33f5ad60e5bd.mount: Deactivated successfully. 
Jan 13 20:52:28.605433 containerd[1540]: time="2025-01-13T20:52:28.604600306Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" returns successfully" Jan 13 20:52:28.605819 kubelet[2861]: I0113 20:52:28.605804 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4" Jan 13 20:52:28.607743 containerd[1540]: time="2025-01-13T20:52:28.607088045Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:28.607743 containerd[1540]: time="2025-01-13T20:52:28.607157101Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully" Jan 13 20:52:28.607743 containerd[1540]: time="2025-01-13T20:52:28.607165807Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully" Jan 13 20:52:28.608002 containerd[1540]: time="2025-01-13T20:52:28.607895422Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:28.608002 containerd[1540]: time="2025-01-13T20:52:28.607958497Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully" Jan 13 20:52:28.608002 containerd[1540]: time="2025-01-13T20:52:28.607968202Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:28.608204 containerd[1540]: time="2025-01-13T20:52:28.608093635Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:28.608318 containerd[1540]: time="2025-01-13T20:52:28.608305842Z" level=info msg="Ensure that sandbox c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4 in task-service has 
been cleanup successfully" Jan 13 20:52:28.608961 containerd[1540]: time="2025-01-13T20:52:28.608949541Z" level=info msg="TearDown network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" successfully" Jan 13 20:52:28.609016 containerd[1540]: time="2025-01-13T20:52:28.609008633Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" returns successfully" Jan 13 20:52:28.609145 containerd[1540]: time="2025-01-13T20:52:28.609013362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:3,}" Jan 13 20:52:28.610577 containerd[1540]: time="2025-01-13T20:52:28.610559017Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:28.610666 containerd[1540]: time="2025-01-13T20:52:28.610618199Z" level=info msg="TearDown network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:28.610666 containerd[1540]: time="2025-01-13T20:52:28.610627974Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:28.611848 containerd[1540]: time="2025-01-13T20:52:28.610954953Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:28.611979 kubelet[2861]: I0113 20:52:28.611478 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b" Jan 13 20:52:28.612546 containerd[1540]: time="2025-01-13T20:52:28.612190339Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:28.612546 containerd[1540]: time="2025-01-13T20:52:28.612203991Z" level=info 
msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:28.612546 containerd[1540]: time="2025-01-13T20:52:28.612337485Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\"" Jan 13 20:52:28.612546 containerd[1540]: time="2025-01-13T20:52:28.612470893Z" level=info msg="Ensure that sandbox 206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b in task-service has been cleanup successfully" Jan 13 20:52:28.612273 systemd[1]: run-netns-cni\x2d7c5bab18\x2d12c0\x2d2b70\x2db442\x2d2f7b9603c40d.mount: Deactivated successfully. Jan 13 20:52:28.613372 containerd[1540]: time="2025-01-13T20:52:28.612922700Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:3,}" Jan 13 20:52:28.614534 containerd[1540]: time="2025-01-13T20:52:28.614037747Z" level=info msg="TearDown network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" successfully" Jan 13 20:52:28.614534 containerd[1540]: time="2025-01-13T20:52:28.614054871Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" returns successfully" Jan 13 20:52:28.615960 systemd[1]: run-netns-cni\x2df336a57a\x2d2903\x2da321\x2d9c16\x2dbbfd8f8368a0.mount: Deactivated successfully. 
Jan 13 20:52:28.617102 containerd[1540]: time="2025-01-13T20:52:28.616345093Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" Jan 13 20:52:28.617102 containerd[1540]: time="2025-01-13T20:52:28.616515852Z" level=info msg="TearDown network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" successfully" Jan 13 20:52:28.617102 containerd[1540]: time="2025-01-13T20:52:28.616958788Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" returns successfully" Jan 13 20:52:28.617383 containerd[1540]: time="2025-01-13T20:52:28.617209117Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:28.617383 containerd[1540]: time="2025-01-13T20:52:28.617256710Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully" Jan 13 20:52:28.617383 containerd[1540]: time="2025-01-13T20:52:28.617265183Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully" Jan 13 20:52:28.617827 kubelet[2861]: I0113 20:52:28.617606 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134" Jan 13 20:52:28.618086 containerd[1540]: time="2025-01-13T20:52:28.617928526Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:28.618086 containerd[1540]: time="2025-01-13T20:52:28.618057334Z" level=info msg="Ensure that sandbox 3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134 in task-service has been cleanup successfully" Jan 13 20:52:28.618340 containerd[1540]: time="2025-01-13T20:52:28.618328900Z" level=info msg="TearDown network for sandbox 
\"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" successfully" Jan 13 20:52:28.618473 containerd[1540]: time="2025-01-13T20:52:28.618396980Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" returns successfully" Jan 13 20:52:28.618720 containerd[1540]: time="2025-01-13T20:52:28.618702524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:3,}" Jan 13 20:52:28.620640 containerd[1540]: time="2025-01-13T20:52:28.618778714Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:28.620640 containerd[1540]: time="2025-01-13T20:52:28.618825046Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:28.620640 containerd[1540]: time="2025-01-13T20:52:28.618839215Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:28.620640 containerd[1540]: time="2025-01-13T20:52:28.619687977Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:28.621594 systemd[1]: run-netns-cni\x2d0e38424f\x2dfa95\x2d0990\x2d6227\x2da0f4cf35efdc.mount: Deactivated successfully. 
Jan 13 20:52:28.622246 containerd[1540]: time="2025-01-13T20:52:28.622090092Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:28.622246 containerd[1540]: time="2025-01-13T20:52:28.622244907Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:28.623576 containerd[1540]: time="2025-01-13T20:52:28.623551731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:3,}" Jan 13 20:52:28.624105 kubelet[2861]: I0113 20:52:28.624088 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5" Jan 13 20:52:28.624454 containerd[1540]: time="2025-01-13T20:52:28.624426352Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" Jan 13 20:52:28.625080 containerd[1540]: time="2025-01-13T20:52:28.624635697Z" level=info msg="Ensure that sandbox 7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5 in task-service has been cleanup successfully" Jan 13 20:52:28.625080 containerd[1540]: time="2025-01-13T20:52:28.624890836Z" level=info msg="TearDown network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" successfully" Jan 13 20:52:28.625080 containerd[1540]: time="2025-01-13T20:52:28.624900117Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" returns successfully" Jan 13 20:52:28.626064 containerd[1540]: time="2025-01-13T20:52:28.625469582Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:28.626064 containerd[1540]: time="2025-01-13T20:52:28.625525742Z" level=info msg="TearDown network for sandbox 
\"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully" Jan 13 20:52:28.626064 containerd[1540]: time="2025-01-13T20:52:28.625535565Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully" Jan 13 20:52:28.626064 containerd[1540]: time="2025-01-13T20:52:28.625696245Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:28.626064 containerd[1540]: time="2025-01-13T20:52:28.625748526Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:28.626064 containerd[1540]: time="2025-01-13T20:52:28.625756486Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully" Jan 13 20:52:28.626064 containerd[1540]: time="2025-01-13T20:52:28.626030976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:52:28.817848 containerd[1540]: time="2025-01-13T20:52:28.816813721Z" level=error msg="Failed to destroy network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.817848 containerd[1540]: time="2025-01-13T20:52:28.817037673Z" level=error msg="encountered an error cleaning up failed sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.817848 
containerd[1540]: time="2025-01-13T20:52:28.817089141Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.818305 kubelet[2861]: E0113 20:52:28.818083 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.818305 kubelet[2861]: E0113 20:52:28.818118 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:28.818305 kubelet[2861]: E0113 20:52:28.818135 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 
13 20:52:28.818650 kubelet[2861]: E0113 20:52:28.818182 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef" Jan 13 20:52:28.831380 containerd[1540]: time="2025-01-13T20:52:28.830709833Z" level=error msg="Failed to destroy network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.831380 containerd[1540]: time="2025-01-13T20:52:28.831000458Z" level=error msg="encountered an error cleaning up failed sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.831380 containerd[1540]: time="2025-01-13T20:52:28.831044076Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox 
\"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.831665 kubelet[2861]: E0113 20:52:28.831176 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.831665 kubelet[2861]: E0113 20:52:28.831210 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:28.831665 kubelet[2861]: E0113 20:52:28.831230 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:28.831745 kubelet[2861]: E0113 20:52:28.831270 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:28.854509 containerd[1540]: time="2025-01-13T20:52:28.854476783Z" level=error msg="Failed to destroy network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.854913 containerd[1540]: time="2025-01-13T20:52:28.854896453Z" level=error msg="encountered an error cleaning up failed sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.855009 containerd[1540]: time="2025-01-13T20:52:28.854993429Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.855548 kubelet[2861]: E0113 20:52:28.855256 2861 remote_runtime.go:193] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.855548 kubelet[2861]: E0113 20:52:28.855299 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:28.855548 kubelet[2861]: E0113 20:52:28.855317 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:28.855656 kubelet[2861]: E0113 20:52:28.855532 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2wvd8" podUID="42cec424-d73c-431a-b548-ae49975c9420" Jan 13 20:52:28.858205 containerd[1540]: time="2025-01-13T20:52:28.858165921Z" level=error msg="Failed to destroy network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.859036 containerd[1540]: time="2025-01-13T20:52:28.859016814Z" level=error msg="encountered an error cleaning up failed sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.859223 containerd[1540]: time="2025-01-13T20:52:28.859140672Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.860078 containerd[1540]: time="2025-01-13T20:52:28.860056840Z" level=error msg="Failed to destroy network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.860489 containerd[1540]: time="2025-01-13T20:52:28.860458624Z" level=error msg="encountered 
an error cleaning up failed sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.860544 kubelet[2861]: E0113 20:52:28.860462 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.860544 kubelet[2861]: E0113 20:52:28.860511 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:28.860544 kubelet[2861]: E0113 20:52:28.860532 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:28.860753 kubelet[2861]: E0113 20:52:28.860572 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966" Jan 13 20:52:28.861127 containerd[1540]: time="2025-01-13T20:52:28.860603986Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.861207 kubelet[2861]: E0113 20:52:28.860940 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.861207 kubelet[2861]: E0113 20:52:28.860972 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:28.861207 kubelet[2861]: E0113 20:52:28.860985 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:28.861333 kubelet[2861]: E0113 20:52:28.861069 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podUID="b067aee8-97d0-47ca-9359-80c070636930" Jan 13 20:52:28.863792 containerd[1540]: time="2025-01-13T20:52:28.863696358Z" level=error msg="Failed to destroy network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.864528 
containerd[1540]: time="2025-01-13T20:52:28.864165232Z" level=error msg="encountered an error cleaning up failed sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.865310 containerd[1540]: time="2025-01-13T20:52:28.864708777Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.865472 kubelet[2861]: E0113 20:52:28.865024 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:28.865472 kubelet[2861]: E0113 20:52:28.865062 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:28.865472 kubelet[2861]: E0113 20:52:28.865079 2861 
kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:28.865829 kubelet[2861]: E0113 20:52:28.865126 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-f9v7b" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3" Jan 13 20:52:29.478941 systemd[1]: run-netns-cni\x2d193656ee\x2dd01a\x2d8c7f\x2d2633\x2dc41be06f5909.mount: Deactivated successfully. 
Jan 13 20:52:29.629441 kubelet[2861]: I0113 20:52:29.629391 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3" Jan 13 20:52:29.630666 containerd[1540]: time="2025-01-13T20:52:29.630179170Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" Jan 13 20:52:29.630666 containerd[1540]: time="2025-01-13T20:52:29.630337435Z" level=info msg="Ensure that sandbox 218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3 in task-service has been cleanup successfully" Jan 13 20:52:29.633118 containerd[1540]: time="2025-01-13T20:52:29.631525256Z" level=info msg="TearDown network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" successfully" Jan 13 20:52:29.633118 containerd[1540]: time="2025-01-13T20:52:29.631540709Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" returns successfully" Jan 13 20:52:29.633118 containerd[1540]: time="2025-01-13T20:52:29.631999289Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:29.633118 containerd[1540]: time="2025-01-13T20:52:29.632048431Z" level=info msg="TearDown network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" successfully" Jan 13 20:52:29.633118 containerd[1540]: time="2025-01-13T20:52:29.632077730Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" returns successfully" Jan 13 20:52:29.633118 containerd[1540]: time="2025-01-13T20:52:29.632232644Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:29.633118 containerd[1540]: time="2025-01-13T20:52:29.632284549Z" level=info msg="TearDown network for sandbox 
\"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:29.633118 containerd[1540]: time="2025-01-13T20:52:29.632290880Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:29.632462 systemd[1]: run-netns-cni\x2d622d7203\x2d0653\x2d26d8\x2d5efe\x2df99a48b379a9.mount: Deactivated successfully. Jan 13 20:52:29.647943 kubelet[2861]: I0113 20:52:29.634475 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c" Jan 13 20:52:29.647943 kubelet[2861]: I0113 20:52:29.637117 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5" Jan 13 20:52:29.647943 kubelet[2861]: I0113 20:52:29.642187 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e" Jan 13 20:52:29.647943 kubelet[2861]: I0113 20:52:29.647388 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.633841685Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.633902867Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.633910644Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.634681600Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:4,}" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.636174886Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.636634120Z" level=info msg="Ensure that sandbox ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c in task-service has been cleanup successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.637423364Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.637552920Z" level=info msg="Ensure that sandbox 3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5 in task-service has been cleanup successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.637998560Z" level=info msg="TearDown network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.638005302Z" level=info msg="TearDown network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.638020020Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.638007552Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.639563562Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\"" Jan 13 20:52:29.648041 containerd[1540]: 
time="2025-01-13T20:52:29.639628521Z" level=info msg="TearDown network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.639639149Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.639703000Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.639739507Z" level=info msg="TearDown network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.639747326Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.639965937Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.640118065Z" level=info msg="TearDown network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.640125745Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.639994610Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.640196817Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:29.648041 containerd[1540]: 
time="2025-01-13T20:52:29.640206829Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.640360378Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.640410688Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.640417256Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.640442044Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.641202793Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:4,}" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.641465178Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.641474183Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.642501379Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\"" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.642684487Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:4,}" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.642918382Z" level=info msg="Ensure that sandbox de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e in task-service has been cleanup successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.643482257Z" level=info msg="TearDown network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.643492328Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" returns successfully" Jan 13 20:52:29.648041 containerd[1540]: time="2025-01-13T20:52:29.643944112Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" Jan 13 20:52:29.640540 systemd[1]: run-netns-cni\x2d279c856d\x2db60d\x2d1256\x2d1b3b\x2d16cbbd6eec0f.mount: Deactivated successfully. 
Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.646028377Z" level=info msg="TearDown network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" successfully" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.646041528Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" returns successfully" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.646322685Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.646426521Z" level=info msg="TearDown network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.646437651Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.646573826Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.646644049Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.646652986Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.647026555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.647671845Z" level=info msg="StopPodSandbox for 
\"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\"" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.647887765Z" level=info msg="Ensure that sandbox 63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f in task-service has been cleanup successfully" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.648039562Z" level=info msg="TearDown network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" successfully" Jan 13 20:52:29.648841 containerd[1540]: time="2025-01-13T20:52:29.648051418Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" returns successfully" Jan 13 20:52:29.640603 systemd[1]: run-netns-cni\x2d569c5ed0\x2d3a91\x2df8a0\x2dc896\x2d97fa8aeec645.mount: Deactivated successfully. Jan 13 20:52:29.645467 systemd[1]: run-netns-cni\x2d83820f37\x2dffb8\x2d2fae\x2d5682\x2d96bbbc32b1d3.mount: Deactivated successfully. Jan 13 20:52:29.649852 containerd[1540]: time="2025-01-13T20:52:29.649398471Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" Jan 13 20:52:29.649852 containerd[1540]: time="2025-01-13T20:52:29.649448407Z" level=info msg="TearDown network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" successfully" Jan 13 20:52:29.649852 containerd[1540]: time="2025-01-13T20:52:29.649457903Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" returns successfully" Jan 13 20:52:29.649852 containerd[1540]: time="2025-01-13T20:52:29.649764655Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:29.650090 containerd[1540]: time="2025-01-13T20:52:29.649924653Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully" Jan 13 20:52:29.650090 
containerd[1540]: time="2025-01-13T20:52:29.649934053Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully" Jan 13 20:52:29.650158 kubelet[2861]: I0113 20:52:29.650056 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87" Jan 13 20:52:29.650455 containerd[1540]: time="2025-01-13T20:52:29.650437810Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:29.650735 containerd[1540]: time="2025-01-13T20:52:29.650505640Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully" Jan 13 20:52:29.650735 containerd[1540]: time="2025-01-13T20:52:29.650517125Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:29.650735 containerd[1540]: time="2025-01-13T20:52:29.650692767Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\"" Jan 13 20:52:29.651021 containerd[1540]: time="2025-01-13T20:52:29.650829741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:4,}" Jan 13 20:52:29.651021 containerd[1540]: time="2025-01-13T20:52:29.650933235Z" level=info msg="Ensure that sandbox 9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87 in task-service has been cleanup successfully" Jan 13 20:52:29.651171 containerd[1540]: time="2025-01-13T20:52:29.651157911Z" level=info msg="TearDown network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" successfully" Jan 13 20:52:29.651224 containerd[1540]: time="2025-01-13T20:52:29.651212800Z" level=info msg="StopPodSandbox for 
\"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" returns successfully" Jan 13 20:52:29.651524 containerd[1540]: time="2025-01-13T20:52:29.651507036Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" Jan 13 20:52:29.651576 containerd[1540]: time="2025-01-13T20:52:29.651561851Z" level=info msg="TearDown network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" successfully" Jan 13 20:52:29.651576 containerd[1540]: time="2025-01-13T20:52:29.651574176Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" returns successfully" Jan 13 20:52:29.656103 containerd[1540]: time="2025-01-13T20:52:29.651736181Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:29.656103 containerd[1540]: time="2025-01-13T20:52:29.651817247Z" level=info msg="TearDown network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully" Jan 13 20:52:29.656103 containerd[1540]: time="2025-01-13T20:52:29.651828980Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully" Jan 13 20:52:29.656103 containerd[1540]: time="2025-01-13T20:52:29.654428945Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:29.656103 containerd[1540]: time="2025-01-13T20:52:29.654524693Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:29.656103 containerd[1540]: time="2025-01-13T20:52:29.654548758Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:29.656103 containerd[1540]: time="2025-01-13T20:52:29.655006531Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:52:30.279382 containerd[1540]: time="2025-01-13T20:52:30.279189662Z" level=error msg="Failed to destroy network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.279487 containerd[1540]: time="2025-01-13T20:52:30.279467742Z" level=error msg="encountered an error cleaning up failed sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.279530 containerd[1540]: time="2025-01-13T20:52:30.279513226Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.280306 kubelet[2861]: E0113 20:52:30.279700 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.280306 
kubelet[2861]: E0113 20:52:30.279748 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:30.280306 kubelet[2861]: E0113 20:52:30.279764 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:30.280444 kubelet[2861]: E0113 20:52:30.279807 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:30.324436 containerd[1540]: time="2025-01-13T20:52:30.324397536Z" level=error msg="Failed to destroy network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.324745 containerd[1540]: time="2025-01-13T20:52:30.324706135Z" level=error msg="encountered an error cleaning up failed sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.324805 containerd[1540]: time="2025-01-13T20:52:30.324765417Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.325233 kubelet[2861]: E0113 20:52:30.325108 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.325233 kubelet[2861]: E0113 20:52:30.325159 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:30.325233 kubelet[2861]: E0113 20:52:30.325185 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:30.325472 kubelet[2861]: E0113 20:52:30.325407 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2wvd8" podUID="42cec424-d73c-431a-b548-ae49975c9420" Jan 13 20:52:30.331142 containerd[1540]: time="2025-01-13T20:52:30.331110014Z" level=error msg="Failed to destroy network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.331609 containerd[1540]: time="2025-01-13T20:52:30.331536328Z" level=error msg="encountered an error cleaning up failed sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\", marking sandbox 
state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.331692 containerd[1540]: time="2025-01-13T20:52:30.331578731Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.331847 kubelet[2861]: E0113 20:52:30.331825 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.332080 kubelet[2861]: E0113 20:52:30.331986 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:30.332080 kubelet[2861]: E0113 20:52:30.332005 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:30.332080 kubelet[2861]: E0113 20:52:30.332048 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef" Jan 13 20:52:30.341837 containerd[1540]: time="2025-01-13T20:52:30.341804657Z" level=error msg="Failed to destroy network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.342310 containerd[1540]: time="2025-01-13T20:52:30.342158645Z" level=error msg="encountered an error cleaning up failed sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.342310 containerd[1540]: time="2025-01-13T20:52:30.342197656Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.343018 kubelet[2861]: E0113 20:52:30.342499 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.343018 kubelet[2861]: E0113 20:52:30.342541 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:30.343018 kubelet[2861]: E0113 20:52:30.342563 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:30.343152 kubelet[2861]: E0113 
20:52:30.342615 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podUID="b067aee8-97d0-47ca-9359-80c070636930" Jan 13 20:52:30.343226 containerd[1540]: time="2025-01-13T20:52:30.343053822Z" level=error msg="Failed to destroy network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.343264 containerd[1540]: time="2025-01-13T20:52:30.343221420Z" level=error msg="encountered an error cleaning up failed sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.343264 containerd[1540]: time="2025-01-13T20:52:30.343256676Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox 
\"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.343587 kubelet[2861]: E0113 20:52:30.343455 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.343587 kubelet[2861]: E0113 20:52:30.343488 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:30.343587 kubelet[2861]: E0113 20:52:30.343509 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:30.343720 kubelet[2861]: E0113 20:52:30.343561 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966" Jan 13 20:52:30.343924 containerd[1540]: time="2025-01-13T20:52:30.343898931Z" level=error msg="Failed to destroy network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.344456 containerd[1540]: time="2025-01-13T20:52:30.344385810Z" level=error msg="encountered an error cleaning up failed sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.344456 containerd[1540]: time="2025-01-13T20:52:30.344417091Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.344572 kubelet[2861]: E0113 20:52:30.344557 2861 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:30.344616 kubelet[2861]: E0113 20:52:30.344583 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:30.344616 kubelet[2861]: E0113 20:52:30.344596 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:30.344678 kubelet[2861]: E0113 20:52:30.344621 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-f9v7b" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3" Jan 13 20:52:30.479537 systemd[1]: run-netns-cni\x2dfdf6cb93\x2d638b\x2da1e3\x2db6b5\x2d7966113a2265.mount: Deactivated successfully. Jan 13 20:52:30.479773 systemd[1]: run-netns-cni\x2d000e4112\x2dd826\x2d88f5\x2d8851\x2d86aa0cc0cf20.mount: Deactivated successfully. Jan 13 20:52:30.653237 kubelet[2861]: I0113 20:52:30.653198 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e" Jan 13 20:52:30.654317 containerd[1540]: time="2025-01-13T20:52:30.654162124Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\"" Jan 13 20:52:30.654317 containerd[1540]: time="2025-01-13T20:52:30.654305999Z" level=info msg="Ensure that sandbox 56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e in task-service has been cleanup successfully" Jan 13 20:52:30.656702 containerd[1540]: time="2025-01-13T20:52:30.654599162Z" level=info msg="TearDown network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" successfully" Jan 13 20:52:30.656702 containerd[1540]: time="2025-01-13T20:52:30.654610593Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" returns successfully" Jan 13 20:52:30.656702 containerd[1540]: time="2025-01-13T20:52:30.656557050Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\"" Jan 13 20:52:30.656702 containerd[1540]: time="2025-01-13T20:52:30.656610532Z" level=info msg="TearDown network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" successfully" Jan 13 20:52:30.656702 containerd[1540]: time="2025-01-13T20:52:30.656617884Z" level=info msg="StopPodSandbox for 
\"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" returns successfully" Jan 13 20:52:30.656168 systemd[1]: run-netns-cni\x2df5268360\x2df414\x2d7596\x2de7c2\x2d44625a55f1e0.mount: Deactivated successfully. Jan 13 20:52:30.659398 containerd[1540]: time="2025-01-13T20:52:30.659103629Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" Jan 13 20:52:30.659786 containerd[1540]: time="2025-01-13T20:52:30.659639292Z" level=info msg="TearDown network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" successfully" Jan 13 20:52:30.660114 containerd[1540]: time="2025-01-13T20:52:30.659836532Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" returns successfully" Jan 13 20:52:30.661471 containerd[1540]: time="2025-01-13T20:52:30.661204203Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:30.661471 containerd[1540]: time="2025-01-13T20:52:30.661263867Z" level=info msg="TearDown network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully" Jan 13 20:52:30.661471 containerd[1540]: time="2025-01-13T20:52:30.661275070Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully" Jan 13 20:52:30.662069 containerd[1540]: time="2025-01-13T20:52:30.662056435Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:30.662157 containerd[1540]: time="2025-01-13T20:52:30.662148133Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:30.662317 containerd[1540]: time="2025-01-13T20:52:30.662305618Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns 
successfully" Jan 13 20:52:30.662751 containerd[1540]: time="2025-01-13T20:52:30.662739861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:52:30.668685 kubelet[2861]: I0113 20:52:30.668653 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461" Jan 13 20:52:30.669737 containerd[1540]: time="2025-01-13T20:52:30.669712811Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\"" Jan 13 20:52:30.669849 containerd[1540]: time="2025-01-13T20:52:30.669831266Z" level=info msg="Ensure that sandbox 0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461 in task-service has been cleanup successfully" Jan 13 20:52:30.670143 containerd[1540]: time="2025-01-13T20:52:30.669992029Z" level=info msg="TearDown network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" successfully" Jan 13 20:52:30.670143 containerd[1540]: time="2025-01-13T20:52:30.670002140Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" returns successfully" Jan 13 20:52:30.671854 containerd[1540]: time="2025-01-13T20:52:30.670421885Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\"" Jan 13 20:52:30.671854 containerd[1540]: time="2025-01-13T20:52:30.670465166Z" level=info msg="TearDown network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" successfully" Jan 13 20:52:30.671854 containerd[1540]: time="2025-01-13T20:52:30.670472707Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" returns successfully" Jan 13 20:52:30.672283 systemd[1]: 
run-netns-cni\x2d18647be8\x2de1da\x2dd68c\x2dca5f\x2d0b401a823541.mount: Deactivated successfully. Jan 13 20:52:30.673227 containerd[1540]: time="2025-01-13T20:52:30.673212879Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" Jan 13 20:52:30.673274 containerd[1540]: time="2025-01-13T20:52:30.673266818Z" level=info msg="TearDown network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" successfully" Jan 13 20:52:30.673274 containerd[1540]: time="2025-01-13T20:52:30.673273682Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" returns successfully" Jan 13 20:52:30.673654 containerd[1540]: time="2025-01-13T20:52:30.673578725Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:30.673654 containerd[1540]: time="2025-01-13T20:52:30.673614917Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully" Jan 13 20:52:30.673654 containerd[1540]: time="2025-01-13T20:52:30.673621090Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully" Jan 13 20:52:30.675429 containerd[1540]: time="2025-01-13T20:52:30.675411857Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:30.675487 containerd[1540]: time="2025-01-13T20:52:30.675455647Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully" Jan 13 20:52:30.675487 containerd[1540]: time="2025-01-13T20:52:30.675483657Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:30.675970 containerd[1540]: time="2025-01-13T20:52:30.675955898Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:5,}" Jan 13 20:52:30.676746 kubelet[2861]: I0113 20:52:30.676737 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797" Jan 13 20:52:30.677416 containerd[1540]: time="2025-01-13T20:52:30.677400052Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\"" Jan 13 20:52:30.678103 containerd[1540]: time="2025-01-13T20:52:30.678071922Z" level=info msg="Ensure that sandbox adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797 in task-service has been cleanup successfully" Jan 13 20:52:30.678197 containerd[1540]: time="2025-01-13T20:52:30.678183285Z" level=info msg="TearDown network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" successfully" Jan 13 20:52:30.678197 containerd[1540]: time="2025-01-13T20:52:30.678193773Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" returns successfully" Jan 13 20:52:30.678366 containerd[1540]: time="2025-01-13T20:52:30.678324204Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\"" Jan 13 20:52:30.678398 containerd[1540]: time="2025-01-13T20:52:30.678373346Z" level=info msg="TearDown network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" successfully" Jan 13 20:52:30.678398 containerd[1540]: time="2025-01-13T20:52:30.678380758Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" returns successfully" Jan 13 20:52:30.681126 containerd[1540]: time="2025-01-13T20:52:30.679584136Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" Jan 13 20:52:30.681126 
containerd[1540]: time="2025-01-13T20:52:30.679625870Z" level=info msg="TearDown network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" successfully" Jan 13 20:52:30.681126 containerd[1540]: time="2025-01-13T20:52:30.679632235Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" returns successfully" Jan 13 20:52:30.680703 systemd[1]: run-netns-cni\x2de0d8d62e\x2db87a\x2d4051\x2de3d4\x2ded95de7714b1.mount: Deactivated successfully. Jan 13 20:52:30.682011 containerd[1540]: time="2025-01-13T20:52:30.681735396Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:30.682011 containerd[1540]: time="2025-01-13T20:52:30.681788075Z" level=info msg="TearDown network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully" Jan 13 20:52:30.682011 containerd[1540]: time="2025-01-13T20:52:30.681794819Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully" Jan 13 20:52:30.682103 containerd[1540]: time="2025-01-13T20:52:30.682081014Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:30.682129 containerd[1540]: time="2025-01-13T20:52:30.682120176Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:30.682129 containerd[1540]: time="2025-01-13T20:52:30.682125781Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:30.682557 kubelet[2861]: I0113 20:52:30.682299 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a" Jan 13 20:52:30.682597 containerd[1540]: 
time="2025-01-13T20:52:30.682583828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:52:30.683543 containerd[1540]: time="2025-01-13T20:52:30.683526252Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\"" Jan 13 20:52:30.683696 containerd[1540]: time="2025-01-13T20:52:30.683672006Z" level=info msg="Ensure that sandbox 2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a in task-service has been cleanup successfully" Jan 13 20:52:30.685612 containerd[1540]: time="2025-01-13T20:52:30.685594826Z" level=info msg="TearDown network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" successfully" Jan 13 20:52:30.685612 containerd[1540]: time="2025-01-13T20:52:30.685608453Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" returns successfully" Jan 13 20:52:30.686154 containerd[1540]: time="2025-01-13T20:52:30.685791582Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" Jan 13 20:52:30.686154 containerd[1540]: time="2025-01-13T20:52:30.685831757Z" level=info msg="TearDown network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" successfully" Jan 13 20:52:30.686154 containerd[1540]: time="2025-01-13T20:52:30.685840272Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" returns successfully" Jan 13 20:52:30.685891 systemd[1]: run-netns-cni\x2d0e5b0532\x2db7f4\x2dfdb1\x2dd261\x2d67847fd3126a.mount: Deactivated successfully. 
Jan 13 20:52:30.686809 containerd[1540]: time="2025-01-13T20:52:30.686378953Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:30.686809 containerd[1540]: time="2025-01-13T20:52:30.686428473Z" level=info msg="TearDown network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" successfully" Jan 13 20:52:30.686809 containerd[1540]: time="2025-01-13T20:52:30.686439155Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" returns successfully" Jan 13 20:52:30.688037 containerd[1540]: time="2025-01-13T20:52:30.687879013Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:30.688037 containerd[1540]: time="2025-01-13T20:52:30.687923571Z" level=info msg="TearDown network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:30.688037 containerd[1540]: time="2025-01-13T20:52:30.687931019Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:30.688848 containerd[1540]: time="2025-01-13T20:52:30.688643092Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:30.688848 containerd[1540]: time="2025-01-13T20:52:30.688788657Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:30.688848 containerd[1540]: time="2025-01-13T20:52:30.688817916Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:30.690247 kubelet[2861]: I0113 20:52:30.690034 2861 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212" Jan 13 20:52:30.690343 containerd[1540]: time="2025-01-13T20:52:30.690326404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:5,}" Jan 13 20:52:30.690779 containerd[1540]: time="2025-01-13T20:52:30.690767400Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\"" Jan 13 20:52:30.691907 containerd[1540]: time="2025-01-13T20:52:30.691525263Z" level=info msg="Ensure that sandbox ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212 in task-service has been cleanup successfully" Jan 13 20:52:30.692169 containerd[1540]: time="2025-01-13T20:52:30.692098684Z" level=info msg="TearDown network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" successfully" Jan 13 20:52:30.692169 containerd[1540]: time="2025-01-13T20:52:30.692128289Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" returns successfully" Jan 13 20:52:30.694822 containerd[1540]: time="2025-01-13T20:52:30.694665066Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" Jan 13 20:52:30.694822 containerd[1540]: time="2025-01-13T20:52:30.694722014Z" level=info msg="TearDown network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" successfully" Jan 13 20:52:30.694822 containerd[1540]: time="2025-01-13T20:52:30.694728484Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" returns successfully" Jan 13 20:52:30.695413 containerd[1540]: time="2025-01-13T20:52:30.695340024Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:30.695963 containerd[1540]: time="2025-01-13T20:52:30.695949699Z" 
level=info msg="TearDown network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" successfully" Jan 13 20:52:30.696371 containerd[1540]: time="2025-01-13T20:52:30.696307618Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" returns successfully" Jan 13 20:52:30.696525 containerd[1540]: time="2025-01-13T20:52:30.696508423Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:30.696798 containerd[1540]: time="2025-01-13T20:52:30.696759251Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:30.696798 containerd[1540]: time="2025-01-13T20:52:30.696769239Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:30.697024 containerd[1540]: time="2025-01-13T20:52:30.696952734Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:30.697024 containerd[1540]: time="2025-01-13T20:52:30.696997617Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:30.697024 containerd[1540]: time="2025-01-13T20:52:30.697004148Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:30.697189 kubelet[2861]: I0113 20:52:30.697174 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff" Jan 13 20:52:30.697544 containerd[1540]: time="2025-01-13T20:52:30.697341496Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:5,}" Jan 13 
20:52:30.697751 containerd[1540]: time="2025-01-13T20:52:30.697738395Z" level=info msg="StopPodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\"" Jan 13 20:52:30.697876 containerd[1540]: time="2025-01-13T20:52:30.697859523Z" level=info msg="Ensure that sandbox 86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff in task-service has been cleanup successfully" Jan 13 20:52:30.698158 containerd[1540]: time="2025-01-13T20:52:30.698142626Z" level=info msg="TearDown network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" successfully" Jan 13 20:52:30.698158 containerd[1540]: time="2025-01-13T20:52:30.698154160Z" level=info msg="StopPodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" returns successfully" Jan 13 20:52:30.698757 containerd[1540]: time="2025-01-13T20:52:30.698743527Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\"" Jan 13 20:52:30.698861 containerd[1540]: time="2025-01-13T20:52:30.698850926Z" level=info msg="TearDown network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" successfully" Jan 13 20:52:30.699494 containerd[1540]: time="2025-01-13T20:52:30.699484174Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" returns successfully" Jan 13 20:52:30.699830 containerd[1540]: time="2025-01-13T20:52:30.699730898Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\"" Jan 13 20:52:30.699830 containerd[1540]: time="2025-01-13T20:52:30.699773010Z" level=info msg="TearDown network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" successfully" Jan 13 20:52:30.699830 containerd[1540]: time="2025-01-13T20:52:30.699779085Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" returns 
successfully" Jan 13 20:52:30.699978 containerd[1540]: time="2025-01-13T20:52:30.699968767Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" Jan 13 20:52:30.700079 containerd[1540]: time="2025-01-13T20:52:30.700037487Z" level=info msg="TearDown network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" successfully" Jan 13 20:52:30.700079 containerd[1540]: time="2025-01-13T20:52:30.700047167Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" returns successfully" Jan 13 20:52:30.700240 containerd[1540]: time="2025-01-13T20:52:30.700165011Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:30.700240 containerd[1540]: time="2025-01-13T20:52:30.700201483Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully" Jan 13 20:52:30.700240 containerd[1540]: time="2025-01-13T20:52:30.700207199Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully" Jan 13 20:52:30.700574 containerd[1540]: time="2025-01-13T20:52:30.700563953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:5,}" Jan 13 20:52:31.479829 systemd[1]: run-netns-cni\x2dbd6cdba3\x2db3c8\x2dd22b\x2d1c99\x2df72942ea1e82.mount: Deactivated successfully. Jan 13 20:52:31.480471 systemd[1]: run-netns-cni\x2d6db694e1\x2dc026\x2d3f97\x2d475a\x2da8fed9f63fd2.mount: Deactivated successfully. 
Jan 13 20:52:31.876547 containerd[1540]: time="2025-01-13T20:52:31.876398475Z" level=error msg="Failed to destroy network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:31.877538 containerd[1540]: time="2025-01-13T20:52:31.877065502Z" level=error msg="encountered an error cleaning up failed sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:31.877538 containerd[1540]: time="2025-01-13T20:52:31.877110681Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:31.877653 kubelet[2861]: E0113 20:52:31.877274 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:31.877653 kubelet[2861]: E0113 20:52:31.877313 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:31.877653 kubelet[2861]: E0113 20:52:31.877329 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:31.877874 kubelet[2861]: E0113 20:52:31.877395 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966" Jan 13 20:52:31.979508 containerd[1540]: time="2025-01-13T20:52:31.979426989Z" level=error msg="Failed to destroy network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:31.979682 containerd[1540]: time="2025-01-13T20:52:31.979663244Z" level=error msg="encountered an error cleaning up failed sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:31.979719 containerd[1540]: time="2025-01-13T20:52:31.979706477Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:31.979861 kubelet[2861]: E0113 20:52:31.979842 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:31.991101 kubelet[2861]: E0113 20:52:31.979881 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:31.991101 kubelet[2861]: E0113 20:52:31.979897 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:31.991101 kubelet[2861]: E0113 20:52:31.979935 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podUID="b067aee8-97d0-47ca-9359-80c070636930" Jan 13 20:52:32.080343 containerd[1540]: time="2025-01-13T20:52:32.080314750Z" level=error msg="Failed to destroy network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.080729 containerd[1540]: time="2025-01-13T20:52:32.080714511Z" level=error msg="encountered an error cleaning up failed sandbox 
\"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.080813 containerd[1540]: time="2025-01-13T20:52:32.080800534Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.081819 kubelet[2861]: E0113 20:52:32.081799 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.081869 kubelet[2861]: E0113 20:52:32.081837 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:32.081869 kubelet[2861]: E0113 20:52:32.081855 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:32.081928 kubelet[2861]: E0113 20:52:32.081890 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef" Jan 13 20:52:32.159055 containerd[1540]: time="2025-01-13T20:52:32.159029279Z" level=error msg="Failed to destroy network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.159602 containerd[1540]: time="2025-01-13T20:52:32.159319389Z" level=error msg="encountered an error cleaning up failed sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 
20:52:32.159602 containerd[1540]: time="2025-01-13T20:52:32.159366913Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.159663 kubelet[2861]: E0113 20:52:32.159517 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.159663 kubelet[2861]: E0113 20:52:32.159549 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:32.159663 kubelet[2861]: E0113 20:52:32.159564 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:32.160139 
kubelet[2861]: E0113 20:52:32.159595 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:32.332300 containerd[1540]: time="2025-01-13T20:52:32.332265720Z" level=error msg="Failed to destroy network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.332516 containerd[1540]: time="2025-01-13T20:52:32.332487563Z" level=error msg="encountered an error cleaning up failed sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.332551 containerd[1540]: time="2025-01-13T20:52:32.332537281Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.332876 kubelet[2861]: E0113 20:52:32.332685 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.332876 kubelet[2861]: E0113 20:52:32.332720 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:32.332876 kubelet[2861]: E0113 20:52:32.332733 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:32.332957 kubelet[2861]: E0113 20:52:32.332766 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\\\": rpc error: code = Unknown 
desc = failed to setup network for sandbox \\\"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-f9v7b" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3" Jan 13 20:52:32.481096 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e-shm.mount: Deactivated successfully. Jan 13 20:52:32.481496 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383-shm.mount: Deactivated successfully. Jan 13 20:52:32.481562 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865-shm.mount: Deactivated successfully. Jan 13 20:52:32.972711 containerd[1540]: time="2025-01-13T20:52:32.972676485Z" level=error msg="Failed to destroy network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.974822 containerd[1540]: time="2025-01-13T20:52:32.974542983Z" level=error msg="encountered an error cleaning up failed sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.974822 containerd[1540]: time="2025-01-13T20:52:32.974592508Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.974449 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55-shm.mount: Deactivated successfully. Jan 13 20:52:32.978247 kubelet[2861]: E0113 20:52:32.978186 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:32.978247 kubelet[2861]: E0113 20:52:32.978229 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:33.023744 kubelet[2861]: E0113 20:52:32.978254 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8" Jan 13 20:52:33.023744 kubelet[2861]: E0113 20:52:32.978307 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2wvd8" podUID="42cec424-d73c-431a-b548-ae49975c9420" Jan 13 20:52:33.205897 kubelet[2861]: I0113 20:52:33.205808 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e" Jan 13 20:52:33.220463 containerd[1540]: time="2025-01-13T20:52:33.220176417Z" level=info msg="StopPodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\"" Jan 13 20:52:33.220463 containerd[1540]: time="2025-01-13T20:52:33.220451328Z" level=info msg="Ensure that sandbox c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e in task-service has been cleanup successfully" Jan 13 20:52:33.224042 systemd[1]: run-netns-cni\x2dbb568da1\x2d408f\x2d4426\x2d038a\x2db66de4f20666.mount: Deactivated successfully. 
Jan 13 20:52:33.256159 containerd[1540]: time="2025-01-13T20:52:33.256124352Z" level=info msg="TearDown network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" successfully" Jan 13 20:52:33.256159 containerd[1540]: time="2025-01-13T20:52:33.256149980Z" level=info msg="StopPodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" returns successfully" Jan 13 20:52:33.338483 containerd[1540]: time="2025-01-13T20:52:33.337864862Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\"" Jan 13 20:52:33.338483 containerd[1540]: time="2025-01-13T20:52:33.337982358Z" level=info msg="TearDown network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" successfully" Jan 13 20:52:33.338483 containerd[1540]: time="2025-01-13T20:52:33.338048993Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" returns successfully" Jan 13 20:52:33.412268 containerd[1540]: time="2025-01-13T20:52:33.412236554Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\"" Jan 13 20:52:33.412579 containerd[1540]: time="2025-01-13T20:52:33.412567583Z" level=info msg="TearDown network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" successfully" Jan 13 20:52:33.412727 containerd[1540]: time="2025-01-13T20:52:33.412710589Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" returns successfully" Jan 13 20:52:33.484637 containerd[1540]: time="2025-01-13T20:52:33.484561199Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" Jan 13 20:52:33.485221 containerd[1540]: time="2025-01-13T20:52:33.485141003Z" level=info msg="TearDown network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" successfully" Jan 
13 20:52:33.487820 containerd[1540]: time="2025-01-13T20:52:33.485264207Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" returns successfully" Jan 13 20:52:33.531241 containerd[1540]: time="2025-01-13T20:52:33.530429350Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:33.531241 containerd[1540]: time="2025-01-13T20:52:33.530491438Z" level=info msg="TearDown network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully" Jan 13 20:52:33.531241 containerd[1540]: time="2025-01-13T20:52:33.530498239Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully" Jan 13 20:52:33.553820 containerd[1540]: time="2025-01-13T20:52:33.553622727Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:33.553820 containerd[1540]: time="2025-01-13T20:52:33.553722072Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:33.553820 containerd[1540]: time="2025-01-13T20:52:33.553730359Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:33.554532 kubelet[2861]: I0113 20:52:33.554198 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c" Jan 13 20:52:33.569748 containerd[1540]: time="2025-01-13T20:52:33.569454319Z" level=info msg="StopPodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\"" Jan 13 20:52:33.578266 containerd[1540]: time="2025-01-13T20:52:33.578188071Z" level=info msg="Ensure that sandbox 90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c in task-service has been 
cleanup successfully" Jan 13 20:52:33.578739 containerd[1540]: time="2025-01-13T20:52:33.578593855Z" level=info msg="TearDown network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" successfully" Jan 13 20:52:33.578876 containerd[1540]: time="2025-01-13T20:52:33.578798468Z" level=info msg="StopPodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" returns successfully" Jan 13 20:52:33.581739 containerd[1540]: time="2025-01-13T20:52:33.580925402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:52:33.582536 containerd[1540]: time="2025-01-13T20:52:33.582521218Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\"" Jan 13 20:52:33.582779 containerd[1540]: time="2025-01-13T20:52:33.582702597Z" level=info msg="TearDown network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" successfully" Jan 13 20:52:33.582880 containerd[1540]: time="2025-01-13T20:52:33.582829017Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" returns successfully" Jan 13 20:52:33.583402 systemd[1]: run-netns-cni\x2d20866d09\x2d0cb4\x2d5dab\x2d3fb9\x2d5a257313cb96.mount: Deactivated successfully. 
Jan 13 20:52:33.595191 containerd[1540]: time="2025-01-13T20:52:33.594958103Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" Jan 13 20:52:33.595191 containerd[1540]: time="2025-01-13T20:52:33.595036051Z" level=info msg="TearDown network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" successfully" Jan 13 20:52:33.595191 containerd[1540]: time="2025-01-13T20:52:33.595044461Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" returns successfully" Jan 13 20:52:33.620536 containerd[1540]: time="2025-01-13T20:52:33.620504985Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:33.620722 containerd[1540]: time="2025-01-13T20:52:33.620657499Z" level=info msg="TearDown network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" successfully" Jan 13 20:52:33.620722 containerd[1540]: time="2025-01-13T20:52:33.620666807Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" returns successfully" Jan 13 20:52:33.655577 containerd[1540]: time="2025-01-13T20:52:33.655365071Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:33.655577 containerd[1540]: time="2025-01-13T20:52:33.655427470Z" level=info msg="TearDown network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:33.655577 containerd[1540]: time="2025-01-13T20:52:33.655434435Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:33.673094 containerd[1540]: time="2025-01-13T20:52:33.673066057Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:33.674415 
containerd[1540]: time="2025-01-13T20:52:33.674359261Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:33.674415 containerd[1540]: time="2025-01-13T20:52:33.674371159Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:33.703785 containerd[1540]: time="2025-01-13T20:52:33.703592399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:6,}" Jan 13 20:52:33.757981 kubelet[2861]: I0113 20:52:33.757920 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22" Jan 13 20:52:33.769793 containerd[1540]: time="2025-01-13T20:52:33.769733103Z" level=info msg="StopPodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\"" Jan 13 20:52:33.774808 containerd[1540]: time="2025-01-13T20:52:33.769876211Z" level=info msg="Ensure that sandbox 6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22 in task-service has been cleanup successfully" Jan 13 20:52:33.774808 containerd[1540]: time="2025-01-13T20:52:33.770105441Z" level=info msg="TearDown network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" successfully" Jan 13 20:52:33.774808 containerd[1540]: time="2025-01-13T20:52:33.770114344Z" level=info msg="StopPodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" returns successfully" Jan 13 20:52:33.771485 systemd[1]: run-netns-cni\x2d27df9a26\x2d422f\x2db242\x2d52fd\x2d590a35bc17ae.mount: Deactivated successfully. 
Jan 13 20:52:33.799154 containerd[1540]: time="2025-01-13T20:52:33.799122563Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\"" Jan 13 20:52:33.799242 containerd[1540]: time="2025-01-13T20:52:33.799202451Z" level=info msg="TearDown network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" successfully" Jan 13 20:52:33.799242 containerd[1540]: time="2025-01-13T20:52:33.799213171Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" returns successfully" Jan 13 20:52:33.823433 containerd[1540]: time="2025-01-13T20:52:33.823264507Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" Jan 13 20:52:33.823748 containerd[1540]: time="2025-01-13T20:52:33.823673499Z" level=info msg="TearDown network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" successfully" Jan 13 20:52:33.823748 containerd[1540]: time="2025-01-13T20:52:33.823690297Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" returns successfully" Jan 13 20:52:33.824192 containerd[1540]: time="2025-01-13T20:52:33.824150783Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:33.824306 containerd[1540]: time="2025-01-13T20:52:33.824239351Z" level=info msg="TearDown network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" successfully" Jan 13 20:52:33.824306 containerd[1540]: time="2025-01-13T20:52:33.824247711Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" returns successfully" Jan 13 20:52:33.824706 containerd[1540]: time="2025-01-13T20:52:33.824690678Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:33.824794 
containerd[1540]: time="2025-01-13T20:52:33.824785675Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:33.824906 containerd[1540]: time="2025-01-13T20:52:33.824829083Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:33.825254 containerd[1540]: time="2025-01-13T20:52:33.825243955Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:33.825448 containerd[1540]: time="2025-01-13T20:52:33.825300670Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:33.825448 containerd[1540]: time="2025-01-13T20:52:33.825307098Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:33.825745 containerd[1540]: time="2025-01-13T20:52:33.825733809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:6,}" Jan 13 20:52:33.826164 kubelet[2861]: I0113 20:52:33.826026 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865" Jan 13 20:52:33.827275 containerd[1540]: time="2025-01-13T20:52:33.827085379Z" level=info msg="StopPodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\"" Jan 13 20:52:33.827275 containerd[1540]: time="2025-01-13T20:52:33.827196179Z" level=info msg="Ensure that sandbox b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865 in task-service has been cleanup successfully" Jan 13 20:52:33.829005 systemd[1]: run-netns-cni\x2d093f4f19\x2d94ee\x2da656\x2dcac0\x2dc5b8a820799b.mount: Deactivated successfully. 
Jan 13 20:52:33.830265 containerd[1540]: time="2025-01-13T20:52:33.829974868Z" level=info msg="TearDown network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" successfully" Jan 13 20:52:33.830265 containerd[1540]: time="2025-01-13T20:52:33.829989885Z" level=info msg="StopPodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" returns successfully" Jan 13 20:52:33.830862 containerd[1540]: time="2025-01-13T20:52:33.830756980Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\"" Jan 13 20:52:33.830862 containerd[1540]: time="2025-01-13T20:52:33.830813603Z" level=info msg="TearDown network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" successfully" Jan 13 20:52:33.830862 containerd[1540]: time="2025-01-13T20:52:33.830820507Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" returns successfully" Jan 13 20:52:33.831063 containerd[1540]: time="2025-01-13T20:52:33.831053795Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\"" Jan 13 20:52:33.831209 containerd[1540]: time="2025-01-13T20:52:33.831176377Z" level=info msg="TearDown network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" successfully" Jan 13 20:52:33.831209 containerd[1540]: time="2025-01-13T20:52:33.831205643Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" returns successfully" Jan 13 20:52:33.831435 containerd[1540]: time="2025-01-13T20:52:33.831337555Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" Jan 13 20:52:33.831626 containerd[1540]: time="2025-01-13T20:52:33.831608268Z" level=info msg="TearDown network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" successfully" Jan 
13 20:52:33.831653 containerd[1540]: time="2025-01-13T20:52:33.831623518Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" returns successfully" Jan 13 20:52:33.831980 containerd[1540]: time="2025-01-13T20:52:33.831969930Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:33.832143 containerd[1540]: time="2025-01-13T20:52:33.832129124Z" level=info msg="TearDown network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully" Jan 13 20:52:33.832143 containerd[1540]: time="2025-01-13T20:52:33.832139590Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully" Jan 13 20:52:33.832336 containerd[1540]: time="2025-01-13T20:52:33.832282836Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:33.832336 containerd[1540]: time="2025-01-13T20:52:33.832321984Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:33.832336 containerd[1540]: time="2025-01-13T20:52:33.832327947Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully" Jan 13 20:52:33.832797 containerd[1540]: time="2025-01-13T20:52:33.832771659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:52:33.858963 kubelet[2861]: I0113 20:52:33.858769 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383" Jan 13 20:52:33.859642 containerd[1540]: time="2025-01-13T20:52:33.859592149Z" level=info msg="StopPodSandbox for 
\"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\"" Jan 13 20:52:33.872217 containerd[1540]: time="2025-01-13T20:52:33.871008179Z" level=info msg="Ensure that sandbox 59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383 in task-service has been cleanup successfully" Jan 13 20:52:33.872885 containerd[1540]: time="2025-01-13T20:52:33.872815114Z" level=info msg="TearDown network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" successfully" Jan 13 20:52:33.872885 containerd[1540]: time="2025-01-13T20:52:33.872830447Z" level=info msg="StopPodSandbox for \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" returns successfully" Jan 13 20:52:33.873208 containerd[1540]: time="2025-01-13T20:52:33.873195657Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\"" Jan 13 20:52:33.873911 containerd[1540]: time="2025-01-13T20:52:33.873319277Z" level=info msg="TearDown network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" successfully" Jan 13 20:52:33.873687 systemd[1]: run-netns-cni\x2d4be7c091\x2de713\x2df0e0\x2d4ca4\x2d099fdc312d4f.mount: Deactivated successfully. 
Jan 13 20:52:33.874160 containerd[1540]: time="2025-01-13T20:52:33.873331217Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" returns successfully" Jan 13 20:52:33.874318 containerd[1540]: time="2025-01-13T20:52:33.874217621Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\"" Jan 13 20:52:33.881609 containerd[1540]: time="2025-01-13T20:52:33.881584319Z" level=info msg="TearDown network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" successfully" Jan 13 20:52:33.881790 containerd[1540]: time="2025-01-13T20:52:33.881720142Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" returns successfully" Jan 13 20:52:33.883083 containerd[1540]: time="2025-01-13T20:52:33.882963746Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" Jan 13 20:52:33.883083 containerd[1540]: time="2025-01-13T20:52:33.883034413Z" level=info msg="TearDown network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" successfully" Jan 13 20:52:33.883083 containerd[1540]: time="2025-01-13T20:52:33.883043328Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" returns successfully" Jan 13 20:52:33.883472 containerd[1540]: time="2025-01-13T20:52:33.883312522Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:33.883472 containerd[1540]: time="2025-01-13T20:52:33.883371372Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully" Jan 13 20:52:33.883472 containerd[1540]: time="2025-01-13T20:52:33.883381248Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully" Jan 13 
20:52:33.883589 containerd[1540]: time="2025-01-13T20:52:33.883579241Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:33.883662 containerd[1540]: time="2025-01-13T20:52:33.883653397Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully" Jan 13 20:52:33.883759 containerd[1540]: time="2025-01-13T20:52:33.883728309Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:33.884499 containerd[1540]: time="2025-01-13T20:52:33.884488677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:6,}" Jan 13 20:52:34.358072 containerd[1540]: time="2025-01-13T20:52:34.354648174Z" level=error msg="Failed to destroy network for sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.358323 containerd[1540]: time="2025-01-13T20:52:34.358281718Z" level=error msg="encountered an error cleaning up failed sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.358378 containerd[1540]: time="2025-01-13T20:52:34.358343307Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox 
\"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.358659 kubelet[2861]: E0113 20:52:34.358645 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.359053 kubelet[2861]: E0113 20:52:34.358832 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:34.359053 kubelet[2861]: E0113 20:52:34.358859 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" Jan 13 20:52:34.359053 kubelet[2861]: E0113 20:52:34.358916 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef" Jan 13 20:52:34.419224 containerd[1540]: time="2025-01-13T20:52:34.419126271Z" level=error msg="Failed to destroy network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.419466 containerd[1540]: time="2025-01-13T20:52:34.419450466Z" level=error msg="encountered an error cleaning up failed sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.424447 containerd[1540]: time="2025-01-13T20:52:34.419616543Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.424490 kubelet[2861]: E0113 20:52:34.419762 2861 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.424490 kubelet[2861]: E0113 20:52:34.419803 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:34.424490 kubelet[2861]: E0113 20:52:34.419821 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc" Jan 13 20:52:34.424560 kubelet[2861]: E0113 20:52:34.419865 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b" Jan 13 20:52:34.502144 containerd[1540]: time="2025-01-13T20:52:34.502109608Z" level=error msg="Failed to destroy network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.502422 containerd[1540]: time="2025-01-13T20:52:34.502300612Z" level=error msg="encountered an error cleaning up failed sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.502422 containerd[1540]: time="2025-01-13T20:52:34.502335244Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.503268 kubelet[2861]: E0113 20:52:34.503235 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
Jan 13 20:52:34.503310 kubelet[2861]: E0113 20:52:34.503275 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:34.503795 kubelet[2861]: E0113 20:52:34.503590 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b" Jan 13 20:52:34.503795 kubelet[2861]: E0113 20:52:34.503635 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-f9v7b" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3" Jan 13 20:52:34.518824 containerd[1540]: time="2025-01-13T20:52:34.518743682Z" level=error msg="Failed to destroy network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.519125 containerd[1540]: time="2025-01-13T20:52:34.519033087Z" level=error msg="encountered an error cleaning up failed sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.519125 containerd[1540]: time="2025-01-13T20:52:34.519066687Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.519228 kubelet[2861]: E0113 20:52:34.519211 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.519267 kubelet[2861]: E0113 20:52:34.519245 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:34.519267 kubelet[2861]: E0113 20:52:34.519258 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" Jan 13 20:52:34.519305 kubelet[2861]: E0113 20:52:34.519294 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966" Jan 13 20:52:34.559262 containerd[1540]: time="2025-01-13T20:52:34.559087546Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 13 20:52:34.576342 containerd[1540]: time="2025-01-13T20:52:34.576314051Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:34.586571 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd-shm.mount: Deactivated successfully. Jan 13 20:52:34.586638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount806298925.mount: Deactivated successfully. Jan 13 20:52:34.593763 containerd[1540]: time="2025-01-13T20:52:34.593727377Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 9.231470593s" Jan 13 20:52:34.593763 containerd[1540]: time="2025-01-13T20:52:34.593767432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 13 20:52:34.595011 containerd[1540]: time="2025-01-13T20:52:34.594666633Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:34.596554 containerd[1540]: time="2025-01-13T20:52:34.596532984Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:34.614591 containerd[1540]: time="2025-01-13T20:52:34.614521906Z" level=error msg="Failed to destroy network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.614830 containerd[1540]: time="2025-01-13T20:52:34.614756729Z" level=error 
msg="encountered an error cleaning up failed sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.614830 containerd[1540]: time="2025-01-13T20:52:34.614797277Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.615230 kubelet[2861]: E0113 20:52:34.614982 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:52:34.615230 kubelet[2861]: E0113 20:52:34.615024 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:34.615230 kubelet[2861]: E0113 20:52:34.615038 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" Jan 13 20:52:34.615328 kubelet[2861]: E0113 20:52:34.615080 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podUID="b067aee8-97d0-47ca-9359-80c070636930" Jan 13 20:52:34.620680 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82-shm.mount: Deactivated successfully. Jan 13 20:52:34.809144 containerd[1540]: time="2025-01-13T20:52:34.809108988Z" level=info msg="CreateContainer within sandbox \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 20:52:34.831287 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3746635753.mount: Deactivated successfully. 
Jan 13 20:52:34.843509 containerd[1540]: time="2025-01-13T20:52:34.843475438Z" level=info msg="CreateContainer within sandbox \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\"" Jan 13 20:52:34.844574 containerd[1540]: time="2025-01-13T20:52:34.844395539Z" level=info msg="StartContainer for \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\"" Jan 13 20:52:34.866424 kubelet[2861]: I0113 20:52:34.866280 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac" Jan 13 20:52:34.867277 containerd[1540]: time="2025-01-13T20:52:34.866883130Z" level=info msg="StopPodSandbox for \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\"" Jan 13 20:52:34.867277 containerd[1540]: time="2025-01-13T20:52:34.867053811Z" level=info msg="Ensure that sandbox 48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac in task-service has been cleanup successfully" Jan 13 20:52:34.867932 containerd[1540]: time="2025-01-13T20:52:34.867855519Z" level=info msg="TearDown network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\" successfully" Jan 13 20:52:34.867932 containerd[1540]: time="2025-01-13T20:52:34.867870214Z" level=info msg="StopPodSandbox for \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\" returns successfully" Jan 13 20:52:34.868198 containerd[1540]: time="2025-01-13T20:52:34.868149596Z" level=info msg="StopPodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\"" Jan 13 20:52:34.868241 containerd[1540]: time="2025-01-13T20:52:34.868224245Z" level=info msg="TearDown network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" successfully" Jan 13 20:52:34.868241 containerd[1540]: 
time="2025-01-13T20:52:34.868234982Z" level=info msg="StopPodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" returns successfully" Jan 13 20:52:34.868637 containerd[1540]: time="2025-01-13T20:52:34.868496393Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\"" Jan 13 20:52:34.868637 containerd[1540]: time="2025-01-13T20:52:34.868549687Z" level=info msg="TearDown network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" successfully" Jan 13 20:52:34.868637 containerd[1540]: time="2025-01-13T20:52:34.868559128Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" returns successfully" Jan 13 20:52:34.868880 containerd[1540]: time="2025-01-13T20:52:34.868822260Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" Jan 13 20:52:34.869048 containerd[1540]: time="2025-01-13T20:52:34.868948847Z" level=info msg="TearDown network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" successfully" Jan 13 20:52:34.869048 containerd[1540]: time="2025-01-13T20:52:34.868961120Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" returns successfully" Jan 13 20:52:34.869377 containerd[1540]: time="2025-01-13T20:52:34.869235063Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:34.869377 containerd[1540]: time="2025-01-13T20:52:34.869298542Z" level=info msg="TearDown network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" successfully" Jan 13 20:52:34.869377 containerd[1540]: time="2025-01-13T20:52:34.869307876Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" returns successfully" Jan 13 20:52:34.869916 containerd[1540]: 
time="2025-01-13T20:52:34.869766160Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:34.869916 containerd[1540]: time="2025-01-13T20:52:34.869835898Z" level=info msg="TearDown network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:34.869916 containerd[1540]: time="2025-01-13T20:52:34.869879673Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:34.870012 kubelet[2861]: I0113 20:52:34.869918 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85" Jan 13 20:52:34.870284 containerd[1540]: time="2025-01-13T20:52:34.870258444Z" level=info msg="StopPodSandbox for \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\"" Jan 13 20:52:34.870799 containerd[1540]: time="2025-01-13T20:52:34.870409221Z" level=info msg="Ensure that sandbox a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85 in task-service has been cleanup successfully" Jan 13 20:52:34.870799 containerd[1540]: time="2025-01-13T20:52:34.870480281Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:34.870799 containerd[1540]: time="2025-01-13T20:52:34.870554904Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:34.870799 containerd[1540]: time="2025-01-13T20:52:34.870568323Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:34.870799 containerd[1540]: time="2025-01-13T20:52:34.870557472Z" level=info msg="TearDown network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\" successfully" Jan 13 
20:52:34.870799 containerd[1540]: time="2025-01-13T20:52:34.870599725Z" level=info msg="StopPodSandbox for \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\" returns successfully" Jan 13 20:52:34.870988 containerd[1540]: time="2025-01-13T20:52:34.870898944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:7,}" Jan 13 20:52:34.871221 containerd[1540]: time="2025-01-13T20:52:34.871201786Z" level=info msg="StopPodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\"" Jan 13 20:52:34.871268 containerd[1540]: time="2025-01-13T20:52:34.871257697Z" level=info msg="TearDown network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" successfully" Jan 13 20:52:34.871299 containerd[1540]: time="2025-01-13T20:52:34.871268141Z" level=info msg="StopPodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" returns successfully" Jan 13 20:52:34.871506 containerd[1540]: time="2025-01-13T20:52:34.871471458Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\"" Jan 13 20:52:34.871629 containerd[1540]: time="2025-01-13T20:52:34.871526027Z" level=info msg="TearDown network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" successfully" Jan 13 20:52:34.871629 containerd[1540]: time="2025-01-13T20:52:34.871536228Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" returns successfully" Jan 13 20:52:34.871698 containerd[1540]: time="2025-01-13T20:52:34.871677876Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" Jan 13 20:52:34.872277 containerd[1540]: time="2025-01-13T20:52:34.871732158Z" level=info msg="TearDown network for sandbox 
\"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" successfully" Jan 13 20:52:34.872277 containerd[1540]: time="2025-01-13T20:52:34.871743957Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" returns successfully" Jan 13 20:52:34.872684 containerd[1540]: time="2025-01-13T20:52:34.872342093Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:34.872684 containerd[1540]: time="2025-01-13T20:52:34.872443602Z" level=info msg="TearDown network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" successfully" Jan 13 20:52:34.872684 containerd[1540]: time="2025-01-13T20:52:34.872453424Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" returns successfully" Jan 13 20:52:34.873437 containerd[1540]: time="2025-01-13T20:52:34.873418072Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:34.873493 containerd[1540]: time="2025-01-13T20:52:34.873475691Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:34.873493 containerd[1540]: time="2025-01-13T20:52:34.873488757Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:34.874548 containerd[1540]: time="2025-01-13T20:52:34.874206214Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:34.874548 containerd[1540]: time="2025-01-13T20:52:34.874261601Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:34.874548 containerd[1540]: time="2025-01-13T20:52:34.874270969Z" level=info msg="StopPodSandbox for 
\"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:34.874744 containerd[1540]: time="2025-01-13T20:52:34.874730080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:7,}" Jan 13 20:52:34.875409 kubelet[2861]: I0113 20:52:34.875173 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55" Jan 13 20:52:34.875654 containerd[1540]: time="2025-01-13T20:52:34.875642921Z" level=info msg="StopPodSandbox for \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\"" Jan 13 20:52:34.875785 containerd[1540]: time="2025-01-13T20:52:34.875775290Z" level=info msg="Ensure that sandbox 1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55 in task-service has been cleanup successfully" Jan 13 20:52:34.876007 containerd[1540]: time="2025-01-13T20:52:34.875996482Z" level=info msg="TearDown network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\" successfully" Jan 13 20:52:34.876184 containerd[1540]: time="2025-01-13T20:52:34.876040543Z" level=info msg="StopPodSandbox for \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\" returns successfully" Jan 13 20:52:34.876404 containerd[1540]: time="2025-01-13T20:52:34.876305721Z" level=info msg="StopPodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\"" Jan 13 20:52:34.876404 containerd[1540]: time="2025-01-13T20:52:34.876343548Z" level=info msg="TearDown network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" successfully" Jan 13 20:52:34.876601 containerd[1540]: time="2025-01-13T20:52:34.876481654Z" level=info msg="StopPodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" returns successfully" Jan 13 20:52:34.876875 
containerd[1540]: time="2025-01-13T20:52:34.876714153Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\""
Jan 13 20:52:34.876875 containerd[1540]: time="2025-01-13T20:52:34.876753874Z" level=info msg="TearDown network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" successfully"
Jan 13 20:52:34.876875 containerd[1540]: time="2025-01-13T20:52:34.876760337Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" returns successfully"
Jan 13 20:52:34.877062 containerd[1540]: time="2025-01-13T20:52:34.877042997Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\""
Jan 13 20:52:34.877298 containerd[1540]: time="2025-01-13T20:52:34.877142501Z" level=info msg="TearDown network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" successfully"
Jan 13 20:52:34.877298 containerd[1540]: time="2025-01-13T20:52:34.877154216Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" returns successfully"
Jan 13 20:52:34.877585 containerd[1540]: time="2025-01-13T20:52:34.877566474Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\""
Jan 13 20:52:34.877634 containerd[1540]: time="2025-01-13T20:52:34.877620597Z" level=info msg="TearDown network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" successfully"
Jan 13 20:52:34.877682 containerd[1540]: time="2025-01-13T20:52:34.877633017Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" returns successfully"
Jan 13 20:52:34.877929 containerd[1540]: time="2025-01-13T20:52:34.877909985Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\""
Jan 13 20:52:34.877971 containerd[1540]: time="2025-01-13T20:52:34.877963585Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully"
Jan 13 20:52:34.878011 containerd[1540]: time="2025-01-13T20:52:34.877973539Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully"
Jan 13 20:52:34.878324 containerd[1540]: time="2025-01-13T20:52:34.878306112Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:6,}"
Jan 13 20:52:34.878747 kubelet[2861]: I0113 20:52:34.878736 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7"
Jan 13 20:52:34.879169 containerd[1540]: time="2025-01-13T20:52:34.879106449Z" level=info msg="StopPodSandbox for \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\""
Jan 13 20:52:34.879255 containerd[1540]: time="2025-01-13T20:52:34.879237055Z" level=info msg="Ensure that sandbox cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7 in task-service has been cleanup successfully"
Jan 13 20:52:34.879387 containerd[1540]: time="2025-01-13T20:52:34.879370022Z" level=info msg="TearDown network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\" successfully"
Jan 13 20:52:34.879387 containerd[1540]: time="2025-01-13T20:52:34.879382734Z" level=info msg="StopPodSandbox for \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\" returns successfully"
Jan 13 20:52:34.879601 containerd[1540]: time="2025-01-13T20:52:34.879571070Z" level=info msg="StopPodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\""
Jan 13 20:52:34.880517 containerd[1540]: time="2025-01-13T20:52:34.880246419Z" level=info msg="TearDown network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" successfully"
Jan 13 20:52:34.880517 containerd[1540]: time="2025-01-13T20:52:34.880264396Z" level=info msg="StopPodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" returns successfully"
Jan 13 20:52:34.880748 containerd[1540]: time="2025-01-13T20:52:34.880734453Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\""
Jan 13 20:52:34.880881 containerd[1540]: time="2025-01-13T20:52:34.880850423Z" level=info msg="TearDown network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" successfully"
Jan 13 20:52:34.881591 containerd[1540]: time="2025-01-13T20:52:34.881445831Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" returns successfully"
Jan 13 20:52:34.881900 containerd[1540]: time="2025-01-13T20:52:34.881796808Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\""
Jan 13 20:52:34.881900 containerd[1540]: time="2025-01-13T20:52:34.881871038Z" level=info msg="TearDown network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" successfully"
Jan 13 20:52:34.882580 containerd[1540]: time="2025-01-13T20:52:34.881881341Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" returns successfully"
Jan 13 20:52:34.883401 containerd[1540]: time="2025-01-13T20:52:34.883380790Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\""
Jan 13 20:52:34.883592 containerd[1540]: time="2025-01-13T20:52:34.883499374Z" level=info msg="TearDown network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" successfully"
Jan 13 20:52:34.883592 containerd[1540]: time="2025-01-13T20:52:34.883512831Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" returns successfully"
Jan 13 20:52:34.884052 containerd[1540]: time="2025-01-13T20:52:34.883856293Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\""
Jan 13 20:52:34.884052 containerd[1540]: time="2025-01-13T20:52:34.883921871Z" level=info msg="TearDown network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully"
Jan 13 20:52:34.884052 containerd[1540]: time="2025-01-13T20:52:34.883931314Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully"
Jan 13 20:52:34.884640 containerd[1540]: time="2025-01-13T20:52:34.884394979Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\""
Jan 13 20:52:34.884640 containerd[1540]: time="2025-01-13T20:52:34.884481130Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully"
Jan 13 20:52:34.884640 containerd[1540]: time="2025-01-13T20:52:34.884491012Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully"
Jan 13 20:52:34.886360 containerd[1540]: time="2025-01-13T20:52:34.886237876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:7,}"
Jan 13 20:52:34.887239 kubelet[2861]: I0113 20:52:34.887222 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82"
Jan 13 20:52:34.887740 containerd[1540]: time="2025-01-13T20:52:34.887665568Z" level=info msg="StopPodSandbox for \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\""
Jan 13 20:52:34.888023 containerd[1540]: time="2025-01-13T20:52:34.887910950Z" level=info msg="Ensure that sandbox 8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82 in task-service has been cleanup successfully"
Jan 13 20:52:34.888681 containerd[1540]: time="2025-01-13T20:52:34.888100520Z" level=info msg="TearDown network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\" successfully"
Jan 13 20:52:34.888681 containerd[1540]: time="2025-01-13T20:52:34.888109617Z" level=info msg="StopPodSandbox for \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\" returns successfully"
Jan 13 20:52:34.888681 containerd[1540]: time="2025-01-13T20:52:34.888286757Z" level=info msg="StopPodSandbox for \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\""
Jan 13 20:52:34.888681 containerd[1540]: time="2025-01-13T20:52:34.888338348Z" level=info msg="TearDown network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" successfully"
Jan 13 20:52:34.888681 containerd[1540]: time="2025-01-13T20:52:34.888359334Z" level=info msg="StopPodSandbox for \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" returns successfully"
Jan 13 20:52:34.888681 containerd[1540]: time="2025-01-13T20:52:34.888494395Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\""
Jan 13 20:52:34.888681 containerd[1540]: time="2025-01-13T20:52:34.888541542Z" level=info msg="TearDown network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" successfully"
Jan 13 20:52:34.888681 containerd[1540]: time="2025-01-13T20:52:34.888548326Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" returns successfully"
Jan 13 20:52:34.888942 containerd[1540]: time="2025-01-13T20:52:34.888692551Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\""
Jan 13 20:52:34.888942 containerd[1540]: time="2025-01-13T20:52:34.888742073Z" level=info msg="TearDown network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" successfully"
Jan 13 20:52:34.888942 containerd[1540]: time="2025-01-13T20:52:34.888751484Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" returns successfully"
Jan 13 20:52:34.889138 containerd[1540]: time="2025-01-13T20:52:34.889119616Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\""
Jan 13 20:52:34.889265 containerd[1540]: time="2025-01-13T20:52:34.889195798Z" level=info msg="TearDown network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" successfully"
Jan 13 20:52:34.889265 containerd[1540]: time="2025-01-13T20:52:34.889214287Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" returns successfully"
Jan 13 20:52:34.889520 containerd[1540]: time="2025-01-13T20:52:34.889502480Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\""
Jan 13 20:52:34.889564 containerd[1540]: time="2025-01-13T20:52:34.889556630Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully"
Jan 13 20:52:34.889598 containerd[1540]: time="2025-01-13T20:52:34.889566310Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully"
Jan 13 20:52:34.889801 containerd[1540]: time="2025-01-13T20:52:34.889783800Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\""
Jan 13 20:52:34.889909 containerd[1540]: time="2025-01-13T20:52:34.889838732Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully"
Jan 13 20:52:34.889909 containerd[1540]: time="2025-01-13T20:52:34.889848348Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully"
Jan 13 20:52:34.890252 containerd[1540]: time="2025-01-13T20:52:34.890234081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:7,}"
Jan 13 20:52:34.890631 kubelet[2861]: I0113 20:52:34.890620 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd"
Jan 13 20:52:34.890913 containerd[1540]: time="2025-01-13T20:52:34.890895926Z" level=info msg="StopPodSandbox for \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\""
Jan 13 20:52:34.891033 containerd[1540]: time="2025-01-13T20:52:34.891016142Z" level=info msg="Ensure that sandbox 21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd in task-service has been cleanup successfully"
Jan 13 20:52:34.891163 containerd[1540]: time="2025-01-13T20:52:34.891130173Z" level=info msg="TearDown network for sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\" successfully"
Jan 13 20:52:34.891163 containerd[1540]: time="2025-01-13T20:52:34.891143382Z" level=info msg="StopPodSandbox for \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\" returns successfully"
Jan 13 20:52:34.899963 containerd[1540]: time="2025-01-13T20:52:34.899934497Z" level=info msg="StopPodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\""
Jan 13 20:52:34.900062 containerd[1540]: time="2025-01-13T20:52:34.900019601Z" level=info msg="TearDown network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" successfully"
Jan 13 20:52:34.900062 containerd[1540]: time="2025-01-13T20:52:34.900031736Z" level=info msg="StopPodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" returns successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.900537926Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\""
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.900613001Z" level=info msg="TearDown network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.900621504Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" returns successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.900777156Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\""
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.900821415Z" level=info msg="TearDown network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.900827902Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" returns successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.900982310Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\""
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901081290Z" level=info msg="TearDown network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901088105Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" returns successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901223011Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\""
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901265188Z" level=info msg="TearDown network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901295320Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901482589Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\""
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901552210Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901559514Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully"
Jan 13 20:52:34.907601 containerd[1540]: time="2025-01-13T20:52:34.901923114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:7,}"
Jan 13 20:52:35.013454 systemd[1]: Started cri-containerd-6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc.scope - libcontainer container 6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc.
Jan 13 20:52:35.040396 containerd[1540]: time="2025-01-13T20:52:35.040345660Z" level=info msg="StartContainer for \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\" returns successfully"
Jan 13 20:52:35.147392 containerd[1540]: time="2025-01-13T20:52:35.147288163Z" level=error msg="Failed to destroy network for sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.162916 containerd[1540]: time="2025-01-13T20:52:35.162776488Z" level=error msg="encountered an error cleaning up failed sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.162916 containerd[1540]: time="2025-01-13T20:52:35.162839504Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.163154 kubelet[2861]: E0113 20:52:35.163140 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.163474 kubelet[2861]: E0113 20:52:35.163261 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc"
Jan 13 20:52:35.163474 kubelet[2861]: E0113 20:52:35.163277 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7qmqc"
Jan 13 20:52:35.163474 kubelet[2861]: E0113 20:52:35.163315 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7qmqc_calico-system(6b0be92d-fb03-4015-90fc-415d37c2d78b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7qmqc" podUID="6b0be92d-fb03-4015-90fc-415d37c2d78b"
Jan 13 20:52:35.260084 containerd[1540]: time="2025-01-13T20:52:35.260014713Z" level=error msg="Failed to destroy network for sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.260473 containerd[1540]: time="2025-01-13T20:52:35.260367763Z" level=error msg="encountered an error cleaning up failed sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.260473 containerd[1540]: time="2025-01-13T20:52:35.260406427Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.260831 kubelet[2861]: E0113 20:52:35.260610 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.260831 kubelet[2861]: E0113 20:52:35.260643 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b"
Jan 13 20:52:35.260831 kubelet[2861]: E0113 20:52:35.260658 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-f9v7b"
Jan 13 20:52:35.260956 kubelet[2861]: E0113 20:52:35.260697 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-f9v7b_kube-system(dd0ed670-6fae-4b3f-8750-1689ff0c62c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-f9v7b" podUID="dd0ed670-6fae-4b3f-8750-1689ff0c62c3"
Jan 13 20:52:35.299933 containerd[1540]: time="2025-01-13T20:52:35.299460901Z" level=error msg="Failed to destroy network for sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.299933 containerd[1540]: time="2025-01-13T20:52:35.299797510Z" level=error msg="encountered an error cleaning up failed sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.299933 containerd[1540]: time="2025-01-13T20:52:35.299835613Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:7,} failed, error" error="failed to setup network for sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.300914 kubelet[2861]: E0113 20:52:35.300294 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.300914 kubelet[2861]: E0113 20:52:35.300338 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6"
Jan 13 20:52:35.300914 kubelet[2861]: E0113 20:52:35.300367 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6"
Jan 13 20:52:35.301004 kubelet[2861]: E0113 20:52:35.300406 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-cbpp6_calico-apiserver(c9c96bfa-10f6-4dae-9986-cb25e73d9966)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podUID="c9c96bfa-10f6-4dae-9986-cb25e73d9966"
Jan 13 20:52:35.341202 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Jan 13 20:52:35.344052 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld. All Rights Reserved.
Jan 13 20:52:35.352644 containerd[1540]: time="2025-01-13T20:52:35.352600866Z" level=error msg="Failed to destroy network for sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.354260 containerd[1540]: time="2025-01-13T20:52:35.353747946Z" level=error msg="encountered an error cleaning up failed sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.354517 containerd[1540]: time="2025-01-13T20:52:35.354412844Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.354773 kubelet[2861]: E0113 20:52:35.354753 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.354813 kubelet[2861]: E0113 20:52:35.354804 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8"
Jan 13 20:52:35.354848 kubelet[2861]: E0113 20:52:35.354821 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2wvd8"
Jan 13 20:52:35.354877 kubelet[2861]: E0113 20:52:35.354866 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2wvd8_kube-system(42cec424-d73c-431a-b548-ae49975c9420)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2wvd8" podUID="42cec424-d73c-431a-b548-ae49975c9420"
Jan 13 20:52:35.375910 containerd[1540]: time="2025-01-13T20:52:35.375815057Z" level=error msg="Failed to destroy network for sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.376753 containerd[1540]: time="2025-01-13T20:52:35.376082004Z" level=error msg="encountered an error cleaning up failed sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.376753 containerd[1540]: time="2025-01-13T20:52:35.376137723Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:7,} failed, error" error="failed to setup network for sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.376859 kubelet[2861]: E0113 20:52:35.376335 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.376859 kubelet[2861]: E0113 20:52:35.376406 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw"
Jan 13 20:52:35.376859 kubelet[2861]: E0113 20:52:35.376428 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw"
Jan 13 20:52:35.377047 kubelet[2861]: E0113 20:52:35.376484 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-565485b44-5l4bw_calico-apiserver(26da34a0-8538-4fcf-9a78-f93cb2d6a0ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podUID="26da34a0-8538-4fcf-9a78-f93cb2d6a0ef"
Jan 13 20:52:35.383285 containerd[1540]: time="2025-01-13T20:52:35.383246721Z" level=error msg="Failed to destroy network for sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.383482 containerd[1540]: time="2025-01-13T20:52:35.383466574Z" level=error msg="encountered an error cleaning up failed sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.383524 containerd[1540]: time="2025-01-13T20:52:35.383508635Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.383683 kubelet[2861]: E0113 20:52:35.383662 2861 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:52:35.389426 kubelet[2861]: E0113 20:52:35.383709 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b"
Jan 13 20:52:35.389426 kubelet[2861]: E0113 20:52:35.383724 2861 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b"
Jan 13 20:52:35.389426 kubelet[2861]: E0113 20:52:35.383771 2861 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86544f5f57-nbx6b_calico-system(b067aee8-97d0-47ca-9359-80c070636930)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podUID="b067aee8-97d0-47ca-9359-80c070636930"
Jan 13 20:52:35.584323 systemd[1]: run-netns-cni\x2d47540f46\x2de619\x2d90c9\x2d5d3a\x2dcf40162ae444.mount: Deactivated successfully.
Jan 13 20:52:35.584546 systemd[1]: run-netns-cni\x2d582b1837\x2d2375\x2d3757\x2df48d\x2d47db2c2dd480.mount: Deactivated successfully.
Jan 13 20:52:35.584595 systemd[1]: run-netns-cni\x2d151b76ae\x2da3d2\x2dbe9a\x2d7b72\x2d46094de1a403.mount: Deactivated successfully.
Jan 13 20:52:35.584642 systemd[1]: run-netns-cni\x2d0b43fe25\x2dec62\x2d6b10\x2d0a22\x2d94470ad0744b.mount: Deactivated successfully.
Jan 13 20:52:35.584676 systemd[1]: run-netns-cni\x2d0301d4f7\x2d64cd\x2d7882\x2dc1e2\x2d93cb6d4948d0.mount: Deactivated successfully.
Jan 13 20:52:35.584705 systemd[1]: run-netns-cni\x2d2aab4a1c\x2d5be9\x2d4dad\x2dcdec\x2d18b08ed14088.mount: Deactivated successfully.
Jan 13 20:52:35.919802 kubelet[2861]: I0113 20:52:35.916858 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213" Jan 13 20:52:35.923372 containerd[1540]: time="2025-01-13T20:52:35.923104651Z" level=info msg="StopPodSandbox for \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\"" Jan 13 20:52:35.923372 containerd[1540]: time="2025-01-13T20:52:35.923242880Z" level=info msg="Ensure that sandbox 8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213 in task-service has been cleanup successfully" Jan 13 20:52:35.927673 containerd[1540]: time="2025-01-13T20:52:35.926669810Z" level=info msg="TearDown network for sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\" successfully" Jan 13 20:52:35.928474 containerd[1540]: time="2025-01-13T20:52:35.928440029Z" level=info msg="StopPodSandbox for \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\" returns successfully" Jan 13 20:52:35.928777 containerd[1540]: time="2025-01-13T20:52:35.928691948Z" level=info msg="StopPodSandbox for \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\"" Jan 13 20:52:35.928879 containerd[1540]: time="2025-01-13T20:52:35.928869108Z" level=info msg="TearDown network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\" successfully" Jan 13 20:52:35.928928 containerd[1540]: time="2025-01-13T20:52:35.928919823Z" level=info msg="StopPodSandbox for \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\" returns successfully" Jan 13 20:52:35.929286 containerd[1540]: time="2025-01-13T20:52:35.929073579Z" level=info msg="StopPodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\"" Jan 13 20:52:35.929683 containerd[1540]: time="2025-01-13T20:52:35.929668095Z" level=info msg="TearDown network for sandbox 
\"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" successfully" Jan 13 20:52:35.929683 containerd[1540]: time="2025-01-13T20:52:35.929679196Z" level=info msg="StopPodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" returns successfully" Jan 13 20:52:35.930490 systemd[1]: run-netns-cni\x2da6a71381\x2d6ada\x2d80b2\x2d2898\x2d4b2c841ef99b.mount: Deactivated successfully. Jan 13 20:52:35.931637 containerd[1540]: time="2025-01-13T20:52:35.931559181Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\"" Jan 13 20:52:35.931808 containerd[1540]: time="2025-01-13T20:52:35.931709856Z" level=info msg="TearDown network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" successfully" Jan 13 20:52:35.931808 containerd[1540]: time="2025-01-13T20:52:35.931720470Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" returns successfully" Jan 13 20:52:35.932857 containerd[1540]: time="2025-01-13T20:52:35.932710844Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\"" Jan 13 20:52:35.932857 containerd[1540]: time="2025-01-13T20:52:35.932778962Z" level=info msg="TearDown network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" successfully" Jan 13 20:52:35.932857 containerd[1540]: time="2025-01-13T20:52:35.932789733Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" returns successfully" Jan 13 20:52:35.933955 containerd[1540]: time="2025-01-13T20:52:35.933141710Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" Jan 13 20:52:35.933955 containerd[1540]: time="2025-01-13T20:52:35.933202885Z" level=info msg="TearDown network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" 
successfully" Jan 13 20:52:35.933955 containerd[1540]: time="2025-01-13T20:52:35.933212225Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" returns successfully" Jan 13 20:52:35.934079 kubelet[2861]: I0113 20:52:35.933416 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a" Jan 13 20:52:35.934473 containerd[1540]: time="2025-01-13T20:52:35.934378973Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:35.937376 containerd[1540]: time="2025-01-13T20:52:35.934552606Z" level=info msg="TearDown network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully" Jan 13 20:52:35.937376 containerd[1540]: time="2025-01-13T20:52:35.934562270Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully" Jan 13 20:52:35.937376 containerd[1540]: time="2025-01-13T20:52:35.934712478Z" level=info msg="StopPodSandbox for \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\"" Jan 13 20:52:35.937376 containerd[1540]: time="2025-01-13T20:52:35.934838787Z" level=info msg="Ensure that sandbox 8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a in task-service has been cleanup successfully" Jan 13 20:52:35.936734 systemd[1]: run-netns-cni\x2d9562efa6\x2d252b\x2d8441\x2d3625\x2dffa26e63c71b.mount: Deactivated successfully. 
Jan 13 20:52:35.937960 containerd[1540]: time="2025-01-13T20:52:35.937943834Z" level=info msg="TearDown network for sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\" successfully" Jan 13 20:52:35.938072 containerd[1540]: time="2025-01-13T20:52:35.938000619Z" level=info msg="StopPodSandbox for \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\" returns successfully" Jan 13 20:52:35.946285 containerd[1540]: time="2025-01-13T20:52:35.945906188Z" level=info msg="StopPodSandbox for \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\"" Jan 13 20:52:35.946285 containerd[1540]: time="2025-01-13T20:52:35.945969949Z" level=info msg="TearDown network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\" successfully" Jan 13 20:52:35.946285 containerd[1540]: time="2025-01-13T20:52:35.945977241Z" level=info msg="StopPodSandbox for \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\" returns successfully" Jan 13 20:52:35.946285 containerd[1540]: time="2025-01-13T20:52:35.946025800Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:35.946285 containerd[1540]: time="2025-01-13T20:52:35.946061051Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:35.946285 containerd[1540]: time="2025-01-13T20:52:35.946066637Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully" Jan 13 20:52:35.947258 containerd[1540]: time="2025-01-13T20:52:35.946995962Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:8,}" Jan 13 20:52:35.947598 containerd[1540]: time="2025-01-13T20:52:35.947581515Z" level=info msg="StopPodSandbox for 
\"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\"" Jan 13 20:52:35.947815 containerd[1540]: time="2025-01-13T20:52:35.947709218Z" level=info msg="TearDown network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" successfully" Jan 13 20:52:35.947815 containerd[1540]: time="2025-01-13T20:52:35.947717902Z" level=info msg="StopPodSandbox for \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" returns successfully" Jan 13 20:52:35.951167 containerd[1540]: time="2025-01-13T20:52:35.951053454Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\"" Jan 13 20:52:35.951430 containerd[1540]: time="2025-01-13T20:52:35.951420715Z" level=info msg="TearDown network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" successfully" Jan 13 20:52:35.951520 containerd[1540]: time="2025-01-13T20:52:35.951487830Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" returns successfully" Jan 13 20:52:35.952122 containerd[1540]: time="2025-01-13T20:52:35.951996078Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\"" Jan 13 20:52:35.952122 containerd[1540]: time="2025-01-13T20:52:35.952040101Z" level=info msg="TearDown network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" successfully" Jan 13 20:52:35.952122 containerd[1540]: time="2025-01-13T20:52:35.952046458Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" returns successfully" Jan 13 20:52:35.953267 containerd[1540]: time="2025-01-13T20:52:35.953157620Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" Jan 13 20:52:35.953267 containerd[1540]: time="2025-01-13T20:52:35.953202790Z" level=info msg="TearDown network for sandbox 
\"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" successfully" Jan 13 20:52:35.953267 containerd[1540]: time="2025-01-13T20:52:35.953209153Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" returns successfully" Jan 13 20:52:35.954070 containerd[1540]: time="2025-01-13T20:52:35.954059789Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:35.954230 containerd[1540]: time="2025-01-13T20:52:35.954139428Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully" Jan 13 20:52:35.954230 containerd[1540]: time="2025-01-13T20:52:35.954146952Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully" Jan 13 20:52:35.954594 containerd[1540]: time="2025-01-13T20:52:35.954412556Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:35.954594 containerd[1540]: time="2025-01-13T20:52:35.954452650Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully" Jan 13 20:52:35.954594 containerd[1540]: time="2025-01-13T20:52:35.954458534Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:35.955238 kubelet[2861]: I0113 20:52:35.954966 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63" Jan 13 20:52:35.955287 containerd[1540]: time="2025-01-13T20:52:35.955042481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:8,}" Jan 13 20:52:35.956498 containerd[1540]: 
time="2025-01-13T20:52:35.956480371Z" level=info msg="StopPodSandbox for \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\"" Jan 13 20:52:35.956749 containerd[1540]: time="2025-01-13T20:52:35.956723545Z" level=info msg="Ensure that sandbox 8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63 in task-service has been cleanup successfully" Jan 13 20:52:35.957155 containerd[1540]: time="2025-01-13T20:52:35.957144737Z" level=info msg="TearDown network for sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\" successfully" Jan 13 20:52:35.957250 containerd[1540]: time="2025-01-13T20:52:35.957240538Z" level=info msg="StopPodSandbox for \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\" returns successfully" Jan 13 20:52:35.962205 systemd[1]: run-netns-cni\x2dec40db05\x2d9a1a\x2dbe8e\x2d805a\x2de2b0a037e2d8.mount: Deactivated successfully. Jan 13 20:52:35.964133 containerd[1540]: time="2025-01-13T20:52:35.963777948Z" level=info msg="StopPodSandbox for \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\"" Jan 13 20:52:35.965875 containerd[1540]: time="2025-01-13T20:52:35.965255914Z" level=info msg="TearDown network for sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\" successfully" Jan 13 20:52:35.965875 containerd[1540]: time="2025-01-13T20:52:35.965267002Z" level=info msg="StopPodSandbox for \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\" returns successfully" Jan 13 20:52:35.969298 containerd[1540]: time="2025-01-13T20:52:35.968823985Z" level=info msg="StopPodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\"" Jan 13 20:52:35.969298 containerd[1540]: time="2025-01-13T20:52:35.968920804Z" level=info msg="TearDown network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" successfully" Jan 13 20:52:35.969298 containerd[1540]: time="2025-01-13T20:52:35.968928795Z" level=info 
msg="StopPodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" returns successfully" Jan 13 20:52:35.973182 containerd[1540]: time="2025-01-13T20:52:35.970011103Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\"" Jan 13 20:52:35.973182 containerd[1540]: time="2025-01-13T20:52:35.970793686Z" level=info msg="TearDown network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" successfully" Jan 13 20:52:35.973182 containerd[1540]: time="2025-01-13T20:52:35.970865236Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" returns successfully" Jan 13 20:52:35.973337 containerd[1540]: time="2025-01-13T20:52:35.973283038Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\"" Jan 13 20:52:35.973462 containerd[1540]: time="2025-01-13T20:52:35.973434687Z" level=info msg="TearDown network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" successfully" Jan 13 20:52:35.973462 containerd[1540]: time="2025-01-13T20:52:35.973448235Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" returns successfully" Jan 13 20:52:35.977053 containerd[1540]: time="2025-01-13T20:52:35.976944066Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" Jan 13 20:52:35.977053 containerd[1540]: time="2025-01-13T20:52:35.976996540Z" level=info msg="TearDown network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" successfully" Jan 13 20:52:35.977053 containerd[1540]: time="2025-01-13T20:52:35.977004191Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" returns successfully" Jan 13 20:52:35.977982 containerd[1540]: time="2025-01-13T20:52:35.977224051Z" level=info 
msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:35.977982 containerd[1540]: time="2025-01-13T20:52:35.977273035Z" level=info msg="TearDown network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully" Jan 13 20:52:35.977982 containerd[1540]: time="2025-01-13T20:52:35.977279406Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully" Jan 13 20:52:35.977982 containerd[1540]: time="2025-01-13T20:52:35.977638623Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:35.977982 containerd[1540]: time="2025-01-13T20:52:35.977677347Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:35.977982 containerd[1540]: time="2025-01-13T20:52:35.977683194Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:35.978106 kubelet[2861]: I0113 20:52:35.977882 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0" Jan 13 20:52:35.978459 containerd[1540]: time="2025-01-13T20:52:35.978442420Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:8,}" Jan 13 20:52:35.979009 containerd[1540]: time="2025-01-13T20:52:35.978676838Z" level=info msg="StopPodSandbox for \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\"" Jan 13 20:52:35.979176 containerd[1540]: time="2025-01-13T20:52:35.979164884Z" level=info msg="Ensure that sandbox 5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0 in task-service has been cleanup successfully" Jan 13 
20:52:35.979412 containerd[1540]: time="2025-01-13T20:52:35.979401537Z" level=info msg="TearDown network for sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\" successfully" Jan 13 20:52:35.979748 containerd[1540]: time="2025-01-13T20:52:35.979736854Z" level=info msg="StopPodSandbox for \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\" returns successfully" Jan 13 20:52:35.980193 containerd[1540]: time="2025-01-13T20:52:35.980183246Z" level=info msg="StopPodSandbox for \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\"" Jan 13 20:52:35.980457 containerd[1540]: time="2025-01-13T20:52:35.980360556Z" level=info msg="TearDown network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\" successfully" Jan 13 20:52:35.980532 containerd[1540]: time="2025-01-13T20:52:35.980520936Z" level=info msg="StopPodSandbox for \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\" returns successfully" Jan 13 20:52:35.981656 containerd[1540]: time="2025-01-13T20:52:35.981630909Z" level=info msg="StopPodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\"" Jan 13 20:52:35.981838 containerd[1540]: time="2025-01-13T20:52:35.981827342Z" level=info msg="TearDown network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" successfully" Jan 13 20:52:35.986261 containerd[1540]: time="2025-01-13T20:52:35.986222201Z" level=info msg="StopPodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" returns successfully" Jan 13 20:52:35.987195 kubelet[2861]: I0113 20:52:35.987179 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533" Jan 13 20:52:35.987885 containerd[1540]: time="2025-01-13T20:52:35.987852116Z" level=info msg="StopPodSandbox for \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\"" Jan 
13 20:52:35.988945 containerd[1540]: time="2025-01-13T20:52:35.988923128Z" level=info msg="Ensure that sandbox bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533 in task-service has been cleanup successfully" Jan 13 20:52:35.989759 containerd[1540]: time="2025-01-13T20:52:35.989306767Z" level=info msg="TearDown network for sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\" successfully" Jan 13 20:52:35.989759 containerd[1540]: time="2025-01-13T20:52:35.989755225Z" level=info msg="StopPodSandbox for \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\" returns successfully" Jan 13 20:52:35.991044 containerd[1540]: time="2025-01-13T20:52:35.990903706Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\"" Jan 13 20:52:35.991744 containerd[1540]: time="2025-01-13T20:52:35.991370181Z" level=info msg="TearDown network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" successfully" Jan 13 20:52:35.991744 containerd[1540]: time="2025-01-13T20:52:35.991382786Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" returns successfully" Jan 13 20:52:35.993358 containerd[1540]: time="2025-01-13T20:52:35.993329179Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" Jan 13 20:52:35.994382 containerd[1540]: time="2025-01-13T20:52:35.993521214Z" level=info msg="TearDown network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" successfully" Jan 13 20:52:35.994382 containerd[1540]: time="2025-01-13T20:52:35.993536010Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" returns successfully" Jan 13 20:52:35.994382 containerd[1540]: time="2025-01-13T20:52:35.993607828Z" level=info msg="StopPodSandbox for 
\"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\"" Jan 13 20:52:35.994382 containerd[1540]: time="2025-01-13T20:52:35.993687239Z" level=info msg="TearDown network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\" successfully" Jan 13 20:52:35.994382 containerd[1540]: time="2025-01-13T20:52:35.993697690Z" level=info msg="StopPodSandbox for \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\" returns successfully" Jan 13 20:52:35.997014 containerd[1540]: time="2025-01-13T20:52:35.996994273Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:35.997093 containerd[1540]: time="2025-01-13T20:52:35.997060259Z" level=info msg="TearDown network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" successfully" Jan 13 20:52:35.997093 containerd[1540]: time="2025-01-13T20:52:35.997067414Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" returns successfully" Jan 13 20:52:35.997138 containerd[1540]: time="2025-01-13T20:52:35.997097807Z" level=info msg="StopPodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\"" Jan 13 20:52:35.997206 containerd[1540]: time="2025-01-13T20:52:35.997138803Z" level=info msg="TearDown network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" successfully" Jan 13 20:52:35.997206 containerd[1540]: time="2025-01-13T20:52:35.997145054Z" level=info msg="StopPodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" returns successfully" Jan 13 20:52:35.997540 containerd[1540]: time="2025-01-13T20:52:35.997525634Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:35.998869 containerd[1540]: time="2025-01-13T20:52:35.998839802Z" level=info msg="TearDown network for sandbox 
\"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:35.998869 containerd[1540]: time="2025-01-13T20:52:35.998862650Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:35.998988 containerd[1540]: time="2025-01-13T20:52:35.997705259Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\"" Jan 13 20:52:35.999049 containerd[1540]: time="2025-01-13T20:52:35.999032218Z" level=info msg="TearDown network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" successfully" Jan 13 20:52:35.999155 containerd[1540]: time="2025-01-13T20:52:35.999143905Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" returns successfully" Jan 13 20:52:36.005494 containerd[1540]: time="2025-01-13T20:52:36.005467041Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:36.005588 containerd[1540]: time="2025-01-13T20:52:36.005554159Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:36.005611 containerd[1540]: time="2025-01-13T20:52:36.005581397Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:36.006861 containerd[1540]: time="2025-01-13T20:52:36.006836686Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\"" Jan 13 20:52:36.007021 containerd[1540]: time="2025-01-13T20:52:36.006961791Z" level=info msg="TearDown network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" successfully" Jan 13 20:52:36.007021 containerd[1540]: time="2025-01-13T20:52:36.006971280Z" level=info msg="StopPodSandbox for 
\"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" returns successfully" Jan 13 20:52:36.008647 containerd[1540]: time="2025-01-13T20:52:36.008630152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:8,}" Jan 13 20:52:36.009133 containerd[1540]: time="2025-01-13T20:52:36.008837207Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" Jan 13 20:52:36.009133 containerd[1540]: time="2025-01-13T20:52:36.008920189Z" level=info msg="TearDown network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" successfully" Jan 13 20:52:36.009133 containerd[1540]: time="2025-01-13T20:52:36.008928233Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" returns successfully" Jan 13 20:52:36.009419 containerd[1540]: time="2025-01-13T20:52:36.009398385Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:36.011555 kubelet[2861]: I0113 20:52:36.010566 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d" Jan 13 20:52:36.011690 containerd[1540]: time="2025-01-13T20:52:36.011135169Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully" Jan 13 20:52:36.011690 containerd[1540]: time="2025-01-13T20:52:36.011147599Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully" Jan 13 20:52:36.011742 containerd[1540]: time="2025-01-13T20:52:36.011273171Z" level=info msg="StopPodSandbox for \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\"" Jan 13 20:52:36.011882 containerd[1540]: time="2025-01-13T20:52:36.011863473Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:7,}" Jan 13 20:52:36.011979 containerd[1540]: time="2025-01-13T20:52:36.011966246Z" level=info msg="Ensure that sandbox 2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d in task-service has been cleanup successfully" Jan 13 20:52:36.013182 containerd[1540]: time="2025-01-13T20:52:36.013161605Z" level=info msg="TearDown network for sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\" successfully" Jan 13 20:52:36.013182 containerd[1540]: time="2025-01-13T20:52:36.013177221Z" level=info msg="StopPodSandbox for \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\" returns successfully" Jan 13 20:52:36.013993 containerd[1540]: time="2025-01-13T20:52:36.013967443Z" level=info msg="StopPodSandbox for \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\"" Jan 13 20:52:36.014744 containerd[1540]: time="2025-01-13T20:52:36.014726383Z" level=info msg="TearDown network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\" successfully" Jan 13 20:52:36.014744 containerd[1540]: time="2025-01-13T20:52:36.014741154Z" level=info msg="StopPodSandbox for \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\" returns successfully" Jan 13 20:52:36.016081 containerd[1540]: time="2025-01-13T20:52:36.016056519Z" level=info msg="StopPodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\"" Jan 13 20:52:36.017467 containerd[1540]: time="2025-01-13T20:52:36.017430914Z" level=info msg="TearDown network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" successfully" Jan 13 20:52:36.017558 containerd[1540]: time="2025-01-13T20:52:36.017458313Z" level=info msg="StopPodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" returns successfully" Jan 13 
20:52:36.019166 containerd[1540]: time="2025-01-13T20:52:36.019144609Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\"" Jan 13 20:52:36.019230 containerd[1540]: time="2025-01-13T20:52:36.019205278Z" level=info msg="TearDown network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" successfully" Jan 13 20:52:36.019230 containerd[1540]: time="2025-01-13T20:52:36.019219260Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" returns successfully" Jan 13 20:52:36.019686 containerd[1540]: time="2025-01-13T20:52:36.019643348Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" Jan 13 20:52:36.019728 containerd[1540]: time="2025-01-13T20:52:36.019699658Z" level=info msg="TearDown network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" successfully" Jan 13 20:52:36.019728 containerd[1540]: time="2025-01-13T20:52:36.019706081Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" returns successfully" Jan 13 20:52:36.029647 containerd[1540]: time="2025-01-13T20:52:36.029610781Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:36.030346 containerd[1540]: time="2025-01-13T20:52:36.029822938Z" level=info msg="TearDown network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" successfully" Jan 13 20:52:36.030346 containerd[1540]: time="2025-01-13T20:52:36.029971175Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" returns successfully" Jan 13 20:52:36.034950 containerd[1540]: time="2025-01-13T20:52:36.031436707Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:36.034950 
containerd[1540]: time="2025-01-13T20:52:36.031484767Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:36.034950 containerd[1540]: time="2025-01-13T20:52:36.031707671Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:36.034950 containerd[1540]: time="2025-01-13T20:52:36.031972393Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:36.034950 containerd[1540]: time="2025-01-13T20:52:36.032161340Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:36.034950 containerd[1540]: time="2025-01-13T20:52:36.032461158Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:36.034950 containerd[1540]: time="2025-01-13T20:52:36.033101589Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:8,}" Jan 13 20:52:36.593039 systemd[1]: run-netns-cni\x2d693678e0\x2db1e8\x2dcf35\x2d503c\x2dcc10815dc3bd.mount: Deactivated successfully. Jan 13 20:52:36.593466 systemd[1]: run-netns-cni\x2d02b27d17\x2d5df0\x2d451e\x2db7c2\x2d64743900bbd1.mount: Deactivated successfully. Jan 13 20:52:36.594186 systemd[1]: run-netns-cni\x2d31dbcbc5\x2d2d35\x2d71cd\x2dc98a\x2dd2fb2a047ae8.mount: Deactivated successfully. 
Jan 13 20:52:36.596228 systemd-networkd[1457]: cali64a14b3f383: Link UP Jan 13 20:52:36.596912 systemd-networkd[1457]: cali64a14b3f383: Gained carrier Jan 13 20:52:36.614912 systemd-networkd[1457]: cali40bd859ab58: Link UP Jan 13 20:52:36.615550 systemd-networkd[1457]: cali40bd859ab58: Gained carrier Jan 13 20:52:36.621120 kubelet[2861]: I0113 20:52:36.620656 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-kwhkn" podStartSLOduration=3.169263435 podStartE2EDuration="28.613276448s" podCreationTimestamp="2025-01-13 20:52:08 +0000 UTC" firstStartedPulling="2025-01-13 20:52:09.149984417 +0000 UTC m=+21.256269146" lastFinishedPulling="2025-01-13 20:52:34.593997429 +0000 UTC m=+46.700282159" observedRunningTime="2025-01-13 20:52:35.925779001 +0000 UTC m=+48.032063734" watchObservedRunningTime="2025-01-13 20:52:36.613276448 +0000 UTC m=+48.719561181" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.116 [INFO][5210] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.130 [INFO][5210] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--7qmqc-eth0 csi-node-driver- calico-system 6b0be92d-fb03-4015-90fc-415d37c2d78b 597 0 2025-01-13 20:52:08 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b695c467 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-7qmqc eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali64a14b3f383 [] []}} ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Namespace="calico-system" Pod="csi-node-driver-7qmqc" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--7qmqc-" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.130 [INFO][5210] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Namespace="calico-system" Pod="csi-node-driver-7qmqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--7qmqc-eth0" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.523 [INFO][5240] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" HandleID="k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Workload="localhost-k8s-csi--node--driver--7qmqc-eth0" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.548 [INFO][5240] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" HandleID="k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Workload="localhost-k8s-csi--node--driver--7qmqc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e5320), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-7qmqc", "timestamp":"2025-01-13 20:52:36.523426429 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.548 [INFO][5240] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.550 [INFO][5240] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.550 [INFO][5240] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.551 [INFO][5240] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.560 [INFO][5240] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.563 [INFO][5240] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.564 [INFO][5240] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.565 [INFO][5240] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.565 [INFO][5240] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.565 [INFO][5240] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976 Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.569 [INFO][5240] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.571 [INFO][5240] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.571 [INFO][5240] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" host="localhost" Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.571 [INFO][5240] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:52:36.621438 containerd[1540]: 2025-01-13 20:52:36.571 [INFO][5240] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" HandleID="k8s-pod-network.9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Workload="localhost-k8s-csi--node--driver--7qmqc-eth0" Jan 13 20:52:36.623156 containerd[1540]: 2025-01-13 20:52:36.573 [INFO][5210] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Namespace="calico-system" Pod="csi-node-driver-7qmqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--7qmqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7qmqc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6b0be92d-fb03-4015-90fc-415d37c2d78b", ResourceVersion:"597", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-7qmqc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali64a14b3f383", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.623156 containerd[1540]: 2025-01-13 20:52:36.573 [INFO][5210] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Namespace="calico-system" Pod="csi-node-driver-7qmqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--7qmqc-eth0" Jan 13 20:52:36.623156 containerd[1540]: 2025-01-13 20:52:36.574 [INFO][5210] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali64a14b3f383 ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Namespace="calico-system" Pod="csi-node-driver-7qmqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--7qmqc-eth0" Jan 13 20:52:36.623156 containerd[1540]: 2025-01-13 20:52:36.598 [INFO][5210] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Namespace="calico-system" Pod="csi-node-driver-7qmqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--7qmqc-eth0" Jan 13 20:52:36.623156 containerd[1540]: 2025-01-13 20:52:36.598 [INFO][5210] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Namespace="calico-system" 
Pod="csi-node-driver-7qmqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--7qmqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7qmqc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6b0be92d-fb03-4015-90fc-415d37c2d78b", ResourceVersion:"597", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976", Pod:"csi-node-driver-7qmqc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali64a14b3f383", MAC:"fa:65:a7:a6:2f:ba", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.623156 containerd[1540]: 2025-01-13 20:52:36.614 [INFO][5210] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976" Namespace="calico-system" Pod="csi-node-driver-7qmqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--7qmqc-eth0" Jan 13 20:52:36.628650 containerd[1540]: 
2025-01-13 20:52:36.091 [INFO][5195] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.105 [INFO][5195] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--76f75df574--2wvd8-eth0 coredns-76f75df574- kube-system 42cec424-d73c-431a-b548-ae49975c9420 739 0 2025-01-13 20:52:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-76f75df574-2wvd8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali40bd859ab58 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Namespace="kube-system" Pod="coredns-76f75df574-2wvd8" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--2wvd8-" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.105 [INFO][5195] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Namespace="kube-system" Pod="coredns-76f75df574-2wvd8" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--2wvd8-eth0" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.523 [INFO][5237] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" HandleID="k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Workload="localhost-k8s-coredns--76f75df574--2wvd8-eth0" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.549 [INFO][5237] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" HandleID="k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" 
Workload="localhost-k8s-coredns--76f75df574--2wvd8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003993a0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-76f75df574-2wvd8", "timestamp":"2025-01-13 20:52:36.523568857 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.549 [INFO][5237] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.571 [INFO][5237] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.571 [INFO][5237] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.573 [INFO][5237] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.577 [INFO][5237] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.582 [INFO][5237] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.589 [INFO][5237] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.590 [INFO][5237] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.590 [INFO][5237] ipam/ipam.go 1180: Attempting to assign 1 addresses from block 
block=192.168.88.128/26 handle="k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.591 [INFO][5237] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159 Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.597 [INFO][5237] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.604 [INFO][5237] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.604 [INFO][5237] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" host="localhost" Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.604 [INFO][5237] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 13 20:52:36.628650 containerd[1540]: 2025-01-13 20:52:36.604 [INFO][5237] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" HandleID="k8s-pod-network.1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Workload="localhost-k8s-coredns--76f75df574--2wvd8-eth0" Jan 13 20:52:36.629177 containerd[1540]: 2025-01-13 20:52:36.610 [INFO][5195] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Namespace="kube-system" Pod="coredns-76f75df574-2wvd8" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--2wvd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--2wvd8-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"42cec424-d73c-431a-b548-ae49975c9420", ResourceVersion:"739", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-76f75df574-2wvd8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali40bd859ab58", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.629177 containerd[1540]: 2025-01-13 20:52:36.610 [INFO][5195] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Namespace="kube-system" Pod="coredns-76f75df574-2wvd8" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--2wvd8-eth0" Jan 13 20:52:36.629177 containerd[1540]: 2025-01-13 20:52:36.610 [INFO][5195] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali40bd859ab58 ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Namespace="kube-system" Pod="coredns-76f75df574-2wvd8" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--2wvd8-eth0" Jan 13 20:52:36.629177 containerd[1540]: 2025-01-13 20:52:36.616 [INFO][5195] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Namespace="kube-system" Pod="coredns-76f75df574-2wvd8" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--2wvd8-eth0" Jan 13 20:52:36.629177 containerd[1540]: 2025-01-13 20:52:36.617 [INFO][5195] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Namespace="kube-system" Pod="coredns-76f75df574-2wvd8" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--2wvd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--2wvd8-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"42cec424-d73c-431a-b548-ae49975c9420", ResourceVersion:"739", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159", Pod:"coredns-76f75df574-2wvd8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali40bd859ab58", MAC:"d2:d6:60:1e:fb:06", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.629177 containerd[1540]: 2025-01-13 20:52:36.625 [INFO][5195] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159" Namespace="kube-system" 
Pod="coredns-76f75df574-2wvd8" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--2wvd8-eth0" Jan 13 20:52:36.660806 containerd[1540]: time="2025-01-13T20:52:36.660505587Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:36.660806 containerd[1540]: time="2025-01-13T20:52:36.660565236Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:36.660806 containerd[1540]: time="2025-01-13T20:52:36.660576058Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.660806 containerd[1540]: time="2025-01-13T20:52:36.660625455Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.661789 containerd[1540]: time="2025-01-13T20:52:36.661611319Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:36.661789 containerd[1540]: time="2025-01-13T20:52:36.661655718Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:36.661789 containerd[1540]: time="2025-01-13T20:52:36.661666397Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.676132 containerd[1540]: time="2025-01-13T20:52:36.664884233Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.693876 systemd[1]: run-containerd-runc-k8s.io-1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159-runc.cmCdiv.mount: Deactivated successfully. 
Jan 13 20:52:36.697820 systemd-networkd[1457]: cali3bdc2b905e9: Link UP Jan 13 20:52:36.699291 systemd-networkd[1457]: cali3bdc2b905e9: Gained carrier Jan 13 20:52:36.710249 systemd[1]: Started cri-containerd-1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159.scope - libcontainer container 1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159. Jan 13 20:52:36.727201 systemd[1]: Started cri-containerd-9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976.scope - libcontainer container 9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976. Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.041 [INFO][5158] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.095 [INFO][5158] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0 calico-kube-controllers-86544f5f57- calico-system b067aee8-97d0-47ca-9359-80c070636930 737 0 2025-01-13 20:52:08 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:86544f5f57 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-86544f5f57-nbx6b eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3bdc2b905e9 [] []}} ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Namespace="calico-system" Pod="calico-kube-controllers-86544f5f57-nbx6b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.095 [INFO][5158] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" 
Namespace="calico-system" Pod="calico-kube-controllers-86544f5f57-nbx6b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.523 [INFO][5234] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.549 [INFO][5234] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00038ff00), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-86544f5f57-nbx6b", "timestamp":"2025-01-13 20:52:36.523504345 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.549 [INFO][5234] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.604 [INFO][5234] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.604 [INFO][5234] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.607 [INFO][5234] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.614 [INFO][5234] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.622 [INFO][5234] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.625 [INFO][5234] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.633 [INFO][5234] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.633 [INFO][5234] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.636 [INFO][5234] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.645 [INFO][5234] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.676 [INFO][5234] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.676 [INFO][5234] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" host="localhost" Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.676 [INFO][5234] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:52:36.730628 containerd[1540]: 2025-01-13 20:52:36.676 [INFO][5234] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:52:36.731826 containerd[1540]: 2025-01-13 20:52:36.695 [INFO][5158] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Namespace="calico-system" Pod="calico-kube-controllers-86544f5f57-nbx6b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0", GenerateName:"calico-kube-controllers-86544f5f57-", Namespace:"calico-system", SelfLink:"", UID:"b067aee8-97d0-47ca-9359-80c070636930", ResourceVersion:"737", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86544f5f57", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-86544f5f57-nbx6b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3bdc2b905e9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.731826 containerd[1540]: 2025-01-13 20:52:36.695 [INFO][5158] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Namespace="calico-system" Pod="calico-kube-controllers-86544f5f57-nbx6b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:52:36.731826 containerd[1540]: 2025-01-13 20:52:36.695 [INFO][5158] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3bdc2b905e9 ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Namespace="calico-system" Pod="calico-kube-controllers-86544f5f57-nbx6b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:52:36.731826 containerd[1540]: 2025-01-13 20:52:36.700 [INFO][5158] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Namespace="calico-system" Pod="calico-kube-controllers-86544f5f57-nbx6b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:52:36.731826 containerd[1540]: 2025-01-13 20:52:36.703 [INFO][5158] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Namespace="calico-system" Pod="calico-kube-controllers-86544f5f57-nbx6b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0", GenerateName:"calico-kube-controllers-86544f5f57-", Namespace:"calico-system", SelfLink:"", UID:"b067aee8-97d0-47ca-9359-80c070636930", ResourceVersion:"737", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86544f5f57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d", Pod:"calico-kube-controllers-86544f5f57-nbx6b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3bdc2b905e9", MAC:"2a:e9:8f:9c:fd:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.731826 containerd[1540]: 2025-01-13 20:52:36.725 [INFO][5158] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Namespace="calico-system" Pod="calico-kube-controllers-86544f5f57-nbx6b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:52:36.752217 systemd-resolved[1409]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:52:36.754420 systemd-networkd[1457]: cali1c00b518512: Link UP Jan 13 20:52:36.756418 systemd-networkd[1457]: cali1c00b518512: Gained carrier Jan 13 20:52:36.772931 containerd[1540]: time="2025-01-13T20:52:36.772473247Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:36.773169 containerd[1540]: time="2025-01-13T20:52:36.773116482Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:36.773524 containerd[1540]: time="2025-01-13T20:52:36.773173803Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.775531 containerd[1540]: time="2025-01-13T20:52:36.775481249Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.049 [INFO][5182] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.094 [INFO][5182] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0 calico-apiserver-565485b44- calico-apiserver 26da34a0-8538-4fcf-9a78-f93cb2d6a0ef 738 0 2025-01-13 20:52:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:565485b44 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-565485b44-5l4bw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1c00b518512 [] []}} ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-5l4bw" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--5l4bw-" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.094 [INFO][5182] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-5l4bw" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.523 [INFO][5238] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" HandleID="k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Workload="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.549 
[INFO][5238] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" HandleID="k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Workload="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000515f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-565485b44-5l4bw", "timestamp":"2025-01-13 20:52:36.523531308 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.550 [INFO][5238] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.680 [INFO][5238] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.680 [INFO][5238] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.692 [INFO][5238] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.705 [INFO][5238] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.713 [INFO][5238] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.714 [INFO][5238] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.716 [INFO][5238] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.716 [INFO][5238] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.726 [INFO][5238] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177 Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.731 [INFO][5238] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.740 [INFO][5238] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.741 [INFO][5238] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" host="localhost" Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.741 [INFO][5238] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:52:36.778934 containerd[1540]: 2025-01-13 20:52:36.741 [INFO][5238] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" HandleID="k8s-pod-network.964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Workload="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" Jan 13 20:52:36.779380 containerd[1540]: 2025-01-13 20:52:36.748 [INFO][5182] cni-plugin/k8s.go 386: Populated endpoint ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-5l4bw" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0", GenerateName:"calico-apiserver-565485b44-", Namespace:"calico-apiserver", SelfLink:"", UID:"26da34a0-8538-4fcf-9a78-f93cb2d6a0ef", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"565485b44", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-565485b44-5l4bw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1c00b518512", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.779380 containerd[1540]: 2025-01-13 20:52:36.748 [INFO][5182] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-5l4bw" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" Jan 13 20:52:36.779380 containerd[1540]: 2025-01-13 20:52:36.748 [INFO][5182] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1c00b518512 ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-5l4bw" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" Jan 13 20:52:36.779380 containerd[1540]: 2025-01-13 20:52:36.757 [INFO][5182] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-5l4bw" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" Jan 13 20:52:36.779380 containerd[1540]: 2025-01-13 20:52:36.760 [INFO][5182] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-5l4bw" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0", GenerateName:"calico-apiserver-565485b44-", Namespace:"calico-apiserver", SelfLink:"", UID:"26da34a0-8538-4fcf-9a78-f93cb2d6a0ef", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"565485b44", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177", Pod:"calico-apiserver-565485b44-5l4bw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1c00b518512", MAC:"d2:4d:03:68:a6:62", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.779380 containerd[1540]: 2025-01-13 20:52:36.771 [INFO][5182] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177" Namespace="calico-apiserver" 
Pod="calico-apiserver-565485b44-5l4bw" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--5l4bw-eth0" Jan 13 20:52:36.794645 systemd[1]: Started cri-containerd-0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d.scope - libcontainer container 0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d. Jan 13 20:52:36.816876 containerd[1540]: time="2025-01-13T20:52:36.816811501Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:36.817365 containerd[1540]: time="2025-01-13T20:52:36.816866954Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:36.817365 containerd[1540]: time="2025-01-13T20:52:36.816878756Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.817365 containerd[1540]: time="2025-01-13T20:52:36.816939045Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.819987 systemd-resolved[1409]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:52:36.839864 systemd[1]: Started cri-containerd-964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177.scope - libcontainer container 964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177. 
Jan 13 20:52:36.846205 systemd-networkd[1457]: cali089d8ad2779: Link UP Jan 13 20:52:36.847414 systemd-networkd[1457]: cali089d8ad2779: Gained carrier Jan 13 20:52:36.869032 containerd[1540]: time="2025-01-13T20:52:36.868999834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2wvd8,Uid:42cec424-d73c-431a-b548-ae49975c9420,Namespace:kube-system,Attempt:7,} returns sandbox id \"1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159\"" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.008 [INFO][5139] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.093 [INFO][5139] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0 calico-apiserver-565485b44- calico-apiserver c9c96bfa-10f6-4dae-9986-cb25e73d9966 735 0 2025-01-13 20:52:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:565485b44 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-565485b44-cbpp6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali089d8ad2779 [] []}} ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-cbpp6" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--cbpp6-" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.093 [INFO][5139] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-cbpp6" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" Jan 13 20:52:36.872670 
containerd[1540]: 2025-01-13 20:52:36.523 [INFO][5236] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" HandleID="k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Workload="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.550 [INFO][5236] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" HandleID="k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Workload="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00027e640), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-565485b44-cbpp6", "timestamp":"2025-01-13 20:52:36.523470509 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.550 [INFO][5236] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.741 [INFO][5236] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.741 [INFO][5236] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.747 [INFO][5236] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.768 [INFO][5236] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.780 [INFO][5236] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.785 [INFO][5236] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.796 [INFO][5236] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.796 [INFO][5236] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.803 [INFO][5236] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.816 [INFO][5236] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.826 [INFO][5236] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.827 [INFO][5236] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" host="localhost" Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.827 [INFO][5236] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:52:36.872670 containerd[1540]: 2025-01-13 20:52:36.827 [INFO][5236] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" HandleID="k8s-pod-network.2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Workload="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" Jan 13 20:52:36.873093 containerd[1540]: 2025-01-13 20:52:36.835 [INFO][5139] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-cbpp6" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0", GenerateName:"calico-apiserver-565485b44-", Namespace:"calico-apiserver", SelfLink:"", UID:"c9c96bfa-10f6-4dae-9986-cb25e73d9966", ResourceVersion:"735", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"565485b44", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-565485b44-cbpp6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali089d8ad2779", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.873093 containerd[1540]: 2025-01-13 20:52:36.835 [INFO][5139] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-cbpp6" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" Jan 13 20:52:36.873093 containerd[1540]: 2025-01-13 20:52:36.835 [INFO][5139] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali089d8ad2779 ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-cbpp6" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" Jan 13 20:52:36.873093 containerd[1540]: 2025-01-13 20:52:36.847 [INFO][5139] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-cbpp6" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" Jan 13 20:52:36.873093 containerd[1540]: 2025-01-13 20:52:36.848 [INFO][5139] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Namespace="calico-apiserver" Pod="calico-apiserver-565485b44-cbpp6" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0", GenerateName:"calico-apiserver-565485b44-", Namespace:"calico-apiserver", SelfLink:"", UID:"c9c96bfa-10f6-4dae-9986-cb25e73d9966", ResourceVersion:"735", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"565485b44", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b", Pod:"calico-apiserver-565485b44-cbpp6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali089d8ad2779", MAC:"52:31:91:51:29:0f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.873093 containerd[1540]: 2025-01-13 20:52:36.868 [INFO][5139] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b" Namespace="calico-apiserver" 
Pod="calico-apiserver-565485b44-cbpp6" WorkloadEndpoint="localhost-k8s-calico--apiserver--565485b44--cbpp6-eth0" Jan 13 20:52:36.875835 containerd[1540]: time="2025-01-13T20:52:36.875200236Z" level=info msg="CreateContainer within sandbox \"1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:52:36.901802 systemd-networkd[1457]: cali6db7a756bc3: Link UP Jan 13 20:52:36.902529 systemd-networkd[1457]: cali6db7a756bc3: Gained carrier Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.114 [INFO][5208] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.127 [INFO][5208] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--76f75df574--f9v7b-eth0 coredns-76f75df574- kube-system dd0ed670-6fae-4b3f-8750-1689ff0c62c3 736 0 2025-01-13 20:52:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-76f75df574-f9v7b eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6db7a756bc3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Namespace="kube-system" Pod="coredns-76f75df574-f9v7b" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--f9v7b-" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.127 [INFO][5208] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Namespace="kube-system" Pod="coredns-76f75df574-f9v7b" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--f9v7b-eth0" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.523 [INFO][5239] ipam/ipam_plugin.go 225: Calico 
CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" HandleID="k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Workload="localhost-k8s-coredns--76f75df574--f9v7b-eth0" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.551 [INFO][5239] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" HandleID="k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Workload="localhost-k8s-coredns--76f75df574--f9v7b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003839f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-76f75df574-f9v7b", "timestamp":"2025-01-13 20:52:36.523587335 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.551 [INFO][5239] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.827 [INFO][5239] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.827 [INFO][5239] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.830 [INFO][5239] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.853 [INFO][5239] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.868 [INFO][5239] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.875 [INFO][5239] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.879 [INFO][5239] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.879 [INFO][5239] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.882 [INFO][5239] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.885 [INFO][5239] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.892 [INFO][5239] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.892 [INFO][5239] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" host="localhost" Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.892 [INFO][5239] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:52:36.916118 containerd[1540]: 2025-01-13 20:52:36.892 [INFO][5239] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" HandleID="k8s-pod-network.5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Workload="localhost-k8s-coredns--76f75df574--f9v7b-eth0" Jan 13 20:52:36.919853 containerd[1540]: 2025-01-13 20:52:36.898 [INFO][5208] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Namespace="kube-system" Pod="coredns-76f75df574-f9v7b" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--f9v7b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--f9v7b-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"dd0ed670-6fae-4b3f-8750-1689ff0c62c3", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-76f75df574-f9v7b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6db7a756bc3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.919853 containerd[1540]: 2025-01-13 20:52:36.899 [INFO][5208] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Namespace="kube-system" Pod="coredns-76f75df574-f9v7b" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--f9v7b-eth0" Jan 13 20:52:36.919853 containerd[1540]: 2025-01-13 20:52:36.899 [INFO][5208] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6db7a756bc3 ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Namespace="kube-system" Pod="coredns-76f75df574-f9v7b" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--f9v7b-eth0" Jan 13 20:52:36.919853 containerd[1540]: 2025-01-13 20:52:36.903 [INFO][5208] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Namespace="kube-system" Pod="coredns-76f75df574-f9v7b" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--f9v7b-eth0" Jan 13 
20:52:36.919853 containerd[1540]: 2025-01-13 20:52:36.903 [INFO][5208] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Namespace="kube-system" Pod="coredns-76f75df574-f9v7b" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--f9v7b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--f9v7b-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"dd0ed670-6fae-4b3f-8750-1689ff0c62c3", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 52, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad", Pod:"coredns-76f75df574-f9v7b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6db7a756bc3", MAC:"0e:6e:07:23:4b:fc", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:52:36.919853 containerd[1540]: 2025-01-13 20:52:36.909 [INFO][5208] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad" Namespace="kube-system" Pod="coredns-76f75df574-f9v7b" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--f9v7b-eth0" Jan 13 20:52:36.926564 containerd[1540]: time="2025-01-13T20:52:36.926289281Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:36.926564 containerd[1540]: time="2025-01-13T20:52:36.926338293Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:36.927588 containerd[1540]: time="2025-01-13T20:52:36.927182589Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.927983 containerd[1540]: time="2025-01-13T20:52:36.927861331Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:36.935237 systemd-resolved[1409]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:52:36.939044 containerd[1540]: time="2025-01-13T20:52:36.939019536Z" level=info msg="CreateContainer within sandbox \"1d533bf50c6598d0e05112c0e070cedf064341f0adcf33edf6d1aa44e7e47159\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"589e09149415cb35edc38344e1d68f04a3e9118a93e988ae89c0f67c92ecc9a3\"" Jan 13 20:52:36.941476 containerd[1540]: time="2025-01-13T20:52:36.941302497Z" level=info msg="StartContainer for \"589e09149415cb35edc38344e1d68f04a3e9118a93e988ae89c0f67c92ecc9a3\"" Jan 13 20:52:36.943343 systemd-resolved[1409]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:52:36.970144 containerd[1540]: time="2025-01-13T20:52:36.967879976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86544f5f57-nbx6b,Uid:b067aee8-97d0-47ca-9359-80c070636930,Namespace:calico-system,Attempt:8,} returns sandbox id \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\"" Jan 13 20:52:36.971291 containerd[1540]: time="2025-01-13T20:52:36.971271780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 13 20:52:36.994458 systemd[1]: Started cri-containerd-2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b.scope - libcontainer container 2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b. 
Jan 13 20:52:36.997004 containerd[1540]: time="2025-01-13T20:52:36.996582216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7qmqc,Uid:6b0be92d-fb03-4015-90fc-415d37c2d78b,Namespace:calico-system,Attempt:8,} returns sandbox id \"9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976\"" Jan 13 20:52:37.001016 containerd[1540]: time="2025-01-13T20:52:37.000973112Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:52:37.001380 containerd[1540]: time="2025-01-13T20:52:37.001009224Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:52:37.001380 containerd[1540]: time="2025-01-13T20:52:37.001311952Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:37.001446 containerd[1540]: time="2025-01-13T20:52:37.001429842Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:52:37.024653 systemd-resolved[1409]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:52:37.035573 systemd[1]: Started cri-containerd-589e09149415cb35edc38344e1d68f04a3e9118a93e988ae89c0f67c92ecc9a3.scope - libcontainer container 589e09149415cb35edc38344e1d68f04a3e9118a93e988ae89c0f67c92ecc9a3. Jan 13 20:52:37.049528 systemd[1]: Started cri-containerd-5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad.scope - libcontainer container 5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad. 
Jan 13 20:52:37.066271 containerd[1540]: time="2025-01-13T20:52:37.066118216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-5l4bw,Uid:26da34a0-8538-4fcf-9a78-f93cb2d6a0ef,Namespace:calico-apiserver,Attempt:8,} returns sandbox id \"964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177\"" Jan 13 20:52:37.085251 systemd-resolved[1409]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:52:37.097525 containerd[1540]: time="2025-01-13T20:52:37.097468727Z" level=info msg="StartContainer for \"589e09149415cb35edc38344e1d68f04a3e9118a93e988ae89c0f67c92ecc9a3\" returns successfully" Jan 13 20:52:37.116342 containerd[1540]: time="2025-01-13T20:52:37.116276537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-f9v7b,Uid:dd0ed670-6fae-4b3f-8750-1689ff0c62c3,Namespace:kube-system,Attempt:8,} returns sandbox id \"5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad\"" Jan 13 20:52:37.125243 containerd[1540]: time="2025-01-13T20:52:37.125084642Z" level=info msg="CreateContainer within sandbox \"5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:52:37.146869 containerd[1540]: time="2025-01-13T20:52:37.146839986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-565485b44-cbpp6,Uid:c9c96bfa-10f6-4dae-9986-cb25e73d9966,Namespace:calico-apiserver,Attempt:8,} returns sandbox id \"2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b\"" Jan 13 20:52:37.153713 containerd[1540]: time="2025-01-13T20:52:37.153622799Z" level=info msg="CreateContainer within sandbox \"5df27e6517aef30b1a8bdac40d67b9c0fe3e99beec8892cf2b3aa923dae77aad\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"2e48638bf7266d26aa82987cec4ef9a984cbc769eb8a3bc8907a441ac3112f74\"" Jan 13 20:52:37.154364 containerd[1540]: 
time="2025-01-13T20:52:37.154244701Z" level=info msg="StartContainer for \"2e48638bf7266d26aa82987cec4ef9a984cbc769eb8a3bc8907a441ac3112f74\"" Jan 13 20:52:37.186495 systemd[1]: Started cri-containerd-2e48638bf7266d26aa82987cec4ef9a984cbc769eb8a3bc8907a441ac3112f74.scope - libcontainer container 2e48638bf7266d26aa82987cec4ef9a984cbc769eb8a3bc8907a441ac3112f74. Jan 13 20:52:37.216184 containerd[1540]: time="2025-01-13T20:52:37.216157875Z" level=info msg="StartContainer for \"2e48638bf7266d26aa82987cec4ef9a984cbc769eb8a3bc8907a441ac3112f74\" returns successfully" Jan 13 20:52:37.321639 kernel: bpftool[5787]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 13 20:52:37.501406 systemd-networkd[1457]: vxlan.calico: Link UP Jan 13 20:52:37.501411 systemd-networkd[1457]: vxlan.calico: Gained carrier Jan 13 20:52:38.005666 systemd-networkd[1457]: cali40bd859ab58: Gained IPv6LL Jan 13 20:52:38.066278 kubelet[2861]: I0113 20:52:38.066253 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-2wvd8" podStartSLOduration=35.06622737 podStartE2EDuration="35.06622737s" podCreationTimestamp="2025-01-13 20:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:52:38.065536012 +0000 UTC m=+50.171820751" watchObservedRunningTime="2025-01-13 20:52:38.06622737 +0000 UTC m=+50.172512098" Jan 13 20:52:38.069438 systemd-networkd[1457]: cali64a14b3f383: Gained IPv6LL Jan 13 20:52:38.132037 kubelet[2861]: I0113 20:52:38.131773 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-f9v7b" podStartSLOduration=35.131747695 podStartE2EDuration="35.131747695s" podCreationTimestamp="2025-01-13 20:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:52:38.131410142 +0000 UTC 
m=+50.237694882" watchObservedRunningTime="2025-01-13 20:52:38.131747695 +0000 UTC m=+50.238032435" Jan 13 20:52:38.261500 systemd-networkd[1457]: cali3bdc2b905e9: Gained IPv6LL Jan 13 20:52:38.325538 systemd-networkd[1457]: cali6db7a756bc3: Gained IPv6LL Jan 13 20:52:38.325727 systemd-networkd[1457]: cali1c00b518512: Gained IPv6LL Jan 13 20:52:38.453498 systemd-networkd[1457]: cali089d8ad2779: Gained IPv6LL Jan 13 20:52:38.581453 systemd-networkd[1457]: vxlan.calico: Gained IPv6LL Jan 13 20:52:40.572984 containerd[1540]: time="2025-01-13T20:52:40.572948477Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:40.573458 containerd[1540]: time="2025-01-13T20:52:40.573399445Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 13 20:52:40.573864 containerd[1540]: time="2025-01-13T20:52:40.573790408Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:40.574831 containerd[1540]: time="2025-01-13T20:52:40.574817470Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:40.575504 containerd[1540]: time="2025-01-13T20:52:40.575207654Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 3.603914885s" Jan 13 20:52:40.575504 containerd[1540]: 
time="2025-01-13T20:52:40.575224846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 13 20:52:40.575806 containerd[1540]: time="2025-01-13T20:52:40.575791407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 13 20:52:40.667090 containerd[1540]: time="2025-01-13T20:52:40.666840547Z" level=info msg="CreateContainer within sandbox \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 13 20:52:40.673397 containerd[1540]: time="2025-01-13T20:52:40.673327777Z" level=info msg="CreateContainer within sandbox \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\"" Jan 13 20:52:40.674205 containerd[1540]: time="2025-01-13T20:52:40.674191558Z" level=info msg="StartContainer for \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\"" Jan 13 20:52:40.698566 systemd[1]: Started cri-containerd-a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1.scope - libcontainer container a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1. 
Jan 13 20:52:40.736690 containerd[1540]: time="2025-01-13T20:52:40.736609698Z" level=info msg="StartContainer for \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\" returns successfully" Jan 13 20:52:41.169064 kubelet[2861]: I0113 20:52:41.169038 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-86544f5f57-nbx6b" podStartSLOduration=29.564478857 podStartE2EDuration="33.16901217s" podCreationTimestamp="2025-01-13 20:52:08 +0000 UTC" firstStartedPulling="2025-01-13 20:52:36.970940922 +0000 UTC m=+49.077225652" lastFinishedPulling="2025-01-13 20:52:40.575474235 +0000 UTC m=+52.681758965" observedRunningTime="2025-01-13 20:52:41.140085494 +0000 UTC m=+53.246370233" watchObservedRunningTime="2025-01-13 20:52:41.16901217 +0000 UTC m=+53.275296904" Jan 13 20:52:43.374561 containerd[1540]: time="2025-01-13T20:52:43.374516272Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:43.375274 containerd[1540]: time="2025-01-13T20:52:43.374992604Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 13 20:52:43.375274 containerd[1540]: time="2025-01-13T20:52:43.375236552Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:43.376455 containerd[1540]: time="2025-01-13T20:52:43.376431697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:43.377099 containerd[1540]: time="2025-01-13T20:52:43.376829977Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id 
\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 2.801020213s" Jan 13 20:52:43.377099 containerd[1540]: time="2025-01-13T20:52:43.376845659Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 13 20:52:43.377245 containerd[1540]: time="2025-01-13T20:52:43.377230703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:52:43.381935 containerd[1540]: time="2025-01-13T20:52:43.381911900Z" level=info msg="CreateContainer within sandbox \"9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 13 20:52:43.395778 containerd[1540]: time="2025-01-13T20:52:43.395708558Z" level=info msg="CreateContainer within sandbox \"9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d58362f3987038b9136a9b64ff6b0feb18907eb072ecedbefbd62575ffb3f017\"" Jan 13 20:52:43.396259 containerd[1540]: time="2025-01-13T20:52:43.396186816Z" level=info msg="StartContainer for \"d58362f3987038b9136a9b64ff6b0feb18907eb072ecedbefbd62575ffb3f017\"" Jan 13 20:52:43.416359 systemd[1]: run-containerd-runc-k8s.io-d58362f3987038b9136a9b64ff6b0feb18907eb072ecedbefbd62575ffb3f017-runc.deC8Ic.mount: Deactivated successfully. Jan 13 20:52:43.422591 systemd[1]: Started cri-containerd-d58362f3987038b9136a9b64ff6b0feb18907eb072ecedbefbd62575ffb3f017.scope - libcontainer container d58362f3987038b9136a9b64ff6b0feb18907eb072ecedbefbd62575ffb3f017. 
Jan 13 20:52:43.451991 containerd[1540]: time="2025-01-13T20:52:43.451960542Z" level=info msg="StartContainer for \"d58362f3987038b9136a9b64ff6b0feb18907eb072ecedbefbd62575ffb3f017\" returns successfully" Jan 13 20:52:46.759743 containerd[1540]: time="2025-01-13T20:52:46.759113209Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:46.759743 containerd[1540]: time="2025-01-13T20:52:46.759507370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 13 20:52:46.759743 containerd[1540]: time="2025-01-13T20:52:46.759691270Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:46.761277 containerd[1540]: time="2025-01-13T20:52:46.761067736Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:46.761557 containerd[1540]: time="2025-01-13T20:52:46.761541655Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 3.384294491s" Jan 13 20:52:46.762900 containerd[1540]: time="2025-01-13T20:52:46.761557271Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 20:52:46.762900 containerd[1540]: time="2025-01-13T20:52:46.762025805Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:52:46.764170 containerd[1540]: time="2025-01-13T20:52:46.764145125Z" level=info msg="CreateContainer within sandbox \"964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 20:52:46.771219 containerd[1540]: time="2025-01-13T20:52:46.771152593Z" level=info msg="CreateContainer within sandbox \"964ead5208fb11dc8a2c6a7c066ab670ab8f9b5cffb965bb359338cba938f177\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"26769152e6d58c6f5ed0a3d9c8fa4146e9464d0b6ff5a8d54588f1272cd88ef5\"" Jan 13 20:52:46.772152 containerd[1540]: time="2025-01-13T20:52:46.771521636Z" level=info msg="StartContainer for \"26769152e6d58c6f5ed0a3d9c8fa4146e9464d0b6ff5a8d54588f1272cd88ef5\"" Jan 13 20:52:46.817454 systemd[1]: Started cri-containerd-26769152e6d58c6f5ed0a3d9c8fa4146e9464d0b6ff5a8d54588f1272cd88ef5.scope - libcontainer container 26769152e6d58c6f5ed0a3d9c8fa4146e9464d0b6ff5a8d54588f1272cd88ef5. 
Jan 13 20:52:46.846356 containerd[1540]: time="2025-01-13T20:52:46.846324744Z" level=info msg="StartContainer for \"26769152e6d58c6f5ed0a3d9c8fa4146e9464d0b6ff5a8d54588f1272cd88ef5\" returns successfully" Jan 13 20:52:47.107279 containerd[1540]: time="2025-01-13T20:52:47.107128145Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:47.107537 containerd[1540]: time="2025-01-13T20:52:47.107498246Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 13 20:52:47.110411 containerd[1540]: time="2025-01-13T20:52:47.110357811Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 348.310993ms" Jan 13 20:52:47.110411 containerd[1540]: time="2025-01-13T20:52:47.110378994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 20:52:47.111935 containerd[1540]: time="2025-01-13T20:52:47.111733613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 13 20:52:47.113257 containerd[1540]: time="2025-01-13T20:52:47.113143159Z" level=info msg="CreateContainer within sandbox \"2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 20:52:47.136249 containerd[1540]: time="2025-01-13T20:52:47.136222213Z" level=info msg="CreateContainer within sandbox \"2f20279b444cfe8afc5eaaa5aa8fc2448f39c79aa148537446aea909bc7ed93b\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e90b92b9eed24804d20d1857f571778d81ed598b9428af44a321e100508dc5b5\"" Jan 13 20:52:47.137577 containerd[1540]: time="2025-01-13T20:52:47.137376342Z" level=info msg="StartContainer for \"e90b92b9eed24804d20d1857f571778d81ed598b9428af44a321e100508dc5b5\"" Jan 13 20:52:47.172666 systemd[1]: Started cri-containerd-e90b92b9eed24804d20d1857f571778d81ed598b9428af44a321e100508dc5b5.scope - libcontainer container e90b92b9eed24804d20d1857f571778d81ed598b9428af44a321e100508dc5b5. Jan 13 20:52:47.222341 containerd[1540]: time="2025-01-13T20:52:47.222262592Z" level=info msg="StartContainer for \"e90b92b9eed24804d20d1857f571778d81ed598b9428af44a321e100508dc5b5\" returns successfully" Jan 13 20:52:47.390651 kubelet[2861]: I0113 20:52:47.390126 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-565485b44-5l4bw" podStartSLOduration=29.695782097 podStartE2EDuration="39.390082894s" podCreationTimestamp="2025-01-13 20:52:08 +0000 UTC" firstStartedPulling="2025-01-13 20:52:37.067587355 +0000 UTC m=+49.173872085" lastFinishedPulling="2025-01-13 20:52:46.76188815 +0000 UTC m=+58.868172882" observedRunningTime="2025-01-13 20:52:47.16919656 +0000 UTC m=+59.275481294" watchObservedRunningTime="2025-01-13 20:52:47.390082894 +0000 UTC m=+59.496367635" Jan 13 20:52:48.383896 containerd[1540]: time="2025-01-13T20:52:48.383698614Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:48.383896 containerd[1540]: time="2025-01-13T20:52:48.383820185Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:48.383896 containerd[1540]: time="2025-01-13T20:52:48.383831602Z" level=info msg="StopPodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:48.976858 
containerd[1540]: time="2025-01-13T20:52:48.976703590Z" level=info msg="RemovePodSandbox for \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:48.990655 containerd[1540]: time="2025-01-13T20:52:48.990624478Z" level=info msg="Forcibly stopping sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\"" Jan 13 20:52:49.009798 containerd[1540]: time="2025-01-13T20:52:48.990701351Z" level=info msg="TearDown network for sandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" successfully" Jan 13 20:52:49.140195 containerd[1540]: time="2025-01-13T20:52:49.140050524Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:52:49.176996 containerd[1540]: time="2025-01-13T20:52:49.176955108Z" level=info msg="RemovePodSandbox \"4b591838a9b5d343646d7cc9424b9d115dd2ce5c6606794e6db8b36766fe2cdd\" returns successfully" Jan 13 20:52:49.177369 containerd[1540]: time="2025-01-13T20:52:49.177343652Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:49.177441 containerd[1540]: time="2025-01-13T20:52:49.177426792Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:49.177470 containerd[1540]: time="2025-01-13T20:52:49.177439903Z" level=info msg="StopPodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:49.177592 containerd[1540]: time="2025-01-13T20:52:49.177577646Z" level=info msg="RemovePodSandbox for \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:49.177648 containerd[1540]: time="2025-01-13T20:52:49.177634407Z" level=info msg="Forcibly stopping 
sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\"" Jan 13 20:52:49.177717 containerd[1540]: time="2025-01-13T20:52:49.177689013Z" level=info msg="TearDown network for sandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" successfully" Jan 13 20:52:49.188748 kubelet[2861]: I0113 20:52:49.188366 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-565485b44-cbpp6" podStartSLOduration=31.226645032 podStartE2EDuration="41.188326294s" podCreationTimestamp="2025-01-13 20:52:08 +0000 UTC" firstStartedPulling="2025-01-13 20:52:37.1488725 +0000 UTC m=+49.255157231" lastFinishedPulling="2025-01-13 20:52:47.110553761 +0000 UTC m=+59.216838493" observedRunningTime="2025-01-13 20:52:49.152857125 +0000 UTC m=+61.259141861" watchObservedRunningTime="2025-01-13 20:52:49.188326294 +0000 UTC m=+61.294611036" Jan 13 20:52:49.262718 containerd[1540]: time="2025-01-13T20:52:49.262609021Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:49.263066 containerd[1540]: time="2025-01-13T20:52:49.262835917Z" level=info msg="RemovePodSandbox \"6b8a48a6dba708a595d27e5701cdc0bde8e817ac0a4e188e9783a135ad6d110a\" returns successfully" Jan 13 20:52:49.263345 containerd[1540]: time="2025-01-13T20:52:49.263333385Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:49.263589 containerd[1540]: time="2025-01-13T20:52:49.263515777Z" level=info msg="TearDown network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" successfully" Jan 13 20:52:49.263589 containerd[1540]: time="2025-01-13T20:52:49.263526485Z" level=info msg="StopPodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" returns successfully" Jan 13 20:52:49.263783 containerd[1540]: time="2025-01-13T20:52:49.263734323Z" level=info msg="RemovePodSandbox for \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:49.263783 containerd[1540]: time="2025-01-13T20:52:49.263780878Z" level=info msg="Forcibly stopping sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\"" Jan 13 20:52:49.263878 containerd[1540]: time="2025-01-13T20:52:49.263820600Z" level=info msg="TearDown network for sandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" successfully" Jan 13 20:52:49.379740 containerd[1540]: time="2025-01-13T20:52:49.379697299Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:49.379930 containerd[1540]: time="2025-01-13T20:52:49.379757641Z" level=info msg="RemovePodSandbox \"3814c9dc0a04e5d9d61f713d3bbd35638c612e7409c29674be8a99cd739ea134\" returns successfully" Jan 13 20:52:49.380204 containerd[1540]: time="2025-01-13T20:52:49.380156288Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" Jan 13 20:52:49.380267 containerd[1540]: time="2025-01-13T20:52:49.380250611Z" level=info msg="TearDown network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" successfully" Jan 13 20:52:49.380292 containerd[1540]: time="2025-01-13T20:52:49.380278169Z" level=info msg="StopPodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" returns successfully" Jan 13 20:52:49.380524 containerd[1540]: time="2025-01-13T20:52:49.380513826Z" level=info msg="RemovePodSandbox for \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" Jan 13 20:52:49.381370 containerd[1540]: time="2025-01-13T20:52:49.380574195Z" level=info msg="Forcibly stopping sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\"" Jan 13 20:52:49.381370 containerd[1540]: time="2025-01-13T20:52:49.380613215Z" level=info msg="TearDown network for sandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" successfully" Jan 13 20:52:49.471440 containerd[1540]: time="2025-01-13T20:52:49.471403963Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:49.471839 containerd[1540]: time="2025-01-13T20:52:49.471465592Z" level=info msg="RemovePodSandbox \"3a070447de7a7a80d81f0f179150c2011fd8d7dc7117c4a6d6f4f58b864fd7f5\" returns successfully" Jan 13 20:52:49.472079 containerd[1540]: time="2025-01-13T20:52:49.471933822Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\"" Jan 13 20:52:49.472079 containerd[1540]: time="2025-01-13T20:52:49.471993461Z" level=info msg="TearDown network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" successfully" Jan 13 20:52:49.472079 containerd[1540]: time="2025-01-13T20:52:49.472000299Z" level=info msg="StopPodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" returns successfully" Jan 13 20:52:49.472471 containerd[1540]: time="2025-01-13T20:52:49.472223796Z" level=info msg="RemovePodSandbox for \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\"" Jan 13 20:52:49.472471 containerd[1540]: time="2025-01-13T20:52:49.472236174Z" level=info msg="Forcibly stopping sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\"" Jan 13 20:52:49.472471 containerd[1540]: time="2025-01-13T20:52:49.472269609Z" level=info msg="TearDown network for sandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" successfully" Jan 13 20:52:49.531573 containerd[1540]: time="2025-01-13T20:52:49.531390170Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:49.531573 containerd[1540]: time="2025-01-13T20:52:49.531441518Z" level=info msg="RemovePodSandbox \"ca2c42b31f3ee496c18f341a78ff809a7c77eea507efc3a927de425f0e9f2212\" returns successfully" Jan 13 20:52:49.532441 containerd[1540]: time="2025-01-13T20:52:49.532192715Z" level=info msg="StopPodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\"" Jan 13 20:52:49.532441 containerd[1540]: time="2025-01-13T20:52:49.532344408Z" level=info msg="TearDown network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" successfully" Jan 13 20:52:49.532441 containerd[1540]: time="2025-01-13T20:52:49.532367319Z" level=info msg="StopPodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" returns successfully" Jan 13 20:52:49.539754 containerd[1540]: time="2025-01-13T20:52:49.532584825Z" level=info msg="RemovePodSandbox for \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\"" Jan 13 20:52:49.539754 containerd[1540]: time="2025-01-13T20:52:49.532600348Z" level=info msg="Forcibly stopping sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\"" Jan 13 20:52:49.539754 containerd[1540]: time="2025-01-13T20:52:49.532664386Z" level=info msg="TearDown network for sandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" successfully" Jan 13 20:52:49.550411 containerd[1540]: time="2025-01-13T20:52:49.550363288Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:49.550654 containerd[1540]: time="2025-01-13T20:52:49.550417089Z" level=info msg="RemovePodSandbox \"6f3020f4385844efb3fd0bb99c3635de7c807afa6a4ff39ce891fa4b447d4b22\" returns successfully" Jan 13 20:52:49.550713 containerd[1540]: time="2025-01-13T20:52:49.550696750Z" level=info msg="StopPodSandbox for \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\"" Jan 13 20:52:49.550785 containerd[1540]: time="2025-01-13T20:52:49.550767881Z" level=info msg="TearDown network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\" successfully" Jan 13 20:52:49.550827 containerd[1540]: time="2025-01-13T20:52:49.550780301Z" level=info msg="StopPodSandbox for \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\" returns successfully" Jan 13 20:52:49.551378 containerd[1540]: time="2025-01-13T20:52:49.551100715Z" level=info msg="RemovePodSandbox for \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\"" Jan 13 20:52:49.551378 containerd[1540]: time="2025-01-13T20:52:49.551118611Z" level=info msg="Forcibly stopping sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\"" Jan 13 20:52:49.551378 containerd[1540]: time="2025-01-13T20:52:49.551156793Z" level=info msg="TearDown network for sandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\" successfully" Jan 13 20:52:49.574457 containerd[1540]: time="2025-01-13T20:52:49.574424978Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:49.581397 containerd[1540]: time="2025-01-13T20:52:49.574487567Z" level=info msg="RemovePodSandbox \"a1e588deff13a8d7797663a20802217f651d1d13a5a40c73d24f3b794eec4d85\" returns successfully" Jan 13 20:52:49.581397 containerd[1540]: time="2025-01-13T20:52:49.574846755Z" level=info msg="StopPodSandbox for \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\"" Jan 13 20:52:49.581397 containerd[1540]: time="2025-01-13T20:52:49.574915018Z" level=info msg="TearDown network for sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\" successfully" Jan 13 20:52:49.581397 containerd[1540]: time="2025-01-13T20:52:49.574922986Z" level=info msg="StopPodSandbox for \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\" returns successfully" Jan 13 20:52:49.581397 containerd[1540]: time="2025-01-13T20:52:49.575131595Z" level=info msg="RemovePodSandbox for \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\"" Jan 13 20:52:49.581397 containerd[1540]: time="2025-01-13T20:52:49.575145074Z" level=info msg="Forcibly stopping sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\"" Jan 13 20:52:49.581397 containerd[1540]: time="2025-01-13T20:52:49.575180146Z" level=info msg="TearDown network for sandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\" successfully" Jan 13 20:52:49.651512 containerd[1540]: time="2025-01-13T20:52:49.651470118Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:49.651512 containerd[1540]: time="2025-01-13T20:52:49.651516184Z" level=info msg="RemovePodSandbox \"2296cea8b1af86a36527cf16cb11dbce2cff6a9d011b40f177a7300f3605a20d\" returns successfully" Jan 13 20:52:49.651835 containerd[1540]: time="2025-01-13T20:52:49.651802751Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:49.651913 containerd[1540]: time="2025-01-13T20:52:49.651895892Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:49.651913 containerd[1540]: time="2025-01-13T20:52:49.651908312Z" level=info msg="StopPodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:49.652927 containerd[1540]: time="2025-01-13T20:52:49.652130840Z" level=info msg="RemovePodSandbox for \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:49.652927 containerd[1540]: time="2025-01-13T20:52:49.652148980Z" level=info msg="Forcibly stopping sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\"" Jan 13 20:52:49.652927 containerd[1540]: time="2025-01-13T20:52:49.652191167Z" level=info msg="TearDown network for sandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" successfully" Jan 13 20:52:50.367549 containerd[1540]: time="2025-01-13T20:52:50.367507723Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.367650 containerd[1540]: time="2025-01-13T20:52:50.367567988Z" level=info msg="RemovePodSandbox \"261686cec8936ad30ceb941eb4c2face975e273cf08abb138e02eeaac6536658\" returns successfully" Jan 13 20:52:50.369970 containerd[1540]: time="2025-01-13T20:52:50.367886650Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:50.369970 containerd[1540]: time="2025-01-13T20:52:50.367942618Z" level=info msg="TearDown network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:50.369970 containerd[1540]: time="2025-01-13T20:52:50.367949014Z" level=info msg="StopPodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:50.369970 containerd[1540]: time="2025-01-13T20:52:50.368083618Z" level=info msg="RemovePodSandbox for \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:50.369970 containerd[1540]: time="2025-01-13T20:52:50.368094103Z" level=info msg="Forcibly stopping sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\"" Jan 13 20:52:50.369970 containerd[1540]: time="2025-01-13T20:52:50.368122212Z" level=info msg="TearDown network for sandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" successfully" Jan 13 20:52:50.422473 containerd[1540]: time="2025-01-13T20:52:50.422444959Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.422581 containerd[1540]: time="2025-01-13T20:52:50.422496529Z" level=info msg="RemovePodSandbox \"96b5d6c9e3e6def666a5ab0b78846c0f5f77d7852e9edf9f8c6d08c6716df64b\" returns successfully" Jan 13 20:52:50.424081 containerd[1540]: time="2025-01-13T20:52:50.424066163Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:50.424259 containerd[1540]: time="2025-01-13T20:52:50.424201453Z" level=info msg="TearDown network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" successfully" Jan 13 20:52:50.424259 containerd[1540]: time="2025-01-13T20:52:50.424211211Z" level=info msg="StopPodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" returns successfully" Jan 13 20:52:50.424630 containerd[1540]: time="2025-01-13T20:52:50.424610809Z" level=info msg="RemovePodSandbox for \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:50.424630 containerd[1540]: time="2025-01-13T20:52:50.424627387Z" level=info msg="Forcibly stopping sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\"" Jan 13 20:52:50.424693 containerd[1540]: time="2025-01-13T20:52:50.424660598Z" level=info msg="TearDown network for sandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" successfully" Jan 13 20:52:50.429392 containerd[1540]: time="2025-01-13T20:52:50.429285618Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.429392 containerd[1540]: time="2025-01-13T20:52:50.429317181Z" level=info msg="RemovePodSandbox \"c7f36469df78818733b4f115bdc8d63729a05bb7bf96b1ca527e94b38ddc6ee4\" returns successfully" Jan 13 20:52:50.429551 containerd[1540]: time="2025-01-13T20:52:50.429539768Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" Jan 13 20:52:50.429616 containerd[1540]: time="2025-01-13T20:52:50.429586718Z" level=info msg="TearDown network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" successfully" Jan 13 20:52:50.429616 containerd[1540]: time="2025-01-13T20:52:50.429594284Z" level=info msg="StopPodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" returns successfully" Jan 13 20:52:50.429889 containerd[1540]: time="2025-01-13T20:52:50.429874130Z" level=info msg="RemovePodSandbox for \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" Jan 13 20:52:50.429915 containerd[1540]: time="2025-01-13T20:52:50.429888257Z" level=info msg="Forcibly stopping sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\"" Jan 13 20:52:50.429943 containerd[1540]: time="2025-01-13T20:52:50.429937945Z" level=info msg="TearDown network for sandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" successfully" Jan 13 20:52:50.431246 containerd[1540]: time="2025-01-13T20:52:50.431212083Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.431246 containerd[1540]: time="2025-01-13T20:52:50.431234356Z" level=info msg="RemovePodSandbox \"218fb0838d4ca7280cdc4e59678c5526c7c18e15daded804f65ab07a423645f3\" returns successfully" Jan 13 20:52:50.431896 containerd[1540]: time="2025-01-13T20:52:50.431478772Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\"" Jan 13 20:52:50.431896 containerd[1540]: time="2025-01-13T20:52:50.431523355Z" level=info msg="TearDown network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" successfully" Jan 13 20:52:50.431896 containerd[1540]: time="2025-01-13T20:52:50.431530309Z" level=info msg="StopPodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" returns successfully" Jan 13 20:52:50.432034 containerd[1540]: time="2025-01-13T20:52:50.432024025Z" level=info msg="RemovePodSandbox for \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\"" Jan 13 20:52:50.432248 containerd[1540]: time="2025-01-13T20:52:50.432173866Z" level=info msg="Forcibly stopping sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\"" Jan 13 20:52:50.432248 containerd[1540]: time="2025-01-13T20:52:50.432211422Z" level=info msg="TearDown network for sandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" successfully" Jan 13 20:52:50.434306 containerd[1540]: time="2025-01-13T20:52:50.434294312Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.434399 containerd[1540]: time="2025-01-13T20:52:50.434358699Z" level=info msg="RemovePodSandbox \"2b3f037dc9d71de4bef8826b0a39956db0e461df59f591cc0b8c3c3852bb217a\" returns successfully" Jan 13 20:52:50.434594 containerd[1540]: time="2025-01-13T20:52:50.434582639Z" level=info msg="StopPodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\"" Jan 13 20:52:50.434687 containerd[1540]: time="2025-01-13T20:52:50.434623277Z" level=info msg="TearDown network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" successfully" Jan 13 20:52:50.434687 containerd[1540]: time="2025-01-13T20:52:50.434629414Z" level=info msg="StopPodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" returns successfully" Jan 13 20:52:50.435985 containerd[1540]: time="2025-01-13T20:52:50.435972705Z" level=info msg="RemovePodSandbox for \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\"" Jan 13 20:52:50.436013 containerd[1540]: time="2025-01-13T20:52:50.435985605Z" level=info msg="Forcibly stopping sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\"" Jan 13 20:52:50.436123 containerd[1540]: time="2025-01-13T20:52:50.436018803Z" level=info msg="TearDown network for sandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" successfully" Jan 13 20:52:50.438502 containerd[1540]: time="2025-01-13T20:52:50.438486042Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.438536 containerd[1540]: time="2025-01-13T20:52:50.438510580Z" level=info msg="RemovePodSandbox \"90f7b1d10fe2b2e2a32d1d82db26cbf6b60fd87f5a26687ba6e608c87d13261c\" returns successfully" Jan 13 20:52:50.438651 containerd[1540]: time="2025-01-13T20:52:50.438637697Z" level=info msg="StopPodSandbox for \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\"" Jan 13 20:52:50.438692 containerd[1540]: time="2025-01-13T20:52:50.438680169Z" level=info msg="TearDown network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\" successfully" Jan 13 20:52:50.438692 containerd[1540]: time="2025-01-13T20:52:50.438689433Z" level=info msg="StopPodSandbox for \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\" returns successfully" Jan 13 20:52:50.438909 containerd[1540]: time="2025-01-13T20:52:50.438880683Z" level=info msg="RemovePodSandbox for \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\"" Jan 13 20:52:50.438909 containerd[1540]: time="2025-01-13T20:52:50.438893680Z" level=info msg="Forcibly stopping sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\"" Jan 13 20:52:50.438991 containerd[1540]: time="2025-01-13T20:52:50.438925901Z" level=info msg="TearDown network for sandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\" successfully" Jan 13 20:52:50.442406 containerd[1540]: time="2025-01-13T20:52:50.442388349Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.442447 containerd[1540]: time="2025-01-13T20:52:50.442411890Z" level=info msg="RemovePodSandbox \"48efda64d96ae1ed33dee39368d055432abe0644484161d82236908018cefeac\" returns successfully" Jan 13 20:52:50.442843 containerd[1540]: time="2025-01-13T20:52:50.442793149Z" level=info msg="StopPodSandbox for \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\"" Jan 13 20:52:50.442843 containerd[1540]: time="2025-01-13T20:52:50.442834478Z" level=info msg="TearDown network for sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\" successfully" Jan 13 20:52:50.442843 containerd[1540]: time="2025-01-13T20:52:50.442840335Z" level=info msg="StopPodSandbox for \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\" returns successfully" Jan 13 20:52:50.443683 containerd[1540]: time="2025-01-13T20:52:50.443646401Z" level=info msg="RemovePodSandbox for \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\"" Jan 13 20:52:50.443683 containerd[1540]: time="2025-01-13T20:52:50.443661489Z" level=info msg="Forcibly stopping sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\"" Jan 13 20:52:50.443756 containerd[1540]: time="2025-01-13T20:52:50.443732386Z" level=info msg="TearDown network for sandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\" successfully" Jan 13 20:52:50.445237 containerd[1540]: time="2025-01-13T20:52:50.445080905Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.445237 containerd[1540]: time="2025-01-13T20:52:50.445101441Z" level=info msg="RemovePodSandbox \"5dfbbd22b564b5079ca3a6f28c1e066c61fa65ec826eed7243a24b0df4ea40d0\" returns successfully" Jan 13 20:52:50.445295 containerd[1540]: time="2025-01-13T20:52:50.445262406Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:50.445316 containerd[1540]: time="2025-01-13T20:52:50.445310791Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully" Jan 13 20:52:50.445335 containerd[1540]: time="2025-01-13T20:52:50.445316994Z" level=info msg="StopPodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully" Jan 13 20:52:50.445675 containerd[1540]: time="2025-01-13T20:52:50.445489569Z" level=info msg="RemovePodSandbox for \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:50.445822 containerd[1540]: time="2025-01-13T20:52:50.445737036Z" level=info msg="Forcibly stopping sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\"" Jan 13 20:52:50.456519 containerd[1540]: time="2025-01-13T20:52:50.456492804Z" level=info msg="TearDown network for sandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" successfully" Jan 13 20:52:50.458720 containerd[1540]: time="2025-01-13T20:52:50.458664634Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.458720 containerd[1540]: time="2025-01-13T20:52:50.458691383Z" level=info msg="RemovePodSandbox \"ae8c33fe0f9d5b60aedfb9baece18d4f3afdde39f1a1dc010171ed3095f347ec\" returns successfully" Jan 13 20:52:50.465522 containerd[1540]: time="2025-01-13T20:52:50.465403440Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" Jan 13 20:52:50.465522 containerd[1540]: time="2025-01-13T20:52:50.465475840Z" level=info msg="TearDown network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" successfully" Jan 13 20:52:50.465522 containerd[1540]: time="2025-01-13T20:52:50.465482912Z" level=info msg="StopPodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" returns successfully" Jan 13 20:52:50.466492 containerd[1540]: time="2025-01-13T20:52:50.465824826Z" level=info msg="RemovePodSandbox for \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" Jan 13 20:52:50.466492 containerd[1540]: time="2025-01-13T20:52:50.465840453Z" level=info msg="Forcibly stopping sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\"" Jan 13 20:52:50.466492 containerd[1540]: time="2025-01-13T20:52:50.465926598Z" level=info msg="TearDown network for sandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" successfully" Jan 13 20:52:50.468289 containerd[1540]: time="2025-01-13T20:52:50.468271743Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.468324 containerd[1540]: time="2025-01-13T20:52:50.468301652Z" level=info msg="RemovePodSandbox \"ab35a20d98198ba8d5480b68766865b2bb6004f3c6e39a9fe60450f05283f26d\" returns successfully" Jan 13 20:52:50.468509 containerd[1540]: time="2025-01-13T20:52:50.468495566Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\"" Jan 13 20:52:50.468556 containerd[1540]: time="2025-01-13T20:52:50.468539070Z" level=info msg="TearDown network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" successfully" Jan 13 20:52:50.468556 containerd[1540]: time="2025-01-13T20:52:50.468549364Z" level=info msg="StopPodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" returns successfully" Jan 13 20:52:50.468795 containerd[1540]: time="2025-01-13T20:52:50.468752523Z" level=info msg="RemovePodSandbox for \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\"" Jan 13 20:52:50.468940 containerd[1540]: time="2025-01-13T20:52:50.468765622Z" level=info msg="Forcibly stopping sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\"" Jan 13 20:52:50.469088 containerd[1540]: time="2025-01-13T20:52:50.469068115Z" level=info msg="TearDown network for sandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" successfully" Jan 13 20:52:50.471447 containerd[1540]: time="2025-01-13T20:52:50.471392768Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.471447 containerd[1540]: time="2025-01-13T20:52:50.471416967Z" level=info msg="RemovePodSandbox \"206427eb57db11c202aebbcf86f9f04ed4546c071ba2c5aec21d27deb5d4ba0b\" returns successfully" Jan 13 20:52:50.471640 containerd[1540]: time="2025-01-13T20:52:50.471576940Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\"" Jan 13 20:52:50.471640 containerd[1540]: time="2025-01-13T20:52:50.471617563Z" level=info msg="TearDown network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" successfully" Jan 13 20:52:50.471640 containerd[1540]: time="2025-01-13T20:52:50.471623559Z" level=info msg="StopPodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" returns successfully" Jan 13 20:52:50.471997 containerd[1540]: time="2025-01-13T20:52:50.471782588Z" level=info msg="RemovePodSandbox for \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\"" Jan 13 20:52:50.471997 containerd[1540]: time="2025-01-13T20:52:50.471827602Z" level=info msg="Forcibly stopping sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\"" Jan 13 20:52:50.471997 containerd[1540]: time="2025-01-13T20:52:50.471860323Z" level=info msg="TearDown network for sandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" successfully" Jan 13 20:52:50.474148 containerd[1540]: time="2025-01-13T20:52:50.474132257Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.474228 containerd[1540]: time="2025-01-13T20:52:50.474154591Z" level=info msg="RemovePodSandbox \"ad0afec2e3d8f7ba98adec9109a66d8f9c13799de64d9c160bb9c93f55703f5c\" returns successfully" Jan 13 20:52:50.474839 containerd[1540]: time="2025-01-13T20:52:50.474825823Z" level=info msg="StopPodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\"" Jan 13 20:52:50.474883 containerd[1540]: time="2025-01-13T20:52:50.474871195Z" level=info msg="TearDown network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" successfully" Jan 13 20:52:50.474883 containerd[1540]: time="2025-01-13T20:52:50.474881655Z" level=info msg="StopPodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" returns successfully" Jan 13 20:52:50.475291 containerd[1540]: time="2025-01-13T20:52:50.475270407Z" level=info msg="RemovePodSandbox for \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\"" Jan 13 20:52:50.475322 containerd[1540]: time="2025-01-13T20:52:50.475291335Z" level=info msg="Forcibly stopping sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\"" Jan 13 20:52:50.475450 containerd[1540]: time="2025-01-13T20:52:50.475342021Z" level=info msg="TearDown network for sandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" successfully" Jan 13 20:52:50.476911 containerd[1540]: time="2025-01-13T20:52:50.476891281Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.476947 containerd[1540]: time="2025-01-13T20:52:50.476916065Z" level=info msg="RemovePodSandbox \"86644f9a7bd02175e23f2068623b0345c24fddc1dac128620c7132e966deb3ff\" returns successfully" Jan 13 20:52:50.477117 containerd[1540]: time="2025-01-13T20:52:50.477049074Z" level=info msg="StopPodSandbox for \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\"" Jan 13 20:52:50.477163 containerd[1540]: time="2025-01-13T20:52:50.477125680Z" level=info msg="TearDown network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\" successfully" Jan 13 20:52:50.477163 containerd[1540]: time="2025-01-13T20:52:50.477153524Z" level=info msg="StopPodSandbox for \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\" returns successfully" Jan 13 20:52:50.477376 containerd[1540]: time="2025-01-13T20:52:50.477363117Z" level=info msg="RemovePodSandbox for \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\"" Jan 13 20:52:50.477407 containerd[1540]: time="2025-01-13T20:52:50.477376310Z" level=info msg="Forcibly stopping sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\"" Jan 13 20:52:50.477425 containerd[1540]: time="2025-01-13T20:52:50.477410374Z" level=info msg="TearDown network for sandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\" successfully" Jan 13 20:52:50.479400 containerd[1540]: time="2025-01-13T20:52:50.479338694Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.479400 containerd[1540]: time="2025-01-13T20:52:50.479393121Z" level=info msg="RemovePodSandbox \"1fea7c65b115684ec12818f427835cef6c531fcc4877064f751cf2623ba62b55\" returns successfully" Jan 13 20:52:50.479721 containerd[1540]: time="2025-01-13T20:52:50.479618833Z" level=info msg="StopPodSandbox for \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\"" Jan 13 20:52:50.479721 containerd[1540]: time="2025-01-13T20:52:50.479664739Z" level=info msg="TearDown network for sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\" successfully" Jan 13 20:52:50.479721 containerd[1540]: time="2025-01-13T20:52:50.479689322Z" level=info msg="StopPodSandbox for \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\" returns successfully" Jan 13 20:52:50.480090 containerd[1540]: time="2025-01-13T20:52:50.480076965Z" level=info msg="RemovePodSandbox for \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\"" Jan 13 20:52:50.480121 containerd[1540]: time="2025-01-13T20:52:50.480092919Z" level=info msg="Forcibly stopping sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\"" Jan 13 20:52:50.480217 containerd[1540]: time="2025-01-13T20:52:50.480194342Z" level=info msg="TearDown network for sandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\" successfully" Jan 13 20:52:50.483594 containerd[1540]: time="2025-01-13T20:52:50.483566739Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.483669 containerd[1540]: time="2025-01-13T20:52:50.483609115Z" level=info msg="RemovePodSandbox \"bdd5677d05af533f2a4e80d27d9e444ca15e5650c7e5ac9b03882adc912ef533\" returns successfully" Jan 13 20:52:50.488684 containerd[1540]: time="2025-01-13T20:52:50.488619392Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:50.493734 containerd[1540]: time="2025-01-13T20:52:50.488698399Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:50.493734 containerd[1540]: time="2025-01-13T20:52:50.488705793Z" level=info msg="StopPodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully" Jan 13 20:52:50.493734 containerd[1540]: time="2025-01-13T20:52:50.488955187Z" level=info msg="RemovePodSandbox for \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:50.493734 containerd[1540]: time="2025-01-13T20:52:50.488968138Z" level=info msg="Forcibly stopping sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\"" Jan 13 20:52:50.493734 containerd[1540]: time="2025-01-13T20:52:50.489006166Z" level=info msg="TearDown network for sandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" successfully" Jan 13 20:52:50.562898 containerd[1540]: time="2025-01-13T20:52:50.562851211Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.563022 containerd[1540]: time="2025-01-13T20:52:50.562904074Z" level=info msg="RemovePodSandbox \"bf6db56addc2fc75f73c88bb56a13472b2fd461522c0bca49cbb6272f98c2dab\" returns successfully" Jan 13 20:52:50.563193 containerd[1540]: time="2025-01-13T20:52:50.563180745Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:50.563344 containerd[1540]: time="2025-01-13T20:52:50.563286931Z" level=info msg="TearDown network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully" Jan 13 20:52:50.563344 containerd[1540]: time="2025-01-13T20:52:50.563295643Z" level=info msg="StopPodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully" Jan 13 20:52:50.563499 containerd[1540]: time="2025-01-13T20:52:50.563489938Z" level=info msg="RemovePodSandbox for \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:50.563669 containerd[1540]: time="2025-01-13T20:52:50.563531007Z" level=info msg="Forcibly stopping sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\"" Jan 13 20:52:50.563669 containerd[1540]: time="2025-01-13T20:52:50.563563169Z" level=info msg="TearDown network for sandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" successfully" Jan 13 20:52:50.571694 containerd[1540]: time="2025-01-13T20:52:50.571670003Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 13 20:52:50.572229 containerd[1540]: time="2025-01-13T20:52:50.572211466Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:50.572458 containerd[1540]: time="2025-01-13T20:52:50.572372212Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID 
\"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:52:50.572458 containerd[1540]: time="2025-01-13T20:52:50.572397759Z" level=info msg="RemovePodSandbox \"1b0c187a50127f049fe74e405bebdb71f072906dd63be7d4adaaf430f70bb509\" returns successfully" Jan 13 20:52:50.573064 containerd[1540]: time="2025-01-13T20:52:50.572925037Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:50.573064 containerd[1540]: time="2025-01-13T20:52:50.572984739Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" Jan 13 20:52:50.573064 containerd[1540]: time="2025-01-13T20:52:50.573032878Z" level=info msg="TearDown network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" successfully" Jan 13 20:52:50.573064 containerd[1540]: time="2025-01-13T20:52:50.573038974Z" level=info msg="StopPodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" returns successfully" Jan 13 20:52:50.573462 containerd[1540]: time="2025-01-13T20:52:50.573450624Z" level=info msg="RemovePodSandbox for \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" Jan 13 20:52:50.573517 containerd[1540]: time="2025-01-13T20:52:50.573509180Z" level=info msg="Forcibly stopping sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\"" Jan 13 20:52:50.573646 containerd[1540]: time="2025-01-13T20:52:50.573613915Z" level=info msg="TearDown network for sandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" successfully" Jan 13 20:52:50.577359 containerd[1540]: time="2025-01-13T20:52:50.576541501Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:52:50.577359 containerd[1540]: time="2025-01-13T20:52:50.576568744Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 3.464818753s" Jan 13 20:52:50.577359 containerd[1540]: time="2025-01-13T20:52:50.576583583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 13 20:52:50.577359 containerd[1540]: time="2025-01-13T20:52:50.576553793Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.577359 containerd[1540]: time="2025-01-13T20:52:50.576734186Z" level=info msg="RemovePodSandbox \"7792371d797d3a70d5e5492cb6bbfa5918429351b7d813d5b9b1f58a5feea7e5\" returns successfully" Jan 13 20:52:50.579039 containerd[1540]: time="2025-01-13T20:52:50.579021837Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\"" Jan 13 20:52:50.579096 containerd[1540]: time="2025-01-13T20:52:50.579069837Z" level=info msg="TearDown network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" successfully" Jan 13 20:52:50.579096 containerd[1540]: time="2025-01-13T20:52:50.579094797Z" level=info msg="StopPodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" returns successfully" Jan 13 20:52:50.581313 containerd[1540]: time="2025-01-13T20:52:50.581298751Z" level=info msg="CreateContainer within sandbox \"9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 13 20:52:50.581421 containerd[1540]: time="2025-01-13T20:52:50.581411106Z" level=info msg="RemovePodSandbox for \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\"" Jan 13 20:52:50.581481 containerd[1540]: time="2025-01-13T20:52:50.581473643Z" level=info msg="Forcibly stopping sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\"" Jan 13 20:52:50.581636 containerd[1540]: time="2025-01-13T20:52:50.581615231Z" level=info msg="TearDown network for sandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" successfully" Jan 13 20:52:50.601179 containerd[1540]: time="2025-01-13T20:52:50.601159454Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.601380 containerd[1540]: time="2025-01-13T20:52:50.601254320Z" level=info msg="RemovePodSandbox \"de2202e1e197f9b3f937490ff5d3dda42834839af332365c6aa2808955879a1e\" returns successfully" Jan 13 20:52:50.601610 containerd[1540]: time="2025-01-13T20:52:50.601551100Z" level=info msg="CreateContainer within sandbox \"9110807847167594f6d8091d6e23c8ada2834a6197069d60b298c64a87ce0976\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"40402b56a01cf8198479c2418dd7b1649757e440299a70a56fd5e9ce580e44a8\"" Jan 13 20:52:50.601906 containerd[1540]: time="2025-01-13T20:52:50.601896868Z" level=info msg="StartContainer for \"40402b56a01cf8198479c2418dd7b1649757e440299a70a56fd5e9ce580e44a8\"" Jan 13 20:52:50.602040 containerd[1540]: time="2025-01-13T20:52:50.602031741Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\"" Jan 13 20:52:50.602194 containerd[1540]: time="2025-01-13T20:52:50.602125907Z" level=info msg="TearDown network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" successfully" Jan 13 20:52:50.602194 containerd[1540]: time="2025-01-13T20:52:50.602136159Z" level=info msg="StopPodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" returns successfully" Jan 13 20:52:50.602496 containerd[1540]: time="2025-01-13T20:52:50.602399992Z" level=info msg="RemovePodSandbox for \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\"" Jan 13 20:52:50.602496 containerd[1540]: time="2025-01-13T20:52:50.602415431Z" level=info msg="Forcibly stopping sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\"" Jan 13 20:52:50.602496 containerd[1540]: time="2025-01-13T20:52:50.602454900Z" level=info msg="TearDown network for sandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" successfully" Jan 13 20:52:50.604019 containerd[1540]: time="2025-01-13T20:52:50.604001851Z" 
level=warning msg="Failed to get podSandbox status for container event for sandboxID \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:52:50.604051 containerd[1540]: time="2025-01-13T20:52:50.604029023Z" level=info msg="RemovePodSandbox \"56c6c8077c464c9902f87517146bdff1c5589d0fd653a25daa44f4e1e70f383e\" returns successfully" Jan 13 20:52:50.604301 containerd[1540]: time="2025-01-13T20:52:50.604182177Z" level=info msg="StopPodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\"" Jan 13 20:52:50.604301 containerd[1540]: time="2025-01-13T20:52:50.604223658Z" level=info msg="TearDown network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" successfully" Jan 13 20:52:50.604301 containerd[1540]: time="2025-01-13T20:52:50.604249005Z" level=info msg="StopPodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" returns successfully" Jan 13 20:52:50.604594 containerd[1540]: time="2025-01-13T20:52:50.604557765Z" level=info msg="RemovePodSandbox for \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\"" Jan 13 20:52:50.604946 containerd[1540]: time="2025-01-13T20:52:50.604933933Z" level=info msg="Forcibly stopping sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\"" Jan 13 20:52:50.605854 containerd[1540]: time="2025-01-13T20:52:50.604978268Z" level=info msg="TearDown network for sandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" successfully" Jan 13 20:52:50.606291 containerd[1540]: time="2025-01-13T20:52:50.606275505Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.606419 containerd[1540]: time="2025-01-13T20:52:50.606301817Z" level=info msg="RemovePodSandbox \"b5be33027b7c98dfadd4eb863abb660114da6f2ee378da03b9c0590830b85865\" returns successfully" Jan 13 20:52:50.606615 containerd[1540]: time="2025-01-13T20:52:50.606488218Z" level=info msg="StopPodSandbox for \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\"" Jan 13 20:52:50.606615 containerd[1540]: time="2025-01-13T20:52:50.606530737Z" level=info msg="TearDown network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\" successfully" Jan 13 20:52:50.606615 containerd[1540]: time="2025-01-13T20:52:50.606537025Z" level=info msg="StopPodSandbox for \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\" returns successfully" Jan 13 20:52:50.607394 containerd[1540]: time="2025-01-13T20:52:50.606740355Z" level=info msg="RemovePodSandbox for \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\"" Jan 13 20:52:50.607394 containerd[1540]: time="2025-01-13T20:52:50.606752834Z" level=info msg="Forcibly stopping sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\"" Jan 13 20:52:50.607394 containerd[1540]: time="2025-01-13T20:52:50.606784230Z" level=info msg="TearDown network for sandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\" successfully" Jan 13 20:52:50.610654 containerd[1540]: time="2025-01-13T20:52:50.610585432Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.610654 containerd[1540]: time="2025-01-13T20:52:50.610607905Z" level=info msg="RemovePodSandbox \"cedcc34386a6610063c402386f4527bf3c31ce6d90ff9c3a4c70196b6c08fcc7\" returns successfully" Jan 13 20:52:50.610925 containerd[1540]: time="2025-01-13T20:52:50.610909864Z" level=info msg="StopPodSandbox for \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\"" Jan 13 20:52:50.611045 containerd[1540]: time="2025-01-13T20:52:50.611001476Z" level=info msg="TearDown network for sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\" successfully" Jan 13 20:52:50.611045 containerd[1540]: time="2025-01-13T20:52:50.611009804Z" level=info msg="StopPodSandbox for \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\" returns successfully" Jan 13 20:52:50.611739 containerd[1540]: time="2025-01-13T20:52:50.611138148Z" level=info msg="RemovePodSandbox for \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\"" Jan 13 20:52:50.611739 containerd[1540]: time="2025-01-13T20:52:50.611150503Z" level=info msg="Forcibly stopping sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\"" Jan 13 20:52:50.611739 containerd[1540]: time="2025-01-13T20:52:50.611180219Z" level=info msg="TearDown network for sandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\" successfully" Jan 13 20:52:50.612826 containerd[1540]: time="2025-01-13T20:52:50.612814212Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.612915 containerd[1540]: time="2025-01-13T20:52:50.612904848Z" level=info msg="RemovePodSandbox \"8e5a3a31f2a2e55080d764d020c63306093e1a54ed4014d4ac715ede1f234213\" returns successfully" Jan 13 20:52:50.613118 containerd[1540]: time="2025-01-13T20:52:50.613107591Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:50.614146 containerd[1540]: time="2025-01-13T20:52:50.613254140Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:50.614146 containerd[1540]: time="2025-01-13T20:52:50.613263226Z" level=info msg="StopPodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:50.614146 containerd[1540]: time="2025-01-13T20:52:50.613406086Z" level=info msg="RemovePodSandbox for \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:50.614146 containerd[1540]: time="2025-01-13T20:52:50.613415682Z" level=info msg="Forcibly stopping sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\"" Jan 13 20:52:50.614146 containerd[1540]: time="2025-01-13T20:52:50.613486087Z" level=info msg="TearDown network for sandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" successfully" Jan 13 20:52:50.615146 containerd[1540]: time="2025-01-13T20:52:50.615134307Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.615222 containerd[1540]: time="2025-01-13T20:52:50.615212866Z" level=info msg="RemovePodSandbox \"4d7294f9319dfcd09453ef49a4a0565dac9e883a4e1f537683c66538fc7dfede\" returns successfully" Jan 13 20:52:50.615483 containerd[1540]: time="2025-01-13T20:52:50.615475156Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:50.615648 containerd[1540]: time="2025-01-13T20:52:50.615639126Z" level=info msg="TearDown network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully" Jan 13 20:52:50.615700 containerd[1540]: time="2025-01-13T20:52:50.615691397Z" level=info msg="StopPodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully" Jan 13 20:52:50.615955 containerd[1540]: time="2025-01-13T20:52:50.615915873Z" level=info msg="RemovePodSandbox for \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:50.616010 containerd[1540]: time="2025-01-13T20:52:50.616001727Z" level=info msg="Forcibly stopping sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\"" Jan 13 20:52:50.616091 containerd[1540]: time="2025-01-13T20:52:50.616072643Z" level=info msg="TearDown network for sandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" successfully" Jan 13 20:52:50.617726 containerd[1540]: time="2025-01-13T20:52:50.617689876Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.617793 containerd[1540]: time="2025-01-13T20:52:50.617784080Z" level=info msg="RemovePodSandbox \"3577310d235d88031897fb229bebeda0ba881d76252ab2b6d835b74fc7285588\" returns successfully" Jan 13 20:52:50.618652 containerd[1540]: time="2025-01-13T20:52:50.618641296Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" Jan 13 20:52:50.618770 containerd[1540]: time="2025-01-13T20:52:50.618761439Z" level=info msg="TearDown network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" successfully" Jan 13 20:52:50.618824 containerd[1540]: time="2025-01-13T20:52:50.618816210Z" level=info msg="StopPodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" returns successfully" Jan 13 20:52:50.619046 containerd[1540]: time="2025-01-13T20:52:50.619026642Z" level=info msg="RemovePodSandbox for \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" Jan 13 20:52:50.619046 containerd[1540]: time="2025-01-13T20:52:50.619041346Z" level=info msg="Forcibly stopping sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\"" Jan 13 20:52:50.619098 containerd[1540]: time="2025-01-13T20:52:50.619076615Z" level=info msg="TearDown network for sandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" successfully" Jan 13 20:52:50.623699 containerd[1540]: time="2025-01-13T20:52:50.623445120Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.623699 containerd[1540]: time="2025-01-13T20:52:50.623581287Z" level=info msg="RemovePodSandbox \"f3a01d925fcd38f782d3343876bc8ff717e3fcade57b07f5e7b7a394f1e5a1a3\" returns successfully" Jan 13 20:52:50.624280 containerd[1540]: time="2025-01-13T20:52:50.624232697Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\"" Jan 13 20:52:50.624644 containerd[1540]: time="2025-01-13T20:52:50.624546607Z" level=info msg="TearDown network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" successfully" Jan 13 20:52:50.625463 containerd[1540]: time="2025-01-13T20:52:50.624791696Z" level=info msg="StopPodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" returns successfully" Jan 13 20:52:50.625463 containerd[1540]: time="2025-01-13T20:52:50.624970340Z" level=info msg="RemovePodSandbox for \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\"" Jan 13 20:52:50.625463 containerd[1540]: time="2025-01-13T20:52:50.624981972Z" level=info msg="Forcibly stopping sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\"" Jan 13 20:52:50.625463 containerd[1540]: time="2025-01-13T20:52:50.625023359Z" level=info msg="TearDown network for sandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" successfully" Jan 13 20:52:50.627374 containerd[1540]: time="2025-01-13T20:52:50.627232866Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.627374 containerd[1540]: time="2025-01-13T20:52:50.627259250Z" level=info msg="RemovePodSandbox \"9e1f9461eca2e476a795277f8547d57edfe210b3dc588fff15cfb314e6d86a87\" returns successfully" Jan 13 20:52:50.627684 containerd[1540]: time="2025-01-13T20:52:50.627665315Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\"" Jan 13 20:52:50.627718 containerd[1540]: time="2025-01-13T20:52:50.627713120Z" level=info msg="TearDown network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" successfully" Jan 13 20:52:50.627973 containerd[1540]: time="2025-01-13T20:52:50.627720007Z" level=info msg="StopPodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" returns successfully" Jan 13 20:52:50.631088 containerd[1540]: time="2025-01-13T20:52:50.630589071Z" level=info msg="RemovePodSandbox for \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\"" Jan 13 20:52:50.631088 containerd[1540]: time="2025-01-13T20:52:50.630604273Z" level=info msg="Forcibly stopping sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\"" Jan 13 20:52:50.631088 containerd[1540]: time="2025-01-13T20:52:50.630646293Z" level=info msg="TearDown network for sandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" successfully" Jan 13 20:52:50.633408 containerd[1540]: time="2025-01-13T20:52:50.633387546Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.634400 containerd[1540]: time="2025-01-13T20:52:50.633414247Z" level=info msg="RemovePodSandbox \"adc40a8bd66231356d6987364fdb3e6172f808cccf9bc8385a877b5b05028797\" returns successfully" Jan 13 20:52:50.635173 containerd[1540]: time="2025-01-13T20:52:50.635099062Z" level=info msg="StopPodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\"" Jan 13 20:52:50.635173 containerd[1540]: time="2025-01-13T20:52:50.635142971Z" level=info msg="TearDown network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" successfully" Jan 13 20:52:50.635173 containerd[1540]: time="2025-01-13T20:52:50.635150095Z" level=info msg="StopPodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" returns successfully" Jan 13 20:52:50.635972 containerd[1540]: time="2025-01-13T20:52:50.635358225Z" level=info msg="RemovePodSandbox for \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\"" Jan 13 20:52:50.635972 containerd[1540]: time="2025-01-13T20:52:50.635370722Z" level=info msg="Forcibly stopping sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\"" Jan 13 20:52:50.635972 containerd[1540]: time="2025-01-13T20:52:50.635399822Z" level=info msg="TearDown network for sandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" successfully" Jan 13 20:52:50.635894 systemd[1]: run-containerd-runc-k8s.io-40402b56a01cf8198479c2418dd7b1649757e440299a70a56fd5e9ce580e44a8-runc.JWzCew.mount: Deactivated successfully. Jan 13 20:52:50.637122 containerd[1540]: time="2025-01-13T20:52:50.636497749Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.637122 containerd[1540]: time="2025-01-13T20:52:50.636517817Z" level=info msg="RemovePodSandbox \"c173447e47facf970b7db0e7ef3152088739e0096e82d67551f2fa75582ba02e\" returns successfully" Jan 13 20:52:50.637260 containerd[1540]: time="2025-01-13T20:52:50.637249288Z" level=info msg="StopPodSandbox for \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\"" Jan 13 20:52:50.637384 containerd[1540]: time="2025-01-13T20:52:50.637373612Z" level=info msg="TearDown network for sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\" successfully" Jan 13 20:52:50.637425 containerd[1540]: time="2025-01-13T20:52:50.637418024Z" level=info msg="StopPodSandbox for \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\" returns successfully" Jan 13 20:52:50.637602 containerd[1540]: time="2025-01-13T20:52:50.637593391Z" level=info msg="RemovePodSandbox for \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\"" Jan 13 20:52:50.637652 containerd[1540]: time="2025-01-13T20:52:50.637645292Z" level=info msg="Forcibly stopping sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\"" Jan 13 20:52:50.637736 containerd[1540]: time="2025-01-13T20:52:50.637717051Z" level=info msg="TearDown network for sandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\" successfully" Jan 13 20:52:50.639054 containerd[1540]: time="2025-01-13T20:52:50.639036089Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.639132 containerd[1540]: time="2025-01-13T20:52:50.639122378Z" level=info msg="RemovePodSandbox \"21703071012f2cdb800463eed5375f53e724a0d004a4f184c3d7797906b568fd\" returns successfully" Jan 13 20:52:50.639696 containerd[1540]: time="2025-01-13T20:52:50.639686158Z" level=info msg="StopPodSandbox for \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\"" Jan 13 20:52:50.639780 containerd[1540]: time="2025-01-13T20:52:50.639772308Z" level=info msg="TearDown network for sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\" successfully" Jan 13 20:52:50.639899 containerd[1540]: time="2025-01-13T20:52:50.639891163Z" level=info msg="StopPodSandbox for \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\" returns successfully" Jan 13 20:52:50.640455 containerd[1540]: time="2025-01-13T20:52:50.640445293Z" level=info msg="RemovePodSandbox for \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\"" Jan 13 20:52:50.640516 containerd[1540]: time="2025-01-13T20:52:50.640507799Z" level=info msg="Forcibly stopping sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\"" Jan 13 20:52:50.640640 containerd[1540]: time="2025-01-13T20:52:50.640621166Z" level=info msg="TearDown network for sandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\" successfully" Jan 13 20:52:50.642789 containerd[1540]: time="2025-01-13T20:52:50.642730451Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.642789 containerd[1540]: time="2025-01-13T20:52:50.642751745Z" level=info msg="RemovePodSandbox \"8873923c90cafa25c404b19b1bef43fd05eca977d5b64afd71d15cea3105ea63\" returns successfully" Jan 13 20:52:50.643078 containerd[1540]: time="2025-01-13T20:52:50.642886248Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:50.643078 containerd[1540]: time="2025-01-13T20:52:50.642988251Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully" Jan 13 20:52:50.643078 containerd[1540]: time="2025-01-13T20:52:50.642996552Z" level=info msg="StopPodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:50.643650 containerd[1540]: time="2025-01-13T20:52:50.643122525Z" level=info msg="RemovePodSandbox for \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:50.643650 containerd[1540]: time="2025-01-13T20:52:50.643132618Z" level=info msg="Forcibly stopping sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\"" Jan 13 20:52:50.643650 containerd[1540]: time="2025-01-13T20:52:50.643173651Z" level=info msg="TearDown network for sandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" successfully" Jan 13 20:52:50.644438 systemd[1]: Started cri-containerd-40402b56a01cf8198479c2418dd7b1649757e440299a70a56fd5e9ce580e44a8.scope - libcontainer container 40402b56a01cf8198479c2418dd7b1649757e440299a70a56fd5e9ce580e44a8. Jan 13 20:52:50.645778 containerd[1540]: time="2025-01-13T20:52:50.645765624Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.645850 containerd[1540]: time="2025-01-13T20:52:50.645840825Z" level=info msg="RemovePodSandbox \"aa12ac963a82302e6e468d6b9fb10fcb63b9d65c16a66339b71bf3297d0a8ab0\" returns successfully" Jan 13 20:52:50.646116 containerd[1540]: time="2025-01-13T20:52:50.646106748Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:50.646337 containerd[1540]: time="2025-01-13T20:52:50.646327514Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully" Jan 13 20:52:50.646410 containerd[1540]: time="2025-01-13T20:52:50.646400673Z" level=info msg="StopPodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully" Jan 13 20:52:50.647073 containerd[1540]: time="2025-01-13T20:52:50.647061995Z" level=info msg="RemovePodSandbox for \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:50.647522 containerd[1540]: time="2025-01-13T20:52:50.647131583Z" level=info msg="Forcibly stopping sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\"" Jan 13 20:52:50.647522 containerd[1540]: time="2025-01-13T20:52:50.647179283Z" level=info msg="TearDown network for sandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" successfully" Jan 13 20:52:50.649172 containerd[1540]: time="2025-01-13T20:52:50.649161010Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.650181 containerd[1540]: time="2025-01-13T20:52:50.649244375Z" level=info msg="RemovePodSandbox \"2ce2d5e1512c98ed065236d4e6a3b79dc2940efc3fc1808f009440e46aa4c8b6\" returns successfully" Jan 13 20:52:50.650181 containerd[1540]: time="2025-01-13T20:52:50.649415655Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" Jan 13 20:52:50.650181 containerd[1540]: time="2025-01-13T20:52:50.649468254Z" level=info msg="TearDown network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" successfully" Jan 13 20:52:50.650181 containerd[1540]: time="2025-01-13T20:52:50.649475529Z" level=info msg="StopPodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" returns successfully" Jan 13 20:52:50.650181 containerd[1540]: time="2025-01-13T20:52:50.649623240Z" level=info msg="RemovePodSandbox for \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" Jan 13 20:52:50.650181 containerd[1540]: time="2025-01-13T20:52:50.649632550Z" level=info msg="Forcibly stopping sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\"" Jan 13 20:52:50.650181 containerd[1540]: time="2025-01-13T20:52:50.649666801Z" level=info msg="TearDown network for sandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" successfully" Jan 13 20:52:50.651700 containerd[1540]: time="2025-01-13T20:52:50.651667898Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.651758 containerd[1540]: time="2025-01-13T20:52:50.651749408Z" level=info msg="RemovePodSandbox \"ab9e798312e90064639aa53a6e1a1f8ef9ee493d98e755ccf68d79f7746752c7\" returns successfully" Jan 13 20:52:50.652007 containerd[1540]: time="2025-01-13T20:52:50.651997225Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\"" Jan 13 20:52:50.652167 containerd[1540]: time="2025-01-13T20:52:50.652158525Z" level=info msg="TearDown network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" successfully" Jan 13 20:52:50.652229 containerd[1540]: time="2025-01-13T20:52:50.652221928Z" level=info msg="StopPodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" returns successfully" Jan 13 20:52:50.652446 containerd[1540]: time="2025-01-13T20:52:50.652437579Z" level=info msg="RemovePodSandbox for \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\"" Jan 13 20:52:50.652501 containerd[1540]: time="2025-01-13T20:52:50.652484750Z" level=info msg="Forcibly stopping sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\"" Jan 13 20:52:50.652625 containerd[1540]: time="2025-01-13T20:52:50.652605897Z" level=info msg="TearDown network for sandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" successfully" Jan 13 20:52:50.654272 containerd[1540]: time="2025-01-13T20:52:50.654197170Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.654272 containerd[1540]: time="2025-01-13T20:52:50.654221877Z" level=info msg="RemovePodSandbox \"63857cabd6ab0568d861dfbd551e4f31eeef23c8fa63f13cc0c496de8c78f70f\" returns successfully" Jan 13 20:52:50.654741 containerd[1540]: time="2025-01-13T20:52:50.654547579Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\"" Jan 13 20:52:50.654741 containerd[1540]: time="2025-01-13T20:52:50.654589806Z" level=info msg="TearDown network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" successfully" Jan 13 20:52:50.654741 containerd[1540]: time="2025-01-13T20:52:50.654595814Z" level=info msg="StopPodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" returns successfully" Jan 13 20:52:50.655015 containerd[1540]: time="2025-01-13T20:52:50.654921097Z" level=info msg="RemovePodSandbox for \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\"" Jan 13 20:52:50.655015 containerd[1540]: time="2025-01-13T20:52:50.654984594Z" level=info msg="Forcibly stopping sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\"" Jan 13 20:52:50.655165 containerd[1540]: time="2025-01-13T20:52:50.655088073Z" level=info msg="TearDown network for sandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" successfully" Jan 13 20:52:50.656674 containerd[1540]: time="2025-01-13T20:52:50.656574350Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.656733 containerd[1540]: time="2025-01-13T20:52:50.656723051Z" level=info msg="RemovePodSandbox \"0c737e8255e2e465dfd0851e09d90fbf4a3919d52175cb620b28b07e0035a461\" returns successfully" Jan 13 20:52:50.657020 containerd[1540]: time="2025-01-13T20:52:50.656937921Z" level=info msg="StopPodSandbox for \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\"" Jan 13 20:52:50.657020 containerd[1540]: time="2025-01-13T20:52:50.656986136Z" level=info msg="TearDown network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" successfully" Jan 13 20:52:50.657020 containerd[1540]: time="2025-01-13T20:52:50.656992399Z" level=info msg="StopPodSandbox for \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" returns successfully" Jan 13 20:52:50.657527 containerd[1540]: time="2025-01-13T20:52:50.657397126Z" level=info msg="RemovePodSandbox for \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\"" Jan 13 20:52:50.657596 containerd[1540]: time="2025-01-13T20:52:50.657571495Z" level=info msg="Forcibly stopping sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\"" Jan 13 20:52:50.657718 containerd[1540]: time="2025-01-13T20:52:50.657653773Z" level=info msg="TearDown network for sandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" successfully" Jan 13 20:52:50.659365 containerd[1540]: time="2025-01-13T20:52:50.659325011Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.659365 containerd[1540]: time="2025-01-13T20:52:50.659360455Z" level=info msg="RemovePodSandbox \"59a3501bf8aa112e386c4236fc36f54347d2b24bb85a0feaeb26260a6cf71383\" returns successfully" Jan 13 20:52:50.659647 containerd[1540]: time="2025-01-13T20:52:50.659570713Z" level=info msg="StopPodSandbox for \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\"" Jan 13 20:52:50.659647 containerd[1540]: time="2025-01-13T20:52:50.659612516Z" level=info msg="TearDown network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\" successfully" Jan 13 20:52:50.659647 containerd[1540]: time="2025-01-13T20:52:50.659618591Z" level=info msg="StopPodSandbox for \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\" returns successfully" Jan 13 20:52:50.660476 containerd[1540]: time="2025-01-13T20:52:50.659904394Z" level=info msg="RemovePodSandbox for \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\"" Jan 13 20:52:50.660476 containerd[1540]: time="2025-01-13T20:52:50.659917416Z" level=info msg="Forcibly stopping sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\"" Jan 13 20:52:50.660476 containerd[1540]: time="2025-01-13T20:52:50.659949389Z" level=info msg="TearDown network for sandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\" successfully" Jan 13 20:52:50.662122 containerd[1540]: time="2025-01-13T20:52:50.662105955Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.662170 containerd[1540]: time="2025-01-13T20:52:50.662130440Z" level=info msg="RemovePodSandbox \"8658d26e9e1d981fafe863f23b85e053a34a65a99ad98bec016e9e23b2b41f82\" returns successfully" Jan 13 20:52:50.662289 containerd[1540]: time="2025-01-13T20:52:50.662276073Z" level=info msg="StopPodSandbox for \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\"" Jan 13 20:52:50.662323 containerd[1540]: time="2025-01-13T20:52:50.662316792Z" level=info msg="TearDown network for sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\" successfully" Jan 13 20:52:50.662439 containerd[1540]: time="2025-01-13T20:52:50.662323217Z" level=info msg="StopPodSandbox for \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\" returns successfully" Jan 13 20:52:50.662526 containerd[1540]: time="2025-01-13T20:52:50.662479899Z" level=info msg="RemovePodSandbox for \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\"" Jan 13 20:52:50.662526 containerd[1540]: time="2025-01-13T20:52:50.662492370Z" level=info msg="Forcibly stopping sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\"" Jan 13 20:52:50.662580 containerd[1540]: time="2025-01-13T20:52:50.662522388Z" level=info msg="TearDown network for sandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\" successfully" Jan 13 20:52:50.664135 containerd[1540]: time="2025-01-13T20:52:50.664119064Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:52:50.664171 containerd[1540]: time="2025-01-13T20:52:50.664141644Z" level=info msg="RemovePodSandbox \"8c6e8e3f4f10ed0d557fcf5ccd4c236ddec7e9ea96002f42bd1cfdc423e1f41a\" returns successfully" Jan 13 20:52:50.669411 containerd[1540]: time="2025-01-13T20:52:50.669395554Z" level=info msg="StartContainer for \"40402b56a01cf8198479c2418dd7b1649757e440299a70a56fd5e9ce580e44a8\" returns successfully" Jan 13 20:52:51.148318 kubelet[2861]: I0113 20:52:51.148067 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-7qmqc" podStartSLOduration=29.569530852 podStartE2EDuration="43.148036548s" podCreationTimestamp="2025-01-13 20:52:08 +0000 UTC" firstStartedPulling="2025-01-13 20:52:36.999026092 +0000 UTC m=+49.105310822" lastFinishedPulling="2025-01-13 20:52:50.577531788 +0000 UTC m=+62.683816518" observedRunningTime="2025-01-13 20:52:51.139727171 +0000 UTC m=+63.246011910" watchObservedRunningTime="2025-01-13 20:52:51.148036548 +0000 UTC m=+63.254321283" Jan 13 20:52:51.593207 kubelet[2861]: I0113 20:52:51.593102 2861 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 13 20:52:51.600495 kubelet[2861]: I0113 20:52:51.600428 2861 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 13 20:53:04.384600 systemd[1]: Started sshd@7-139.178.70.104:22-147.75.109.163:43330.service - OpenSSH per-connection server daemon (147.75.109.163:43330). Jan 13 20:53:04.571743 sshd[6240]: Accepted publickey for core from 147.75.109.163 port 43330 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:04.573436 sshd-session[6240]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:04.578514 systemd-logind[1528]: New session 10 of user core. 
Jan 13 20:53:04.582450 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 13 20:53:05.291290 sshd[6242]: Connection closed by 147.75.109.163 port 43330 Jan 13 20:53:05.292014 sshd-session[6240]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:05.294071 systemd-logind[1528]: Session 10 logged out. Waiting for processes to exit. Jan 13 20:53:05.295288 systemd[1]: sshd@7-139.178.70.104:22-147.75.109.163:43330.service: Deactivated successfully. Jan 13 20:53:05.296639 systemd[1]: session-10.scope: Deactivated successfully. Jan 13 20:53:05.297865 systemd-logind[1528]: Removed session 10. Jan 13 20:53:09.205676 containerd[1540]: time="2025-01-13T20:53:09.205635385Z" level=info msg="StopContainer for \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\" with timeout 300 (s)" Jan 13 20:53:09.231581 containerd[1540]: time="2025-01-13T20:53:09.231535551Z" level=info msg="Stop container \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\" with signal terminated" Jan 13 20:53:09.434019 containerd[1540]: time="2025-01-13T20:53:09.433911425Z" level=info msg="StopContainer for \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\" with timeout 30 (s)" Jan 13 20:53:09.443146 containerd[1540]: time="2025-01-13T20:53:09.443089013Z" level=info msg="Stop container \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\" with signal terminated" Jan 13 20:53:09.449617 systemd[1]: cri-containerd-a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1.scope: Deactivated successfully. Jan 13 20:53:09.467865 containerd[1540]: time="2025-01-13T20:53:09.467463102Z" level=info msg="shim disconnected" id=a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1 namespace=k8s.io Jan 13 20:53:09.467779 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1-rootfs.mount: Deactivated successfully. 
Jan 13 20:53:09.479158 containerd[1540]: time="2025-01-13T20:53:09.479127207Z" level=warning msg="cleaning up after shim disconnected" id=a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1 namespace=k8s.io Jan 13 20:53:09.479158 containerd[1540]: time="2025-01-13T20:53:09.479153530Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:53:09.717557 containerd[1540]: time="2025-01-13T20:53:09.717512742Z" level=info msg="StopContainer for \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\" returns successfully" Jan 13 20:53:09.717959 containerd[1540]: time="2025-01-13T20:53:09.717902019Z" level=info msg="StopPodSandbox for \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\"" Jan 13 20:53:09.718024 containerd[1540]: time="2025-01-13T20:53:09.717932772Z" level=info msg="Container to stop \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:53:09.723271 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d-shm.mount: Deactivated successfully. Jan 13 20:53:09.725612 systemd[1]: cri-containerd-0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d.scope: Deactivated successfully. Jan 13 20:53:09.740789 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d-rootfs.mount: Deactivated successfully. 
Jan 13 20:53:09.741193 containerd[1540]: time="2025-01-13T20:53:09.741054490Z" level=info msg="shim disconnected" id=0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d namespace=k8s.io Jan 13 20:53:09.741193 containerd[1540]: time="2025-01-13T20:53:09.741086286Z" level=warning msg="cleaning up after shim disconnected" id=0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d namespace=k8s.io Jan 13 20:53:09.741193 containerd[1540]: time="2025-01-13T20:53:09.741091491Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:53:09.749815 containerd[1540]: time="2025-01-13T20:53:09.749780864Z" level=warning msg="cleanup warnings time=\"2025-01-13T20:53:09Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 13 20:53:10.011284 systemd-networkd[1457]: cali3bdc2b905e9: Link DOWN Jan 13 20:53:10.011608 systemd-networkd[1457]: cali3bdc2b905e9: Lost carrier Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.004 [INFO][6344] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.008 [INFO][6344] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" iface="eth0" netns="/var/run/netns/cni-a9b14467-3453-0b23-fd8c-9d491a63546d" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.009 [INFO][6344] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" iface="eth0" netns="/var/run/netns/cni-a9b14467-3453-0b23-fd8c-9d491a63546d" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.021 [INFO][6344] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" after=12.885935ms iface="eth0" netns="/var/run/netns/cni-a9b14467-3453-0b23-fd8c-9d491a63546d" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.021 [INFO][6344] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.021 [INFO][6344] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.045 [INFO][6355] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.046 [INFO][6355] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.046 [INFO][6355] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.072 [INFO][6355] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.072 [INFO][6355] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.073 [INFO][6355] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:53:10.077211 containerd[1540]: 2025-01-13 20:53:10.075 [INFO][6344] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:10.078918 systemd[1]: run-netns-cni\x2da9b14467\x2d3453\x2d0b23\x2dfd8c\x2d9d491a63546d.mount: Deactivated successfully. 
Jan 13 20:53:10.079553 containerd[1540]: time="2025-01-13T20:53:10.079101783Z" level=info msg="TearDown network for sandbox \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\" successfully" Jan 13 20:53:10.079553 containerd[1540]: time="2025-01-13T20:53:10.079120024Z" level=info msg="StopPodSandbox for \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\" returns successfully" Jan 13 20:53:10.129454 kubelet[2861]: I0113 20:53:10.129430 2861 scope.go:117] "RemoveContainer" containerID="a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1" Jan 13 20:53:10.130089 kubelet[2861]: I0113 20:53:10.129643 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b067aee8-97d0-47ca-9359-80c070636930-tigera-ca-bundle\") pod \"b067aee8-97d0-47ca-9359-80c070636930\" (UID: \"b067aee8-97d0-47ca-9359-80c070636930\") " Jan 13 20:53:10.130089 kubelet[2861]: I0113 20:53:10.129828 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mvln\" (UniqueName: \"kubernetes.io/projected/b067aee8-97d0-47ca-9359-80c070636930-kube-api-access-8mvln\") pod \"b067aee8-97d0-47ca-9359-80c070636930\" (UID: \"b067aee8-97d0-47ca-9359-80c070636930\") " Jan 13 20:53:10.147907 containerd[1540]: time="2025-01-13T20:53:10.147826801Z" level=info msg="RemoveContainer for \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\"" Jan 13 20:53:10.149707 containerd[1540]: time="2025-01-13T20:53:10.149466992Z" level=info msg="RemoveContainer for \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\" returns successfully" Jan 13 20:53:10.153890 kubelet[2861]: I0113 20:53:10.153877 2861 scope.go:117] "RemoveContainer" containerID="a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1" Jan 13 20:53:10.154082 containerd[1540]: time="2025-01-13T20:53:10.154064985Z" level=error msg="ContainerStatus for 
\"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\": not found" Jan 13 20:53:10.157555 systemd[1]: var-lib-kubelet-pods-b067aee8\x2d97d0\x2d47ca\x2d9359\x2d80c070636930-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. Jan 13 20:53:10.157718 systemd[1]: var-lib-kubelet-pods-b067aee8\x2d97d0\x2d47ca\x2d9359\x2d80c070636930-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8mvln.mount: Deactivated successfully. Jan 13 20:53:10.160782 kubelet[2861]: I0113 20:53:10.160131 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b067aee8-97d0-47ca-9359-80c070636930-kube-api-access-8mvln" (OuterVolumeSpecName: "kube-api-access-8mvln") pod "b067aee8-97d0-47ca-9359-80c070636930" (UID: "b067aee8-97d0-47ca-9359-80c070636930"). InnerVolumeSpecName "kube-api-access-8mvln". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 13 20:53:10.161883 kubelet[2861]: E0113 20:53:10.161405 2861 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\": not found" containerID="a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1" Jan 13 20:53:10.161943 kubelet[2861]: I0113 20:53:10.159688 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b067aee8-97d0-47ca-9359-80c070636930-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "b067aee8-97d0-47ca-9359-80c070636930" (UID: "b067aee8-97d0-47ca-9359-80c070636930"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 13 20:53:10.168086 kubelet[2861]: I0113 20:53:10.168070 2861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1"} err="failed to get container status \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\": rpc error: code = NotFound desc = an error occurred when try to find container \"a500a6851f2786e0d8ed07fcce838f493b01204cc80a7eb96fa13fbbbde771a1\": not found" Jan 13 20:53:10.230420 kubelet[2861]: I0113 20:53:10.230391 2861 reconciler_common.go:300] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b067aee8-97d0-47ca-9359-80c070636930-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jan 13 20:53:10.230420 kubelet[2861]: I0113 20:53:10.230417 2861 reconciler_common.go:300] "Volume detached for volume \"kube-api-access-8mvln\" (UniqueName: \"kubernetes.io/projected/b067aee8-97d0-47ca-9359-80c070636930-kube-api-access-8mvln\") on node \"localhost\" DevicePath \"\"" Jan 13 20:53:10.305183 systemd[1]: Started sshd@8-139.178.70.104:22-147.75.109.163:51016.service - OpenSSH per-connection server daemon (147.75.109.163:51016). Jan 13 20:53:10.396393 sshd[6366]: Accepted publickey for core from 147.75.109.163 port 51016 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:10.396945 sshd-session[6366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:10.401502 systemd-logind[1528]: New session 11 of user core. Jan 13 20:53:10.407536 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 13 20:53:10.421135 systemd[1]: Removed slice kubepods-besteffort-podb067aee8_97d0_47ca_9359_80c070636930.slice - libcontainer container kubepods-besteffort-podb067aee8_97d0_47ca_9359_80c070636930.slice. 
Jan 13 20:53:10.454796 kubelet[2861]: I0113 20:53:10.454769 2861 topology_manager.go:215] "Topology Admit Handler" podUID="d3d250cf-2323-4ef1-839b-3a21607a3ccc" podNamespace="calico-system" podName="calico-kube-controllers-6d7b96d599-hp6fx" Jan 13 20:53:10.465701 kubelet[2861]: E0113 20:53:10.465613 2861 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="b067aee8-97d0-47ca-9359-80c070636930" containerName="calico-kube-controllers" Jan 13 20:53:10.468270 kubelet[2861]: I0113 20:53:10.468191 2861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b067aee8-97d0-47ca-9359-80c070636930" containerName="calico-kube-controllers" Jan 13 20:53:10.486308 systemd[1]: Created slice kubepods-besteffort-podd3d250cf_2323_4ef1_839b_3a21607a3ccc.slice - libcontainer container kubepods-besteffort-podd3d250cf_2323_4ef1_839b_3a21607a3ccc.slice. Jan 13 20:53:10.534232 kubelet[2861]: I0113 20:53:10.534083 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3d250cf-2323-4ef1-839b-3a21607a3ccc-tigera-ca-bundle\") pod \"calico-kube-controllers-6d7b96d599-hp6fx\" (UID: \"d3d250cf-2323-4ef1-839b-3a21607a3ccc\") " pod="calico-system/calico-kube-controllers-6d7b96d599-hp6fx" Jan 13 20:53:10.534232 kubelet[2861]: I0113 20:53:10.534117 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rbc\" (UniqueName: \"kubernetes.io/projected/d3d250cf-2323-4ef1-839b-3a21607a3ccc-kube-api-access-z7rbc\") pod \"calico-kube-controllers-6d7b96d599-hp6fx\" (UID: \"d3d250cf-2323-4ef1-839b-3a21607a3ccc\") " pod="calico-system/calico-kube-controllers-6d7b96d599-hp6fx" Jan 13 20:53:10.597221 sshd[6375]: Connection closed by 147.75.109.163 port 51016 Jan 13 20:53:10.597619 sshd-session[6366]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:10.599623 systemd[1]: 
sshd@8-139.178.70.104:22-147.75.109.163:51016.service: Deactivated successfully. Jan 13 20:53:10.600807 systemd[1]: session-11.scope: Deactivated successfully. Jan 13 20:53:10.601674 systemd-logind[1528]: Session 11 logged out. Waiting for processes to exit. Jan 13 20:53:10.602246 systemd-logind[1528]: Removed session 11. Jan 13 20:53:10.790929 containerd[1540]: time="2025-01-13T20:53:10.790725256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d7b96d599-hp6fx,Uid:d3d250cf-2323-4ef1-839b-3a21607a3ccc,Namespace:calico-system,Attempt:0,}" Jan 13 20:53:10.912038 systemd-networkd[1457]: cali9620b1737bd: Link UP Jan 13 20:53:10.912830 systemd-networkd[1457]: cali9620b1737bd: Gained carrier Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.844 [INFO][6391] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0 calico-kube-controllers-6d7b96d599- calico-system d3d250cf-2323-4ef1-839b-3a21607a3ccc 1152 0 2025-01-13 20:53:10 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6d7b96d599 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6d7b96d599-hp6fx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9620b1737bd [] []}} ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Namespace="calico-system" Pod="calico-kube-controllers-6d7b96d599-hp6fx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.844 [INFO][6391] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" 
Namespace="calico-system" Pod="calico-kube-controllers-6d7b96d599-hp6fx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.870 [INFO][6399] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" HandleID="k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Workload="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.875 [INFO][6399] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" HandleID="k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Workload="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000334c80), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6d7b96d599-hp6fx", "timestamp":"2025-01-13 20:53:10.870036057 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.875 [INFO][6399] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.875 [INFO][6399] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.875 [INFO][6399] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.876 [INFO][6399] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.881 [INFO][6399] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.885 [INFO][6399] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.887 [INFO][6399] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.890 [INFO][6399] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.890 [INFO][6399] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.894 [INFO][6399] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03 Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.900 [INFO][6399] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.905 [INFO][6399] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.905 [INFO][6399] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" host="localhost" Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.905 [INFO][6399] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:53:10.930135 containerd[1540]: 2025-01-13 20:53:10.906 [INFO][6399] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" HandleID="k8s-pod-network.bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Workload="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" Jan 13 20:53:10.933888 containerd[1540]: 2025-01-13 20:53:10.908 [INFO][6391] cni-plugin/k8s.go 386: Populated endpoint ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Namespace="calico-system" Pod="calico-kube-controllers-6d7b96d599-hp6fx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0", GenerateName:"calico-kube-controllers-6d7b96d599-", Namespace:"calico-system", SelfLink:"", UID:"d3d250cf-2323-4ef1-839b-3a21607a3ccc", ResourceVersion:"1152", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 53, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d7b96d599", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6d7b96d599-hp6fx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9620b1737bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:53:10.933888 containerd[1540]: 2025-01-13 20:53:10.908 [INFO][6391] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.135/32] ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Namespace="calico-system" Pod="calico-kube-controllers-6d7b96d599-hp6fx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" Jan 13 20:53:10.933888 containerd[1540]: 2025-01-13 20:53:10.908 [INFO][6391] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9620b1737bd ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Namespace="calico-system" Pod="calico-kube-controllers-6d7b96d599-hp6fx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" Jan 13 20:53:10.933888 containerd[1540]: 2025-01-13 20:53:10.912 [INFO][6391] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Namespace="calico-system" Pod="calico-kube-controllers-6d7b96d599-hp6fx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" Jan 13 20:53:10.933888 containerd[1540]: 2025-01-13 20:53:10.913 [INFO][6391] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Namespace="calico-system" Pod="calico-kube-controllers-6d7b96d599-hp6fx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0", GenerateName:"calico-kube-controllers-6d7b96d599-", Namespace:"calico-system", SelfLink:"", UID:"d3d250cf-2323-4ef1-839b-3a21607a3ccc", ResourceVersion:"1152", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 53, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d7b96d599", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03", Pod:"calico-kube-controllers-6d7b96d599-hp6fx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9620b1737bd", MAC:"26:28:07:fd:c6:88", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:53:10.933888 containerd[1540]: 2025-01-13 20:53:10.925 [INFO][6391] cni-plugin/k8s.go 500: Wrote 
updated endpoint to datastore ContainerID="bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03" Namespace="calico-system" Pod="calico-kube-controllers-6d7b96d599-hp6fx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6d7b96d599--hp6fx-eth0" Jan 13 20:53:10.959895 containerd[1540]: time="2025-01-13T20:53:10.959721868Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:53:10.959895 containerd[1540]: time="2025-01-13T20:53:10.959776636Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:53:10.959895 containerd[1540]: time="2025-01-13T20:53:10.959787081Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:53:10.959895 containerd[1540]: time="2025-01-13T20:53:10.959839556Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:53:10.978483 systemd[1]: Started cri-containerd-bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03.scope - libcontainer container bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03. 
Jan 13 20:53:10.987039 systemd-resolved[1409]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 20:53:11.007869 containerd[1540]: time="2025-01-13T20:53:11.007840397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d7b96d599-hp6fx,Uid:d3d250cf-2323-4ef1-839b-3a21607a3ccc,Namespace:calico-system,Attempt:0,} returns sandbox id \"bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03\"" Jan 13 20:53:11.021366 containerd[1540]: time="2025-01-13T20:53:11.021326783Z" level=info msg="CreateContainer within sandbox \"bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 13 20:53:11.033079 containerd[1540]: time="2025-01-13T20:53:11.033045228Z" level=info msg="CreateContainer within sandbox \"bc21dc4ac9ca651e621ad947358f9d54285698aa9fcfa8629f3fba8073a01c03\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"43ee1a04e13a2aa19c75ae7d04318936e1ab29f26c795cb951fdda269b294b86\"" Jan 13 20:53:11.033449 containerd[1540]: time="2025-01-13T20:53:11.033363513Z" level=info msg="StartContainer for \"43ee1a04e13a2aa19c75ae7d04318936e1ab29f26c795cb951fdda269b294b86\"" Jan 13 20:53:11.054445 systemd[1]: Started cri-containerd-43ee1a04e13a2aa19c75ae7d04318936e1ab29f26c795cb951fdda269b294b86.scope - libcontainer container 43ee1a04e13a2aa19c75ae7d04318936e1ab29f26c795cb951fdda269b294b86. 
Jan 13 20:53:11.110709 containerd[1540]: time="2025-01-13T20:53:11.110680875Z" level=info msg="StartContainer for \"43ee1a04e13a2aa19c75ae7d04318936e1ab29f26c795cb951fdda269b294b86\" returns successfully" Jan 13 20:53:12.056142 kubelet[2861]: I0113 20:53:12.056108 2861 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="b067aee8-97d0-47ca-9359-80c070636930" path="/var/lib/kubelet/pods/b067aee8-97d0-47ca-9359-80c070636930/volumes" Jan 13 20:53:12.133789 systemd[1]: run-containerd-runc-k8s.io-43ee1a04e13a2aa19c75ae7d04318936e1ab29f26c795cb951fdda269b294b86-runc.kym1Fi.mount: Deactivated successfully. Jan 13 20:53:12.170483 kubelet[2861]: I0113 20:53:12.170317 2861 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6d7b96d599-hp6fx" podStartSLOduration=2.170287923 podStartE2EDuration="2.170287923s" podCreationTimestamp="2025-01-13 20:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:53:11.137069514 +0000 UTC m=+83.243354254" watchObservedRunningTime="2025-01-13 20:53:12.170287923 +0000 UTC m=+84.276572662" Jan 13 20:53:12.501489 systemd-networkd[1457]: cali9620b1737bd: Gained IPv6LL Jan 13 20:53:13.543831 systemd[1]: cri-containerd-0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6.scope: Deactivated successfully. Jan 13 20:53:13.559163 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6-rootfs.mount: Deactivated successfully. 
Jan 13 20:53:13.561891 containerd[1540]: time="2025-01-13T20:53:13.559720480Z" level=info msg="shim disconnected" id=0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6 namespace=k8s.io Jan 13 20:53:13.561891 containerd[1540]: time="2025-01-13T20:53:13.559799622Z" level=warning msg="cleaning up after shim disconnected" id=0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6 namespace=k8s.io Jan 13 20:53:13.561891 containerd[1540]: time="2025-01-13T20:53:13.559820969Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:53:13.585577 containerd[1540]: time="2025-01-13T20:53:13.585541365Z" level=info msg="StopContainer for \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\" returns successfully" Jan 13 20:53:13.588190 containerd[1540]: time="2025-01-13T20:53:13.585870209Z" level=info msg="StopPodSandbox for \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\"" Jan 13 20:53:13.588190 containerd[1540]: time="2025-01-13T20:53:13.585897938Z" level=info msg="Container to stop \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:53:13.587684 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196-shm.mount: Deactivated successfully. Jan 13 20:53:13.595590 systemd[1]: cri-containerd-520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196.scope: Deactivated successfully. Jan 13 20:53:13.611530 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196-rootfs.mount: Deactivated successfully. 
Jan 13 20:53:13.660474 containerd[1540]: time="2025-01-13T20:53:13.612846232Z" level=info msg="shim disconnected" id=520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196 namespace=k8s.io Jan 13 20:53:13.660474 containerd[1540]: time="2025-01-13T20:53:13.612889897Z" level=warning msg="cleaning up after shim disconnected" id=520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196 namespace=k8s.io Jan 13 20:53:13.660474 containerd[1540]: time="2025-01-13T20:53:13.612895640Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:53:13.678069 containerd[1540]: time="2025-01-13T20:53:13.678004558Z" level=info msg="TearDown network for sandbox \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\" successfully" Jan 13 20:53:13.678069 containerd[1540]: time="2025-01-13T20:53:13.678026282Z" level=info msg="StopPodSandbox for \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\" returns successfully" Jan 13 20:53:13.750183 kubelet[2861]: I0113 20:53:13.750009 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeba9dfe-23e2-4693-9190-c72e73d772ae-tigera-ca-bundle\") pod \"aeba9dfe-23e2-4693-9190-c72e73d772ae\" (UID: \"aeba9dfe-23e2-4693-9190-c72e73d772ae\") " Jan 13 20:53:13.750183 kubelet[2861]: I0113 20:53:13.750054 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/aeba9dfe-23e2-4693-9190-c72e73d772ae-typha-certs\") pod \"aeba9dfe-23e2-4693-9190-c72e73d772ae\" (UID: \"aeba9dfe-23e2-4693-9190-c72e73d772ae\") " Jan 13 20:53:13.750183 kubelet[2861]: I0113 20:53:13.750072 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzp46\" (UniqueName: \"kubernetes.io/projected/aeba9dfe-23e2-4693-9190-c72e73d772ae-kube-api-access-wzp46\") pod \"aeba9dfe-23e2-4693-9190-c72e73d772ae\" (UID: 
\"aeba9dfe-23e2-4693-9190-c72e73d772ae\") " Jan 13 20:53:13.766715 systemd[1]: var-lib-kubelet-pods-aeba9dfe\x2d23e2\x2d4693\x2d9190\x2dc72e73d772ae-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Jan 13 20:53:13.776760 systemd[1]: var-lib-kubelet-pods-aeba9dfe\x2d23e2\x2d4693\x2d9190\x2dc72e73d772ae-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dwzp46.mount: Deactivated successfully. Jan 13 20:53:13.786645 kubelet[2861]: I0113 20:53:13.786606 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeba9dfe-23e2-4693-9190-c72e73d772ae-kube-api-access-wzp46" (OuterVolumeSpecName: "kube-api-access-wzp46") pod "aeba9dfe-23e2-4693-9190-c72e73d772ae" (UID: "aeba9dfe-23e2-4693-9190-c72e73d772ae"). InnerVolumeSpecName "kube-api-access-wzp46". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 13 20:53:13.786727 kubelet[2861]: I0113 20:53:13.786675 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba9dfe-23e2-4693-9190-c72e73d772ae-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "aeba9dfe-23e2-4693-9190-c72e73d772ae" (UID: "aeba9dfe-23e2-4693-9190-c72e73d772ae"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 13 20:53:13.792911 kubelet[2861]: I0113 20:53:13.792868 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba9dfe-23e2-4693-9190-c72e73d772ae-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "aeba9dfe-23e2-4693-9190-c72e73d772ae" (UID: "aeba9dfe-23e2-4693-9190-c72e73d772ae"). InnerVolumeSpecName "typha-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 13 20:53:13.850477 kubelet[2861]: I0113 20:53:13.850374 2861 reconciler_common.go:300] "Volume detached for volume \"kube-api-access-wzp46\" (UniqueName: \"kubernetes.io/projected/aeba9dfe-23e2-4693-9190-c72e73d772ae-kube-api-access-wzp46\") on node \"localhost\" DevicePath \"\"" Jan 13 20:53:13.850477 kubelet[2861]: I0113 20:53:13.850402 2861 reconciler_common.go:300] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeba9dfe-23e2-4693-9190-c72e73d772ae-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jan 13 20:53:13.850477 kubelet[2861]: I0113 20:53:13.850410 2861 reconciler_common.go:300] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/aeba9dfe-23e2-4693-9190-c72e73d772ae-typha-certs\") on node \"localhost\" DevicePath \"\"" Jan 13 20:53:14.036286 systemd[1]: Removed slice kubepods-besteffort-podaeba9dfe_23e2_4693_9190_c72e73d772ae.slice - libcontainer container kubepods-besteffort-podaeba9dfe_23e2_4693_9190_c72e73d772ae.slice. 
Jan 13 20:53:14.122680 kubelet[2861]: I0113 20:53:14.122568 2861 scope.go:117] "RemoveContainer" containerID="0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6" Jan 13 20:53:14.160822 containerd[1540]: time="2025-01-13T20:53:14.160530672Z" level=info msg="RemoveContainer for \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\"" Jan 13 20:53:14.173288 containerd[1540]: time="2025-01-13T20:53:14.173230363Z" level=info msg="RemoveContainer for \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\" returns successfully" Jan 13 20:53:14.173547 kubelet[2861]: I0113 20:53:14.173384 2861 scope.go:117] "RemoveContainer" containerID="0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6" Jan 13 20:53:14.173697 containerd[1540]: time="2025-01-13T20:53:14.173673025Z" level=error msg="ContainerStatus for \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\": not found" Jan 13 20:53:14.173848 kubelet[2861]: E0113 20:53:14.173832 2861 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\": not found" containerID="0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6" Jan 13 20:53:14.173898 kubelet[2861]: I0113 20:53:14.173858 2861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6"} err="failed to get container status \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\": rpc error: code = NotFound desc = an error occurred when try to find container \"0b36068e7fc9471084fe45db7d747fa5a39047d936936adce952f54e6ee30ce6\": not found" Jan 13 
20:53:14.559790 systemd[1]: var-lib-kubelet-pods-aeba9dfe\x2d23e2\x2d4693\x2d9190\x2dc72e73d772ae-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. Jan 13 20:53:15.606141 systemd[1]: Started sshd@9-139.178.70.104:22-147.75.109.163:51028.service - OpenSSH per-connection server daemon (147.75.109.163:51028). Jan 13 20:53:16.008952 kubelet[2861]: I0113 20:53:16.008930 2861 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="aeba9dfe-23e2-4693-9190-c72e73d772ae" path="/var/lib/kubelet/pods/aeba9dfe-23e2-4693-9190-c72e73d772ae/volumes" Jan 13 20:53:16.012012 sshd[6699]: Accepted publickey for core from 147.75.109.163 port 51028 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:16.013202 sshd-session[6699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:16.016316 systemd-logind[1528]: New session 12 of user core. Jan 13 20:53:16.021481 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 13 20:53:16.571134 sshd[6720]: Connection closed by 147.75.109.163 port 51028 Jan 13 20:53:16.571648 sshd-session[6699]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:16.573629 systemd[1]: sshd@9-139.178.70.104:22-147.75.109.163:51028.service: Deactivated successfully. Jan 13 20:53:16.574731 systemd[1]: session-12.scope: Deactivated successfully. Jan 13 20:53:16.575215 systemd-logind[1528]: Session 12 logged out. Waiting for processes to exit. Jan 13 20:53:16.576105 systemd-logind[1528]: Removed session 12. Jan 13 20:53:21.583218 systemd[1]: Started sshd@10-139.178.70.104:22-147.75.109.163:48090.service - OpenSSH per-connection server daemon (147.75.109.163:48090). 
Jan 13 20:53:21.621464 sshd[6824]: Accepted publickey for core from 147.75.109.163 port 48090 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:21.622235 sshd-session[6824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:21.625029 systemd-logind[1528]: New session 13 of user core. Jan 13 20:53:21.630438 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 13 20:53:21.725475 sshd[6826]: Connection closed by 147.75.109.163 port 48090 Jan 13 20:53:21.726782 sshd-session[6824]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:21.733903 systemd[1]: sshd@10-139.178.70.104:22-147.75.109.163:48090.service: Deactivated successfully. Jan 13 20:53:21.734879 systemd[1]: session-13.scope: Deactivated successfully. Jan 13 20:53:21.735737 systemd-logind[1528]: Session 13 logged out. Waiting for processes to exit. Jan 13 20:53:21.739529 systemd[1]: Started sshd@11-139.178.70.104:22-147.75.109.163:48102.service - OpenSSH per-connection server daemon (147.75.109.163:48102). Jan 13 20:53:21.740552 systemd-logind[1528]: Removed session 13. Jan 13 20:53:21.775931 sshd[6851]: Accepted publickey for core from 147.75.109.163 port 48102 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:21.776810 sshd-session[6851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:21.781479 systemd-logind[1528]: New session 14 of user core. Jan 13 20:53:21.782545 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 13 20:53:21.956379 sshd[6853]: Connection closed by 147.75.109.163 port 48102 Jan 13 20:53:21.957086 sshd-session[6851]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:21.962413 systemd[1]: sshd@11-139.178.70.104:22-147.75.109.163:48102.service: Deactivated successfully. Jan 13 20:53:21.965334 systemd[1]: session-14.scope: Deactivated successfully. 
Jan 13 20:53:21.967880 systemd-logind[1528]: Session 14 logged out. Waiting for processes to exit. Jan 13 20:53:21.974735 systemd[1]: Started sshd@12-139.178.70.104:22-147.75.109.163:48112.service - OpenSSH per-connection server daemon (147.75.109.163:48112). Jan 13 20:53:21.976786 systemd-logind[1528]: Removed session 14. Jan 13 20:53:22.016552 sshd[6862]: Accepted publickey for core from 147.75.109.163 port 48112 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:22.017414 sshd-session[6862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:22.019954 systemd-logind[1528]: New session 15 of user core. Jan 13 20:53:22.025456 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 13 20:53:22.146608 sshd[6864]: Connection closed by 147.75.109.163 port 48112 Jan 13 20:53:22.146823 sshd-session[6862]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:22.149023 systemd[1]: sshd@12-139.178.70.104:22-147.75.109.163:48112.service: Deactivated successfully. Jan 13 20:53:22.149390 systemd-logind[1528]: Session 15 logged out. Waiting for processes to exit. Jan 13 20:53:22.150393 systemd[1]: session-15.scope: Deactivated successfully. Jan 13 20:53:22.151649 systemd-logind[1528]: Removed session 15. Jan 13 20:53:27.155918 systemd[1]: Started sshd@13-139.178.70.104:22-147.75.109.163:48124.service - OpenSSH per-connection server daemon (147.75.109.163:48124). Jan 13 20:53:27.477769 sshd[6962]: Accepted publickey for core from 147.75.109.163 port 48124 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:27.478699 sshd-session[6962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:27.485260 systemd-logind[1528]: New session 16 of user core. Jan 13 20:53:27.489460 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 13 20:53:28.624703 sshd[6978]: Connection closed by 147.75.109.163 port 48124 Jan 13 20:53:28.625063 sshd-session[6962]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:28.626799 systemd[1]: sshd@13-139.178.70.104:22-147.75.109.163:48124.service: Deactivated successfully. Jan 13 20:53:28.627887 systemd[1]: session-16.scope: Deactivated successfully. Jan 13 20:53:28.628729 systemd-logind[1528]: Session 16 logged out. Waiting for processes to exit. Jan 13 20:53:28.629549 systemd-logind[1528]: Removed session 16. Jan 13 20:53:33.634910 systemd[1]: Started sshd@14-139.178.70.104:22-147.75.109.163:53070.service - OpenSSH per-connection server daemon (147.75.109.163:53070). Jan 13 20:53:33.739719 sshd[7112]: Accepted publickey for core from 147.75.109.163 port 53070 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:33.740623 sshd-session[7112]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:33.743338 systemd-logind[1528]: New session 17 of user core. Jan 13 20:53:33.751455 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 13 20:53:33.867635 sshd[7114]: Connection closed by 147.75.109.163 port 53070 Jan 13 20:53:33.868135 sshd-session[7112]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:33.869963 systemd[1]: sshd@14-139.178.70.104:22-147.75.109.163:53070.service: Deactivated successfully. Jan 13 20:53:33.871116 systemd[1]: session-17.scope: Deactivated successfully. Jan 13 20:53:33.871978 systemd-logind[1528]: Session 17 logged out. Waiting for processes to exit. Jan 13 20:53:33.872748 systemd-logind[1528]: Removed session 17. Jan 13 20:53:38.876660 systemd[1]: Started sshd@15-139.178.70.104:22-147.75.109.163:53350.service - OpenSSH per-connection server daemon (147.75.109.163:53350). 
Jan 13 20:53:38.912788 sshd[7233]: Accepted publickey for core from 147.75.109.163 port 53350 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:38.913554 sshd-session[7233]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:38.917538 systemd-logind[1528]: New session 18 of user core. Jan 13 20:53:38.925492 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 13 20:53:39.036399 sshd[7235]: Connection closed by 147.75.109.163 port 53350 Jan 13 20:53:39.036926 sshd-session[7233]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:39.046321 systemd[1]: sshd@15-139.178.70.104:22-147.75.109.163:53350.service: Deactivated successfully. Jan 13 20:53:39.047924 systemd[1]: session-18.scope: Deactivated successfully. Jan 13 20:53:39.049029 systemd-logind[1528]: Session 18 logged out. Waiting for processes to exit. Jan 13 20:53:39.056665 systemd[1]: Started sshd@16-139.178.70.104:22-147.75.109.163:53362.service - OpenSSH per-connection server daemon (147.75.109.163:53362). Jan 13 20:53:39.057867 systemd-logind[1528]: Removed session 18. Jan 13 20:53:39.090811 sshd[7246]: Accepted publickey for core from 147.75.109.163 port 53362 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:39.091681 sshd-session[7246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:39.094213 systemd-logind[1528]: New session 19 of user core. Jan 13 20:53:39.098542 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 13 20:53:39.567642 sshd[7248]: Connection closed by 147.75.109.163 port 53362 Jan 13 20:53:39.568893 sshd-session[7246]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:39.576798 systemd[1]: sshd@16-139.178.70.104:22-147.75.109.163:53362.service: Deactivated successfully. Jan 13 20:53:39.579892 systemd[1]: session-19.scope: Deactivated successfully. 
Jan 13 20:53:39.581819 systemd-logind[1528]: Session 19 logged out. Waiting for processes to exit. Jan 13 20:53:39.587781 systemd[1]: Started sshd@17-139.178.70.104:22-147.75.109.163:53368.service - OpenSSH per-connection server daemon (147.75.109.163:53368). Jan 13 20:53:39.589224 systemd-logind[1528]: Removed session 19. Jan 13 20:53:39.637430 sshd[7262]: Accepted publickey for core from 147.75.109.163 port 53368 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:39.638291 sshd-session[7262]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:39.641012 systemd-logind[1528]: New session 20 of user core. Jan 13 20:53:39.646434 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 13 20:53:41.496005 sshd[7267]: Connection closed by 147.75.109.163 port 53368 Jan 13 20:53:41.501972 sshd-session[7262]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:41.511601 systemd[1]: Started sshd@18-139.178.70.104:22-147.75.109.163:53372.service - OpenSSH per-connection server daemon (147.75.109.163:53372). Jan 13 20:53:41.511917 systemd[1]: sshd@17-139.178.70.104:22-147.75.109.163:53368.service: Deactivated successfully. Jan 13 20:53:41.513021 systemd[1]: session-20.scope: Deactivated successfully. Jan 13 20:53:41.517211 systemd-logind[1528]: Session 20 logged out. Waiting for processes to exit. Jan 13 20:53:41.518810 systemd-logind[1528]: Removed session 20. Jan 13 20:53:41.664216 sshd[7336]: Accepted publickey for core from 147.75.109.163 port 53372 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:41.665161 sshd-session[7336]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:41.672139 systemd-logind[1528]: New session 21 of user core. Jan 13 20:53:41.682489 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 13 20:53:42.435106 sshd[7340]: Connection closed by 147.75.109.163 port 53372 Jan 13 20:53:42.443941 systemd[1]: sshd@18-139.178.70.104:22-147.75.109.163:53372.service: Deactivated successfully. Jan 13 20:53:42.436208 sshd-session[7336]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:42.445076 systemd[1]: session-21.scope: Deactivated successfully. Jan 13 20:53:42.445505 systemd-logind[1528]: Session 21 logged out. Waiting for processes to exit. Jan 13 20:53:42.452591 systemd[1]: Started sshd@19-139.178.70.104:22-147.75.109.163:53388.service - OpenSSH per-connection server daemon (147.75.109.163:53388). Jan 13 20:53:42.453165 systemd-logind[1528]: Removed session 21. Jan 13 20:53:42.543164 sshd[7373]: Accepted publickey for core from 147.75.109.163 port 53388 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:42.544457 sshd-session[7373]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:42.547620 systemd-logind[1528]: New session 22 of user core. Jan 13 20:53:42.557815 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 13 20:53:42.670981 sshd[7375]: Connection closed by 147.75.109.163 port 53388 Jan 13 20:53:42.671483 sshd-session[7373]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:42.673233 systemd[1]: sshd@19-139.178.70.104:22-147.75.109.163:53388.service: Deactivated successfully. Jan 13 20:53:42.674463 systemd[1]: session-22.scope: Deactivated successfully. Jan 13 20:53:42.675301 systemd-logind[1528]: Session 22 logged out. Waiting for processes to exit. Jan 13 20:53:42.676007 systemd-logind[1528]: Removed session 22. Jan 13 20:53:47.680101 systemd[1]: Started sshd@20-139.178.70.104:22-147.75.109.163:57662.service - OpenSSH per-connection server daemon (147.75.109.163:57662). 
Jan 13 20:53:47.737707 sshd[7507]: Accepted publickey for core from 147.75.109.163 port 57662 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:47.738631 sshd-session[7507]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:47.740974 systemd-logind[1528]: New session 23 of user core. Jan 13 20:53:47.745444 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 13 20:53:47.890759 sshd[7509]: Connection closed by 147.75.109.163 port 57662 Jan 13 20:53:47.891278 sshd-session[7507]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:47.892956 systemd[1]: sshd@20-139.178.70.104:22-147.75.109.163:57662.service: Deactivated successfully. Jan 13 20:53:47.894671 systemd[1]: session-23.scope: Deactivated successfully. Jan 13 20:53:47.895849 systemd-logind[1528]: Session 23 logged out. Waiting for processes to exit. Jan 13 20:53:47.896492 systemd-logind[1528]: Removed session 23. Jan 13 20:53:50.762659 containerd[1540]: time="2025-01-13T20:53:50.762619235Z" level=info msg="StopPodSandbox for \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\"" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.884 [WARNING][7590] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.891 [INFO][7590] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.891 [INFO][7590] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" iface="eth0" netns="" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.894 [INFO][7590] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.894 [INFO][7590] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.952 [INFO][7596] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.952 [INFO][7596] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.952 [INFO][7596] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.956 [WARNING][7596] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.956 [INFO][7596] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.956 [INFO][7596] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:53:50.959608 containerd[1540]: 2025-01-13 20:53:50.958 [INFO][7590] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:50.959608 containerd[1540]: time="2025-01-13T20:53:50.959522478Z" level=info msg="TearDown network for sandbox \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\" successfully" Jan 13 20:53:50.959608 containerd[1540]: time="2025-01-13T20:53:50.959538271Z" level=info msg="StopPodSandbox for \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\" returns successfully" Jan 13 20:53:50.973060 containerd[1540]: time="2025-01-13T20:53:50.960122985Z" level=info msg="RemovePodSandbox for \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\"" Jan 13 20:53:50.993928 containerd[1540]: time="2025-01-13T20:53:50.993855349Z" level=info msg="Forcibly stopping sandbox \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\"" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.039 [WARNING][7622] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.040 [INFO][7622] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.040 [INFO][7622] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" iface="eth0" netns="" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.040 [INFO][7622] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.040 [INFO][7622] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.056 [INFO][7628] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.056 [INFO][7628] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.056 [INFO][7628] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.059 [WARNING][7628] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.059 [INFO][7628] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" HandleID="k8s-pod-network.0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Workload="localhost-k8s-calico--kube--controllers--86544f5f57--nbx6b-eth0" Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.060 [INFO][7628] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:53:51.062461 containerd[1540]: 2025-01-13 20:53:51.060 [INFO][7622] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d" Jan 13 20:53:51.062803 containerd[1540]: time="2025-01-13T20:53:51.062464587Z" level=info msg="TearDown network for sandbox \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\" successfully" Jan 13 20:53:51.070281 containerd[1540]: time="2025-01-13T20:53:51.070257740Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:53:51.070335 containerd[1540]: time="2025-01-13T20:53:51.070299653Z" level=info msg="RemovePodSandbox \"0167136dcd0e6f7d451070018cf5f5faa810dbf43484d7f7316363a805fd476d\" returns successfully" Jan 13 20:53:51.070666 containerd[1540]: time="2025-01-13T20:53:51.070648758Z" level=info msg="StopPodSandbox for \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\"" Jan 13 20:53:51.070738 containerd[1540]: time="2025-01-13T20:53:51.070721592Z" level=info msg="TearDown network for sandbox \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\" successfully" Jan 13 20:53:51.070766 containerd[1540]: time="2025-01-13T20:53:51.070735099Z" level=info msg="StopPodSandbox for \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\" returns successfully" Jan 13 20:53:51.070919 containerd[1540]: time="2025-01-13T20:53:51.070906050Z" level=info msg="RemovePodSandbox for \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\"" Jan 13 20:53:51.070949 containerd[1540]: time="2025-01-13T20:53:51.070920387Z" level=info msg="Forcibly stopping sandbox \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\"" Jan 13 20:53:51.075425 containerd[1540]: time="2025-01-13T20:53:51.070954632Z" level=info msg="TearDown network for sandbox \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\" successfully" Jan 13 20:53:51.077179 containerd[1540]: time="2025-01-13T20:53:51.077163384Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:53:51.077231 containerd[1540]: time="2025-01-13T20:53:51.077191439Z" level=info msg="RemovePodSandbox \"520cc07f2e3f5dc2de87731425d82b1d558e4a10afcdb6c7bcd3eefcbf1ad196\" returns successfully" Jan 13 20:53:52.901401 systemd[1]: Started sshd@21-139.178.70.104:22-147.75.109.163:57666.service - OpenSSH per-connection server daemon (147.75.109.163:57666). Jan 13 20:53:52.976591 sshd[7668]: Accepted publickey for core from 147.75.109.163 port 57666 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:52.977668 sshd-session[7668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:52.980277 systemd-logind[1528]: New session 24 of user core. Jan 13 20:53:52.991457 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 13 20:53:53.149939 sshd[7670]: Connection closed by 147.75.109.163 port 57666 Jan 13 20:53:53.149852 sshd-session[7668]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:53.152259 systemd[1]: sshd@21-139.178.70.104:22-147.75.109.163:57666.service: Deactivated successfully. Jan 13 20:53:53.153391 systemd[1]: session-24.scope: Deactivated successfully. Jan 13 20:53:53.153817 systemd-logind[1528]: Session 24 logged out. Waiting for processes to exit. Jan 13 20:53:53.154421 systemd-logind[1528]: Removed session 24. Jan 13 20:53:58.158851 systemd[1]: Started sshd@22-139.178.70.104:22-147.75.109.163:48990.service - OpenSSH per-connection server daemon (147.75.109.163:48990). Jan 13 20:53:58.213798 sshd[7768]: Accepted publickey for core from 147.75.109.163 port 48990 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:53:58.215137 sshd-session[7768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:53:58.217998 systemd-logind[1528]: New session 25 of user core. Jan 13 20:53:58.223455 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 13 20:53:58.481819 sshd[7771]: Connection closed by 147.75.109.163 port 48990 Jan 13 20:53:58.482182 sshd-session[7768]: pam_unix(sshd:session): session closed for user core Jan 13 20:53:58.484542 systemd[1]: sshd@22-139.178.70.104:22-147.75.109.163:48990.service: Deactivated successfully. Jan 13 20:53:58.485905 systemd[1]: session-25.scope: Deactivated successfully. Jan 13 20:53:58.486506 systemd-logind[1528]: Session 25 logged out. Waiting for processes to exit. Jan 13 20:53:58.487252 systemd-logind[1528]: Removed session 25. Jan 13 20:54:03.490730 systemd[1]: Started sshd@23-139.178.70.104:22-147.75.109.163:48994.service - OpenSSH per-connection server daemon (147.75.109.163:48994). Jan 13 20:54:03.527344 sshd[7903]: Accepted publickey for core from 147.75.109.163 port 48994 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:54:03.528174 sshd-session[7903]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:54:03.530857 systemd-logind[1528]: New session 26 of user core. Jan 13 20:54:03.533433 systemd[1]: Started session-26.scope - Session 26 of User core. Jan 13 20:54:03.635041 sshd[7905]: Connection closed by 147.75.109.163 port 48994 Jan 13 20:54:03.635443 sshd-session[7903]: pam_unix(sshd:session): session closed for user core Jan 13 20:54:03.637069 systemd-logind[1528]: Session 26 logged out. Waiting for processes to exit. Jan 13 20:54:03.637230 systemd[1]: sshd@23-139.178.70.104:22-147.75.109.163:48994.service: Deactivated successfully. Jan 13 20:54:03.638426 systemd[1]: session-26.scope: Deactivated successfully. Jan 13 20:54:03.639462 systemd-logind[1528]: Removed session 26. Jan 13 20:54:08.648032 systemd[1]: Started sshd@24-139.178.70.104:22-147.75.109.163:47672.service - OpenSSH per-connection server daemon (147.75.109.163:47672). 
Jan 13 20:54:08.741256 sshd[8042]: Accepted publickey for core from 147.75.109.163 port 47672 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:54:08.742318 sshd-session[8042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:54:08.745200 systemd-logind[1528]: New session 27 of user core. Jan 13 20:54:08.749465 systemd[1]: Started session-27.scope - Session 27 of User core. Jan 13 20:54:09.171827 sshd[8045]: Connection closed by 147.75.109.163 port 47672 Jan 13 20:54:09.172499 sshd-session[8042]: pam_unix(sshd:session): session closed for user core Jan 13 20:54:09.174160 systemd-logind[1528]: Session 27 logged out. Waiting for processes to exit. Jan 13 20:54:09.175153 systemd[1]: sshd@24-139.178.70.104:22-147.75.109.163:47672.service: Deactivated successfully. Jan 13 20:54:09.176327 systemd[1]: session-27.scope: Deactivated successfully. Jan 13 20:54:09.176907 systemd-logind[1528]: Removed session 27. Jan 13 20:54:13.664323 systemd[1]: run-containerd-runc-k8s.io-6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc-runc.8BG4HS.mount: Deactivated successfully. Jan 13 20:54:13.725693 containerd[1540]: time="2025-01-13T20:54:13.725567413Z" level=info msg="StopContainer for \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\" with timeout 5 (s)" Jan 13 20:54:13.737904 containerd[1540]: time="2025-01-13T20:54:13.737875429Z" level=info msg="Stop container \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\" with signal terminated" Jan 13 20:54:13.758049 systemd[1]: cri-containerd-6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc.scope: Deactivated successfully. Jan 13 20:54:13.758431 systemd[1]: cri-containerd-6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc.scope: Consumed 7.029s CPU time. 
Jan 13 20:54:13.807860 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc-rootfs.mount: Deactivated successfully. Jan 13 20:54:13.813994 containerd[1540]: time="2025-01-13T20:54:13.805679692Z" level=info msg="shim disconnected" id=6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc namespace=k8s.io Jan 13 20:54:13.814063 containerd[1540]: time="2025-01-13T20:54:13.814002530Z" level=warning msg="cleaning up after shim disconnected" id=6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc namespace=k8s.io Jan 13 20:54:13.814063 containerd[1540]: time="2025-01-13T20:54:13.814015111Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:54:14.182863 systemd[1]: Started sshd@25-139.178.70.104:22-147.75.109.163:47688.service - OpenSSH per-connection server daemon (147.75.109.163:47688). Jan 13 20:54:14.224520 containerd[1540]: time="2025-01-13T20:54:14.224395409Z" level=info msg="StopContainer for \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\" returns successfully" Jan 13 20:54:14.273206 containerd[1540]: time="2025-01-13T20:54:14.273188659Z" level=info msg="StopPodSandbox for \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\"" Jan 13 20:54:14.273344 containerd[1540]: time="2025-01-13T20:54:14.273305013Z" level=info msg="Container to stop \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:54:14.273430 containerd[1540]: time="2025-01-13T20:54:14.273420655Z" level=info msg="Container to stop \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:54:14.273470 containerd[1540]: time="2025-01-13T20:54:14.273462966Z" level=info msg="Container to stop \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\" must be in 
running or unknown state, current state \"CONTAINER_EXITED\"" Jan 13 20:54:14.276268 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78-shm.mount: Deactivated successfully. Jan 13 20:54:14.279975 systemd[1]: cri-containerd-3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78.scope: Deactivated successfully. Jan 13 20:54:14.294774 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78-rootfs.mount: Deactivated successfully. Jan 13 20:54:14.297377 containerd[1540]: time="2025-01-13T20:54:14.295238217Z" level=info msg="shim disconnected" id=3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78 namespace=k8s.io Jan 13 20:54:14.297377 containerd[1540]: time="2025-01-13T20:54:14.295281484Z" level=warning msg="cleaning up after shim disconnected" id=3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78 namespace=k8s.io Jan 13 20:54:14.297377 containerd[1540]: time="2025-01-13T20:54:14.295286915Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:54:14.456800 sshd[8159]: Accepted publickey for core from 147.75.109.163 port 47688 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 20:54:14.457312 sshd-session[8159]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 20:54:14.464613 systemd-logind[1528]: New session 28 of user core. Jan 13 20:54:14.474464 systemd[1]: Started session-28.scope - Session 28 of User core. 
Jan 13 20:54:14.560000 containerd[1540]: time="2025-01-13T20:54:14.559754521Z" level=info msg="TearDown network for sandbox \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\" successfully" Jan 13 20:54:14.560000 containerd[1540]: time="2025-01-13T20:54:14.559778024Z" level=info msg="StopPodSandbox for \"3ebeadbdbd0e904705c401a58ab964f7920392cef9f184a98c1b01f1a4dd4d78\" returns successfully" Jan 13 20:54:14.622347 kubelet[2861]: I0113 20:54:14.615286 2861 topology_manager.go:215] "Topology Admit Handler" podUID="a5753e94-8e81-4780-bf82-50fb2d9264c3" podNamespace="calico-system" podName="calico-node-49hb7" Jan 13 20:54:14.636993 kubelet[2861]: E0113 20:54:14.636959 2861 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="5ae12788-a88d-4b93-b16c-86265eaf0a93" containerName="install-cni" Jan 13 20:54:14.639403 kubelet[2861]: E0113 20:54:14.639381 2861 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="aeba9dfe-23e2-4693-9190-c72e73d772ae" containerName="calico-typha" Jan 13 20:54:14.639403 kubelet[2861]: E0113 20:54:14.639423 2861 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="5ae12788-a88d-4b93-b16c-86265eaf0a93" containerName="calico-node" Jan 13 20:54:14.639403 kubelet[2861]: E0113 20:54:14.639430 2861 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="5ae12788-a88d-4b93-b16c-86265eaf0a93" containerName="flexvol-driver" Jan 13 20:54:14.651988 kubelet[2861]: I0113 20:54:14.651857 2861 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeba9dfe-23e2-4693-9190-c72e73d772ae" containerName="calico-typha" Jan 13 20:54:14.651988 kubelet[2861]: I0113 20:54:14.651901 2861 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae12788-a88d-4b93-b16c-86265eaf0a93" containerName="calico-node" Jan 13 20:54:14.680839 kubelet[2861]: I0113 20:54:14.680457 2861 scope.go:117] "RemoveContainer" containerID="6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc" Jan 
Jan 13 20:54:14.685920 systemd[1]: Created slice kubepods-besteffort-poda5753e94_8e81_4780_bf82_50fb2d9264c3.slice - libcontainer container kubepods-besteffort-poda5753e94_8e81_4780_bf82_50fb2d9264c3.slice.
Jan 13 20:54:14.714608 kubelet[2861]: I0113 20:54:14.714011 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-xtables-lock\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714608 kubelet[2861]: I0113 20:54:14.714050 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-log-dir\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714608 kubelet[2861]: I0113 20:54:14.714063 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-policysync\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714608 kubelet[2861]: I0113 20:54:14.714075 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-lib-modules\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714608 kubelet[2861]: I0113 20:54:14.714087 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-var-run-calico\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714608 kubelet[2861]: I0113 20:54:14.714099 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-bin-dir\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714807 kubelet[2861]: I0113 20:54:14.714109 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-var-lib-calico\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714807 kubelet[2861]: I0113 20:54:14.714121 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-flexvol-driver-host\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714807 kubelet[2861]: I0113 20:54:14.714142 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5ae12788-a88d-4b93-b16c-86265eaf0a93-node-certs\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714807 kubelet[2861]: I0113 20:54:14.714153 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mrd4\" (UniqueName: \"kubernetes.io/projected/5ae12788-a88d-4b93-b16c-86265eaf0a93-kube-api-access-2mrd4\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714807 kubelet[2861]: I0113 20:54:14.714164 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-net-dir\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.714807 kubelet[2861]: I0113 20:54:14.714176 2861 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ae12788-a88d-4b93-b16c-86265eaf0a93-tigera-ca-bundle\") pod \"5ae12788-a88d-4b93-b16c-86265eaf0a93\" (UID: \"5ae12788-a88d-4b93-b16c-86265eaf0a93\") "
Jan 13 20:54:14.716917 containerd[1540]: time="2025-01-13T20:54:14.716711415Z" level=info msg="RemoveContainer for \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\""
Jan 13 20:54:14.720260 containerd[1540]: time="2025-01-13T20:54:14.720229984Z" level=info msg="RemoveContainer for \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\" returns successfully"
Jan 13 20:54:14.722017 kubelet[2861]: I0113 20:54:14.718821 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.722017 kubelet[2861]: I0113 20:54:14.721733 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.722017 kubelet[2861]: I0113 20:54:14.721755 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.722017 kubelet[2861]: I0113 20:54:14.721768 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-policysync" (OuterVolumeSpecName: "policysync") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.722017 kubelet[2861]: I0113 20:54:14.721778 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.722551 kubelet[2861]: I0113 20:54:14.721790 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.722551 kubelet[2861]: I0113 20:54:14.721862 2861 scope.go:117] "RemoveContainer" containerID="165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b"
Jan 13 20:54:14.737757 kubelet[2861]: I0113 20:54:14.736831 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.737757 kubelet[2861]: I0113 20:54:14.736861 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.746999 systemd[1]: var-lib-kubelet-pods-5ae12788\x2da88d\x2d4b93\x2db16c\x2d86265eaf0a93-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully.
Jan 13 20:54:14.749730 kubelet[2861]: I0113 20:54:14.749595 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 13 20:54:14.751427 kubelet[2861]: I0113 20:54:14.751412 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae12788-a88d-4b93-b16c-86265eaf0a93-node-certs" (OuterVolumeSpecName: "node-certs") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 13 20:54:14.752654 systemd[1]: var-lib-kubelet-pods-5ae12788\x2da88d\x2d4b93\x2db16c\x2d86265eaf0a93-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully.
Jan 13 20:54:14.753767 kubelet[2861]: I0113 20:54:14.752709 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae12788-a88d-4b93-b16c-86265eaf0a93-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 13 20:54:14.754856 containerd[1540]: time="2025-01-13T20:54:14.754838586Z" level=info msg="RemoveContainer for \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\""
Jan 13 20:54:14.755295 systemd[1]: var-lib-kubelet-pods-5ae12788\x2da88d\x2d4b93\x2db16c\x2d86265eaf0a93-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2mrd4.mount: Deactivated successfully.
Jan 13 20:54:14.758811 kubelet[2861]: I0113 20:54:14.758781 2861 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae12788-a88d-4b93-b16c-86265eaf0a93-kube-api-access-2mrd4" (OuterVolumeSpecName: "kube-api-access-2mrd4") pod "5ae12788-a88d-4b93-b16c-86265eaf0a93" (UID: "5ae12788-a88d-4b93-b16c-86265eaf0a93"). InnerVolumeSpecName "kube-api-access-2mrd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 13 20:54:14.759155 containerd[1540]: time="2025-01-13T20:54:14.759142632Z" level=info msg="RemoveContainer for \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\" returns successfully"
Jan 13 20:54:14.759294 kubelet[2861]: I0113 20:54:14.759281 2861 scope.go:117] "RemoveContainer" containerID="5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9"
Jan 13 20:54:14.760064 containerd[1540]: time="2025-01-13T20:54:14.760053235Z" level=info msg="RemoveContainer for \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\""
Jan 13 20:54:14.761644 containerd[1540]: time="2025-01-13T20:54:14.761632485Z" level=info msg="RemoveContainer for \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\" returns successfully"
Jan 13 20:54:14.761770 kubelet[2861]: I0113 20:54:14.761756 2861 scope.go:117] "RemoveContainer" containerID="6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc"
Jan 13 20:54:14.763518 containerd[1540]: time="2025-01-13T20:54:14.763449956Z" level=error msg="ContainerStatus for \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\": not found"
Jan 13 20:54:14.776661 kubelet[2861]: E0113 20:54:14.776548 2861 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\": not found" containerID="6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc"
Jan 13 20:54:14.778878 kubelet[2861]: I0113 20:54:14.778786 2861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc"} err="failed to get container status \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\": rpc error: code = NotFound desc = an error occurred when try to find container \"6f099d6b0e9df0871510864521ff0cfd4897e60f562e0efa94792a4523110ddc\": not found"
Jan 13 20:54:14.778878 kubelet[2861]: I0113 20:54:14.778806 2861 scope.go:117] "RemoveContainer" containerID="165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b"
Jan 13 20:54:14.779080 containerd[1540]: time="2025-01-13T20:54:14.779046791Z" level=error msg="ContainerStatus for \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\": not found"
Jan 13 20:54:14.779256 kubelet[2861]: E0113 20:54:14.779244 2861 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\": not found" containerID="165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b"
Jan 13 20:54:14.779285 kubelet[2861]: I0113 20:54:14.779270 2861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b"} err="failed to get container status \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\": rpc error: code = NotFound desc = an error occurred when try to find container \"165b72813e6ebb489b410a410222a01da1aa8e64c5d3efae073b54d731521a4b\": not found"
Jan 13 20:54:14.779285 kubelet[2861]: I0113 20:54:14.779278 2861 scope.go:117] "RemoveContainer" containerID="5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9"
Jan 13 20:54:14.779641 containerd[1540]: time="2025-01-13T20:54:14.779623951Z" level=error msg="ContainerStatus for \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\": not found"
Jan 13 20:54:14.779804 kubelet[2861]: E0113 20:54:14.779766 2861 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\": not found" containerID="5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9"
Jan 13 20:54:14.779804 kubelet[2861]: I0113 20:54:14.779788 2861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9"} err="failed to get container status \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\": rpc error: code = NotFound desc = an error occurred when try to find container \"5fe6481600e2225cdeec5a08b12e5ee1fb86f19b65ac7836820fabceb510eac9\": not found"
Jan 13 20:54:14.783464 sshd[8194]: Connection closed by 147.75.109.163 port 47688
Jan 13 20:54:14.784257 sshd-session[8159]: pam_unix(sshd:session): session closed for user core
Jan 13 20:54:14.786174 systemd[1]: sshd@25-139.178.70.104:22-147.75.109.163:47688.service: Deactivated successfully.
Jan 13 20:54:14.787576 systemd[1]: session-28.scope: Deactivated successfully.
Jan 13 20:54:14.788131 systemd-logind[1528]: Session 28 logged out. Waiting for processes to exit.
Jan 13 20:54:14.788811 systemd-logind[1528]: Removed session 28.
Jan 13 20:54:14.817300 kubelet[2861]: I0113 20:54:14.817254 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-xtables-lock\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.817300 kubelet[2861]: I0113 20:54:14.817283 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-cni-net-dir\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.818960 kubelet[2861]: I0113 20:54:14.818945 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a5753e94-8e81-4780-bf82-50fb2d9264c3-node-certs\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819009 kubelet[2861]: I0113 20:54:14.818990 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-var-run-calico\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819009 kubelet[2861]: I0113 20:54:14.819005 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgczs\" (UniqueName: \"kubernetes.io/projected/a5753e94-8e81-4780-bf82-50fb2d9264c3-kube-api-access-rgczs\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819360 kubelet[2861]: I0113 20:54:14.819020 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-flexvol-driver-host\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819360 kubelet[2861]: I0113 20:54:14.819033 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-var-lib-calico\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819360 kubelet[2861]: I0113 20:54:14.819045 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-lib-modules\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819360 kubelet[2861]: I0113 20:54:14.819055 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-cni-bin-dir\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819360 kubelet[2861]: I0113 20:54:14.819075 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-policysync\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819470 kubelet[2861]: I0113 20:54:14.819086 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5753e94-8e81-4780-bf82-50fb2d9264c3-tigera-ca-bundle\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819470 kubelet[2861]: I0113 20:54:14.819098 2861 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a5753e94-8e81-4780-bf82-50fb2d9264c3-cni-log-dir\") pod \"calico-node-49hb7\" (UID: \"a5753e94-8e81-4780-bf82-50fb2d9264c3\") " pod="calico-system/calico-node-49hb7"
Jan 13 20:54:14.819470 kubelet[2861]: I0113 20:54:14.819117 2861 reconciler_common.go:300] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5ae12788-a88d-4b93-b16c-86265eaf0a93-node-certs\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819470 kubelet[2861]: I0113 20:54:14.819126 2861 reconciler_common.go:300] "Volume detached for volume \"kube-api-access-2mrd4\" (UniqueName: \"kubernetes.io/projected/5ae12788-a88d-4b93-b16c-86265eaf0a93-kube-api-access-2mrd4\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819470 kubelet[2861]: I0113 20:54:14.819132 2861 reconciler_common.go:300] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-net-dir\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819470 kubelet[2861]: I0113 20:54:14.819138 2861 reconciler_common.go:300] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ae12788-a88d-4b93-b16c-86265eaf0a93-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819470 kubelet[2861]: I0113 20:54:14.819172 2861 reconciler_common.go:300] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-xtables-lock\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819587 kubelet[2861]: I0113 20:54:14.819184 2861 reconciler_common.go:300] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-log-dir\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819587 kubelet[2861]: I0113 20:54:14.819190 2861 reconciler_common.go:300] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-policysync\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819587 kubelet[2861]: I0113 20:54:14.819195 2861 reconciler_common.go:300] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-lib-modules\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819587 kubelet[2861]: I0113 20:54:14.819212 2861 reconciler_common.go:300] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-var-lib-calico\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819587 kubelet[2861]: I0113 20:54:14.819219 2861 reconciler_common.go:300] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-var-run-calico\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819587 kubelet[2861]: I0113 20:54:14.819224 2861 reconciler_common.go:300] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-cni-bin-dir\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.819587 kubelet[2861]: I0113 20:54:14.819231 2861 reconciler_common.go:300] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5ae12788-a88d-4b93-b16c-86265eaf0a93-flexvol-driver-host\") on node \"localhost\" DevicePath \"\""
Jan 13 20:54:14.917091 systemd[1]: Removed slice kubepods-besteffort-pod5ae12788_a88d_4b93_b16c_86265eaf0a93.slice - libcontainer container kubepods-besteffort-pod5ae12788_a88d_4b93_b16c_86265eaf0a93.slice.
Jan 13 20:54:14.917232 systemd[1]: kubepods-besteffort-pod5ae12788_a88d_4b93_b16c_86265eaf0a93.slice: Consumed 7.349s CPU time.
Jan 13 20:54:15.026038 containerd[1540]: time="2025-01-13T20:54:15.025943442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-49hb7,Uid:a5753e94-8e81-4780-bf82-50fb2d9264c3,Namespace:calico-system,Attempt:0,}"
Jan 13 20:54:15.064060 containerd[1540]: time="2025-01-13T20:54:15.063675805Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:54:15.064285 containerd[1540]: time="2025-01-13T20:54:15.064181363Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:54:15.064285 containerd[1540]: time="2025-01-13T20:54:15.064200802Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:54:15.064476 containerd[1540]: time="2025-01-13T20:54:15.064449153Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:54:15.079703 systemd[1]: Started cri-containerd-f4395ed6c35922cb63738d98dbd62af306a81c7b503ab7b6b6d472ec469cbcd7.scope - libcontainer container f4395ed6c35922cb63738d98dbd62af306a81c7b503ab7b6b6d472ec469cbcd7.
Jan 13 20:54:15.105435 containerd[1540]: time="2025-01-13T20:54:15.105407234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-49hb7,Uid:a5753e94-8e81-4780-bf82-50fb2d9264c3,Namespace:calico-system,Attempt:0,} returns sandbox id \"f4395ed6c35922cb63738d98dbd62af306a81c7b503ab7b6b6d472ec469cbcd7\""
Jan 13 20:54:15.121733 containerd[1540]: time="2025-01-13T20:54:15.121709836Z" level=info msg="CreateContainer within sandbox \"f4395ed6c35922cb63738d98dbd62af306a81c7b503ab7b6b6d472ec469cbcd7\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 13 20:54:15.155620 containerd[1540]: time="2025-01-13T20:54:15.155585332Z" level=info msg="CreateContainer within sandbox \"f4395ed6c35922cb63738d98dbd62af306a81c7b503ab7b6b6d472ec469cbcd7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"c8d465bcd41d172293345c02729fee8a9b435fcfe4878112c544f3c16653bc3c\""
Jan 13 20:54:15.157090 containerd[1540]: time="2025-01-13T20:54:15.156046172Z" level=info msg="StartContainer for \"c8d465bcd41d172293345c02729fee8a9b435fcfe4878112c544f3c16653bc3c\""
Jan 13 20:54:15.174457 systemd[1]: Started cri-containerd-c8d465bcd41d172293345c02729fee8a9b435fcfe4878112c544f3c16653bc3c.scope - libcontainer container c8d465bcd41d172293345c02729fee8a9b435fcfe4878112c544f3c16653bc3c.
Jan 13 20:54:15.192939 containerd[1540]: time="2025-01-13T20:54:15.192912706Z" level=info msg="StartContainer for \"c8d465bcd41d172293345c02729fee8a9b435fcfe4878112c544f3c16653bc3c\" returns successfully"
Jan 13 20:54:15.318566 systemd[1]: cri-containerd-c8d465bcd41d172293345c02729fee8a9b435fcfe4878112c544f3c16653bc3c.scope: Deactivated successfully.
Jan 13 20:54:15.337770 containerd[1540]: time="2025-01-13T20:54:15.337660448Z" level=info msg="shim disconnected" id=c8d465bcd41d172293345c02729fee8a9b435fcfe4878112c544f3c16653bc3c namespace=k8s.io
Jan 13 20:54:15.337770 containerd[1540]: time="2025-01-13T20:54:15.337701170Z" level=warning msg="cleaning up after shim disconnected" id=c8d465bcd41d172293345c02729fee8a9b435fcfe4878112c544f3c16653bc3c namespace=k8s.io
Jan 13 20:54:15.337770 containerd[1540]: time="2025-01-13T20:54:15.337707255Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:54:15.623647 containerd[1540]: time="2025-01-13T20:54:15.623583670Z" level=info msg="CreateContainer within sandbox \"f4395ed6c35922cb63738d98dbd62af306a81c7b503ab7b6b6d472ec469cbcd7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Jan 13 20:54:15.639964 containerd[1540]: time="2025-01-13T20:54:15.639902448Z" level=info msg="CreateContainer within sandbox \"f4395ed6c35922cb63738d98dbd62af306a81c7b503ab7b6b6d472ec469cbcd7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"57bd7f241815a009651f742f2d4e7fe69cb2ded48a0541414e00092943ab071d\""
Jan 13 20:54:15.640768 containerd[1540]: time="2025-01-13T20:54:15.640750245Z" level=info msg="StartContainer for \"57bd7f241815a009651f742f2d4e7fe69cb2ded48a0541414e00092943ab071d\""
Jan 13 20:54:15.660439 systemd[1]: Started cri-containerd-57bd7f241815a009651f742f2d4e7fe69cb2ded48a0541414e00092943ab071d.scope - libcontainer container 57bd7f241815a009651f742f2d4e7fe69cb2ded48a0541414e00092943ab071d.
Jan 13 20:54:15.678951 containerd[1540]: time="2025-01-13T20:54:15.677950661Z" level=info msg="StartContainer for \"57bd7f241815a009651f742f2d4e7fe69cb2ded48a0541414e00092943ab071d\" returns successfully"
Jan 13 20:54:16.008784 kubelet[2861]: I0113 20:54:16.008714 2861 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="5ae12788-a88d-4b93-b16c-86265eaf0a93" path="/var/lib/kubelet/pods/5ae12788-a88d-4b93-b16c-86265eaf0a93/volumes"