Jan 30 13:05:45.734271 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Wed Jan 29 09:29:54 -00 2025
Jan 30 13:05:45.734289 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=fe60919b0c6f6abb7495678f87f7024e97a038fc343fa31a123a43ef5f489466
Jan 30 13:05:45.734295 kernel: Disabled fast string operations
Jan 30 13:05:45.734299 kernel: BIOS-provided physical RAM map:
Jan 30 13:05:45.734303 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 30 13:05:45.734307 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 30 13:05:45.734313 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 30 13:05:45.734318 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 30 13:05:45.734322 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 30 13:05:45.734326 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 30 13:05:45.734330 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 30 13:05:45.734334 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 30 13:05:45.734339 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 30 13:05:45.734343 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 30 13:05:45.734349 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 30 13:05:45.734354 kernel: NX (Execute Disable) protection: active
Jan 30 13:05:45.734359 kernel: APIC: Static calls initialized
Jan 30 13:05:45.734364 kernel: SMBIOS 2.7 present.
Jan 30 13:05:45.734369 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 30 13:05:45.734374 kernel: vmware: hypercall mode: 0x00
Jan 30 13:05:45.734378 kernel: Hypervisor detected: VMware
Jan 30 13:05:45.734383 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 30 13:05:45.734389 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 30 13:05:45.734394 kernel: vmware: using clock offset of 2579831776 ns
Jan 30 13:05:45.734399 kernel: tsc: Detected 3408.000 MHz processor
Jan 30 13:05:45.734404 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 30 13:05:45.734410 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 30 13:05:45.734415 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 30 13:05:45.734420 kernel: total RAM covered: 3072M
Jan 30 13:05:45.734425 kernel: Found optimal setting for mtrr clean up
Jan 30 13:05:45.734430 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 30 13:05:45.734435 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 30 13:05:45.734441 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 30 13:05:45.734446 kernel: Using GB pages for direct mapping
Jan 30 13:05:45.734451 kernel: ACPI: Early table checksum verification disabled
Jan 30 13:05:45.734456 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 30 13:05:45.734461 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 30 13:05:45.734466 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 30 13:05:45.734471 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 30 13:05:45.734476 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 30 13:05:45.734484 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 30 13:05:45.734489 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 30 13:05:45.734494 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 30 13:05:45.734500 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 30 13:05:45.734505 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 30 13:05:45.734510 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 30 13:05:45.734517 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 30 13:05:45.734522 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 30 13:05:45.734527 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 30 13:05:45.734532 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 30 13:05:45.734537 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 30 13:05:45.734542 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 30 13:05:45.734548 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 30 13:05:45.734553 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 30 13:05:45.734558 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 30 13:05:45.734564 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 30 13:05:45.734569 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 30 13:05:45.734574 kernel: system APIC only can use physical flat
Jan 30 13:05:45.734579 kernel: APIC: Switched APIC routing to: physical flat
Jan 30 13:05:45.734584 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 30 13:05:45.734589 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 30 13:05:45.734595 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 30 13:05:45.734600 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 30 13:05:45.734605 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 30 13:05:45.734610 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 30 13:05:45.734623 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 30 13:05:45.734628 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 30 13:05:45.734633 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 30 13:05:45.734638 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 30 13:05:45.734643 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 30 13:05:45.734648 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 30 13:05:45.734653 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 30 13:05:45.734658 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 30 13:05:45.734663 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 30 13:05:45.734668 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 30 13:05:45.734674 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 30 13:05:45.734680 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 30 13:05:45.734684 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 30 13:05:45.734689 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 30 13:05:45.734694 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 30 13:05:45.734699 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 30 13:05:45.734704 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 30 13:05:45.734709 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 30 13:05:45.734715 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 30 13:05:45.734719 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 30 13:05:45.734726 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 30 13:05:45.734731 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 30 13:05:45.734736 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 30 13:05:45.734741 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 30 13:05:45.734746 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 30 13:05:45.734750 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 30 13:05:45.734756 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 30 13:05:45.734761 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 30 13:05:45.734766 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 30 13:05:45.734771 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 30 13:05:45.734776 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 30 13:05:45.734783 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 30 13:05:45.734787 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 30 13:05:45.734793 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 30 13:05:45.734798 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 30 13:05:45.734803 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 30 13:05:45.734808 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 30 13:05:45.734813 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 30 13:05:45.734818 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 30 13:05:45.734823 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 30 13:05:45.734828 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 30 13:05:45.734834 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 30 13:05:45.734839 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 30 13:05:45.734844 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 30 13:05:45.734849 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 30 13:05:45.734854 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 30 13:05:45.734859 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 30 13:05:45.734864 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 30 13:05:45.734869 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 30 13:05:45.734873 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 30 13:05:45.734879 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 30 13:05:45.734885 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 30 13:05:45.734890 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 30 13:05:45.734899 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 30 13:05:45.734905 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 30 13:05:45.734910 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 30 13:05:45.734916 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 30 13:05:45.734921 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 30 13:05:45.734927 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 30 13:05:45.734932 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 30 13:05:45.734938 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 30 13:05:45.734944 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 30 13:05:45.734949 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 30 13:05:45.734954 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 30 13:05:45.734960 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 30 13:05:45.734965 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 30 13:05:45.734970 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 30 13:05:45.734976 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 30 13:05:45.734981 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 30 13:05:45.734986 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 30 13:05:45.734993 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 30 13:05:45.734998 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 30 13:05:45.735004 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 30 13:05:45.735009 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 30 13:05:45.735014 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 30 13:05:45.735020 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 30 13:05:45.735025 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 30 13:05:45.735030 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 30 13:05:45.735036 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 30 13:05:45.735041 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 30 13:05:45.735048 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 30 13:05:45.735053 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 30 13:05:45.735059 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 30 13:05:45.735064 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 30 13:05:45.735069 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 30 13:05:45.735074 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 30 13:05:45.735080 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 30 13:05:45.735085 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 30 13:05:45.735090 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 30 13:05:45.735096 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 30 13:05:45.735101 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 30 13:05:45.735107 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 30 13:05:45.735113 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 30 13:05:45.735118 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 30 13:05:45.735123 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 30 13:05:45.735129 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 30 13:05:45.735134 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 30 13:05:45.735139 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 30 13:05:45.735145 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 30 13:05:45.735150 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 30 13:05:45.735155 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 30 13:05:45.735162 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 30 13:05:45.735167 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 30 13:05:45.735173 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 30 13:05:45.735178 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 30 13:05:45.735183 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 30 13:05:45.735189 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 30 13:05:45.735194 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 30 13:05:45.735199 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 30 13:05:45.735204 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 30 13:05:45.735210 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 30 13:05:45.735216 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 30 13:05:45.735221 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 30 13:05:45.735227 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 30 13:05:45.735233 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 30 13:05:45.735238 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 30 13:05:45.735243 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 30 13:05:45.735248 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 30 13:05:45.735254 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 30 13:05:45.735259 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 30 13:05:45.735265 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 30 13:05:45.735272 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 30 13:05:45.735277 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 30 13:05:45.735283 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 30 13:05:45.735288 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 30 13:05:45.735294 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 30 13:05:45.735299 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 30 13:05:45.735305 kernel: Zone ranges:
Jan 30 13:05:45.735311 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 30 13:05:45.735316 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 30 13:05:45.735321 kernel: Normal empty
Jan 30 13:05:45.735328 kernel: Movable zone start for each node
Jan 30 13:05:45.735333 kernel: Early memory node ranges
Jan 30 13:05:45.735339 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 30 13:05:45.735344 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 30 13:05:45.735350 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 30 13:05:45.735355 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 30 13:05:45.735360 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 30 13:05:45.735366 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 30 13:05:45.735372 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 30 13:05:45.735378 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 30 13:05:45.735384 kernel: system APIC only can use physical flat
Jan 30 13:05:45.735389 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 30 13:05:45.735395 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 30 13:05:45.735400 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jan 30 13:05:45.735406 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Jan 30 13:05:45.735411 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Jan 30 13:05:45.735416 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Jan 30 13:05:45.735422 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Jan 30 13:05:45.735427 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Jan 30 13:05:45.735434 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Jan 30 13:05:45.735439 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Jan 30 13:05:45.735445 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Jan 30 13:05:45.735450 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Jan 30 13:05:45.735456 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Jan 30 13:05:45.735461 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Jan 30 13:05:45.735466 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Jan 30 13:05:45.735472 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Jan 30 13:05:45.735477 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Jan 30 13:05:45.735483 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Jan 30 13:05:45.735490 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Jan 30 13:05:45.735495 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Jan 30 13:05:45.735501 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Jan 30 13:05:45.735506 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Jan 30 13:05:45.735511 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Jan 30 13:05:45.735517 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Jan 30 13:05:45.735522 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Jan 30 13:05:45.735527 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Jan 30 13:05:45.735533 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Jan 30 13:05:45.735540 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Jan 30 13:05:45.735545 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Jan 30 13:05:45.735550 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Jan 30 13:05:45.735556 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Jan 30 13:05:45.735561 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Jan 30 13:05:45.735566 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Jan 30 13:05:45.735572 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Jan 30 13:05:45.735577 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Jan 30 13:05:45.735582 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Jan 30 13:05:45.735588 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Jan 30 13:05:45.735595 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Jan 30 13:05:45.735600 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Jan 30 13:05:45.735606 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Jan 30 13:05:45.735611 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Jan 30 13:05:45.735622 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Jan 30 13:05:45.735628 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Jan 30 13:05:45.735633 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Jan 30 13:05:45.735639 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Jan 30 13:05:45.735644 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Jan 30 13:05:45.735649 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Jan 30 13:05:45.735656 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Jan 30 13:05:45.735662 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Jan 30 13:05:45.735667 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Jan 30 13:05:45.735673 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Jan 30 13:05:45.735678 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Jan 30 13:05:45.735684 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Jan 30 13:05:45.735690 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Jan 30 13:05:45.735695 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Jan 30 13:05:45.735701 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Jan 30 13:05:45.735706 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Jan 30 13:05:45.735713 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Jan 30 13:05:45.735718 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Jan 30 13:05:45.735723 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Jan 30 13:05:45.735729 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Jan 30 13:05:45.735734 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Jan 30 13:05:45.735739 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Jan 30 13:05:45.735745 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Jan 30 13:05:45.735750 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Jan 30 13:05:45.735755 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Jan 30 13:05:45.735761 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Jan 30 13:05:45.735767 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Jan 30 13:05:45.735773 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Jan 30 13:05:45.735778 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Jan 30 13:05:45.735783 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Jan 30 13:05:45.735789 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Jan 30 13:05:45.735794 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Jan 30 13:05:45.735800 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Jan 30 13:05:45.735805 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Jan 30 13:05:45.735810 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Jan 30 13:05:45.735817 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Jan 30 13:05:45.735823 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Jan 30 13:05:45.735828 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Jan 30 13:05:45.735833 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Jan 30 13:05:45.735839 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Jan 30 13:05:45.735844 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Jan 30 13:05:45.735850 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Jan 30 13:05:45.735855 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Jan 30 13:05:45.735860 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Jan 30 13:05:45.735866 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Jan 30 13:05:45.735872 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Jan 30 13:05:45.735878 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Jan 30 13:05:45.735883 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Jan 30 13:05:45.735888 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Jan 30 13:05:45.735894 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Jan 30 13:05:45.735899 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Jan 30 13:05:45.735905 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Jan 30 13:05:45.735910 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Jan 30 13:05:45.735915 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Jan 30 13:05:45.735921 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Jan 30 13:05:45.735927 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Jan 30 13:05:45.735933 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Jan 30 13:05:45.735938 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Jan 30 13:05:45.735943 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Jan 30 13:05:45.735949 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Jan 30 13:05:45.735954 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Jan 30 13:05:45.735959 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Jan 30 13:05:45.735965 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Jan 30 13:05:45.735970 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Jan 30 13:05:45.735976 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Jan 30 13:05:45.735982 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Jan 30 13:05:45.735988 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Jan 30 13:05:45.735993 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Jan 30 13:05:45.735999 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Jan 30 13:05:45.736004 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Jan 30 13:05:45.736009 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Jan 30 13:05:45.736015 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Jan 30 13:05:45.736020 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Jan 30 13:05:45.736025 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Jan 30 13:05:45.736032 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Jan 30 13:05:45.736038 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Jan 30 13:05:45.736043 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Jan 30 13:05:45.736048 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Jan 30 13:05:45.736054 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Jan 30 13:05:45.736059 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Jan 30 13:05:45.736064 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Jan 30 13:05:45.736070 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Jan 30 13:05:45.736075 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Jan 30 13:05:45.736081 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Jan 30 13:05:45.736087 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Jan 30 13:05:45.736093 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Jan 30 13:05:45.736098 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Jan 30 13:05:45.736104 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Jan 30 13:05:45.736109 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Jan 30 13:05:45.736115 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 30 13:05:45.736120 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Jan 30 13:05:45.736126 kernel: TSC deadline timer available
Jan 30 13:05:45.736132 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Jan 30 13:05:45.736137 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Jan 30 13:05:45.736144 kernel: Booting paravirtualized kernel on VMware hypervisor
Jan 30 13:05:45.736155 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 30 13:05:45.736161 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Jan 30 13:05:45.736167 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 30 13:05:45.736172 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 30 13:05:45.736178 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Jan 30 13:05:45.736183 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Jan 30 13:05:45.736189 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Jan 30 13:05:45.736196 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Jan 30 13:05:45.736201 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Jan 30 13:05:45.736214 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Jan 30 13:05:45.736222 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Jan 30 13:05:45.736228 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Jan 30 13:05:45.736233 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Jan 30 13:05:45.736239 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Jan 30 13:05:45.736245 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Jan 30 13:05:45.736250 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Jan 30 13:05:45.736257 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Jan 30 13:05:45.736263 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Jan 30 13:05:45.736269 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Jan 30 13:05:45.736275 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Jan 30 13:05:45.736281 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=fe60919b0c6f6abb7495678f87f7024e97a038fc343fa31a123a43ef5f489466
Jan 30 13:05:45.736288 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 30 13:05:45.736293 kernel: random: crng init done
Jan 30 13:05:45.736299 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Jan 30 13:05:45.736306 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Jan 30 13:05:45.736312 kernel: printk: log_buf_len min size: 262144 bytes
Jan 30 13:05:45.736318 kernel: printk: log_buf_len: 1048576 bytes
Jan 30 13:05:45.736324 kernel: printk: early log buf free: 239648(91%)
Jan 30 13:05:45.736330 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 30 13:05:45.736336 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 30 13:05:45.736341 kernel: Fallback order for Node 0: 0
Jan 30 13:05:45.736347 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Jan 30 13:05:45.736354 kernel: Policy zone: DMA32
Jan 30 13:05:45.736361 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 30 13:05:45.736367 kernel: Memory: 1934304K/2096628K available (14336K kernel code, 2301K rwdata, 22800K rodata, 43320K init, 1752K bss, 162064K reserved, 0K cma-reserved)
Jan 30 13:05:45.736375 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Jan 30 13:05:45.736381 kernel: ftrace: allocating 37893 entries in 149 pages
Jan 30 13:05:45.736387 kernel: ftrace: allocated 149 pages with 4 groups
Jan 30 13:05:45.736394 kernel: Dynamic Preempt: voluntary
Jan 30 13:05:45.736400 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 30 13:05:45.736406 kernel: rcu: RCU event tracing is enabled.
Jan 30 13:05:45.736412 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Jan 30 13:05:45.736418 kernel: Trampoline variant of Tasks RCU enabled.
Jan 30 13:05:45.736424 kernel: Rude variant of Tasks RCU enabled.
Jan 30 13:05:45.736429 kernel: Tracing variant of Tasks RCU enabled.
Jan 30 13:05:45.736436 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 30 13:05:45.736441 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Jan 30 13:05:45.736447 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Jan 30 13:05:45.736454 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Jan 30 13:05:45.736460 kernel: Console: colour VGA+ 80x25
Jan 30 13:05:45.736466 kernel: printk: console [tty0] enabled
Jan 30 13:05:45.736472 kernel: printk: console [ttyS0] enabled
Jan 30 13:05:45.736478 kernel: ACPI: Core revision 20230628
Jan 30 13:05:45.736485 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Jan 30 13:05:45.736491 kernel: APIC: Switch to symmetric I/O mode setup
Jan 30 13:05:45.736497 kernel: x2apic enabled
Jan 30 13:05:45.736503 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 30 13:05:45.736510 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jan 30 13:05:45.736516 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Jan 30 13:05:45.736521 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Jan 30 13:05:45.736527 kernel: Disabled fast string operations
Jan 30 13:05:45.736533 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Jan 30 13:05:45.736539 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Jan 30 13:05:45.736545 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 30 13:05:45.736552 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Jan 30 13:05:45.736558 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Jan 30 13:05:45.736565 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 30 13:05:45.736571 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 30 13:05:45.736577 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 30 13:05:45.736583 kernel: RETBleed: Mitigation: Enhanced IBRS
Jan 30 13:05:45.736589 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 30 13:05:45.736596 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 30 13:05:45.736602 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Jan 30 13:05:45.736608 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 30 13:05:45.736613 kernel: GDS: Unknown: Dependent on hypervisor status
Jan 30 13:05:45.736633 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 30 13:05:45.736640 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 30 13:05:45.736646 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 30 13:05:45.736652 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 30 13:05:45.736658 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 30 13:05:45.736663 kernel: Freeing SMP alternatives memory: 32K
Jan 30 13:05:45.736669 kernel: pid_max: default: 131072 minimum: 1024
Jan 30 13:05:45.736675 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 30 13:05:45.736681 kernel: landlock: Up and running.
Jan 30 13:05:45.736688 kernel: SELinux: Initializing.
Jan 30 13:05:45.736694 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 30 13:05:45.736700 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 30 13:05:45.736706 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Jan 30 13:05:45.736712 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 30 13:05:45.736718 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 30 13:05:45.736724 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 30 13:05:45.736730 kernel: Performance Events: Skylake events, core PMU driver.
Jan 30 13:05:45.736736 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Jan 30 13:05:45.736743 kernel: core: CPUID marked event: 'instructions' unavailable
Jan 30 13:05:45.736749 kernel: core: CPUID marked event: 'bus cycles' unavailable
Jan 30 13:05:45.736755 kernel: core: CPUID marked event: 'cache references' unavailable
Jan 30 13:05:45.736760 kernel: core: CPUID marked event: 'cache misses' unavailable
Jan 30 13:05:45.736766 kernel: core: CPUID marked event: 'branch instructions' unavailable
Jan 30 13:05:45.736772 kernel: core: CPUID marked event: 'branch misses' unavailable
Jan 30 13:05:45.736777 kernel: ... version: 1
Jan 30 13:05:45.736783 kernel: ... bit width: 48
Jan 30 13:05:45.736790 kernel: ... generic registers: 4
Jan 30 13:05:45.736797 kernel: ... value mask: 0000ffffffffffff
Jan 30 13:05:45.736802 kernel: ...
max period: 000000007fffffff Jan 30 13:05:45.736808 kernel: ... fixed-purpose events: 0 Jan 30 13:05:45.736814 kernel: ... event mask: 000000000000000f Jan 30 13:05:45.736820 kernel: signal: max sigframe size: 1776 Jan 30 13:05:45.736826 kernel: rcu: Hierarchical SRCU implementation. Jan 30 13:05:45.736832 kernel: rcu: Max phase no-delay instances is 400. Jan 30 13:05:45.736838 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 30 13:05:45.736846 kernel: smp: Bringing up secondary CPUs ... Jan 30 13:05:45.736852 kernel: smpboot: x86: Booting SMP configuration: Jan 30 13:05:45.736858 kernel: .... node #0, CPUs: #1 Jan 30 13:05:45.736864 kernel: Disabled fast string operations Jan 30 13:05:45.736870 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 30 13:05:45.736876 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 30 13:05:45.736882 kernel: smp: Brought up 1 node, 2 CPUs Jan 30 13:05:45.736888 kernel: smpboot: Max logical packages: 128 Jan 30 13:05:45.736894 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 30 13:05:45.736900 kernel: devtmpfs: initialized Jan 30 13:05:45.736907 kernel: x86/mm: Memory block size: 128MB Jan 30 13:05:45.736913 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 30 13:05:45.736919 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 30 13:05:45.736925 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 30 13:05:45.736931 kernel: pinctrl core: initialized pinctrl subsystem Jan 30 13:05:45.736937 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 30 13:05:45.736943 kernel: audit: initializing netlink subsys (disabled) Jan 30 13:05:45.736949 kernel: audit: type=2000 audit(1738242344.068:1): state=initialized audit_enabled=0 res=1 Jan 30 13:05:45.736954 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 30 13:05:45.736962 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 30 13:05:45.736968 kernel: cpuidle: using governor menu Jan 30 13:05:45.736975 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 30 13:05:45.736981 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 30 13:05:45.736987 kernel: dca service started, version 1.12.1 Jan 30 13:05:45.736993 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 30 13:05:45.736999 kernel: PCI: Using configuration type 1 for base access Jan 30 13:05:45.737005 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 30 13:05:45.737011 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 30 13:05:45.737018 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 30 13:05:45.737024 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 30 13:05:45.737029 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 30 13:05:45.737035 kernel: ACPI: Added _OSI(Module Device) Jan 30 13:05:45.737041 kernel: ACPI: Added _OSI(Processor Device) Jan 30 13:05:45.737047 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 30 13:05:45.737053 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 30 13:05:45.737059 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 30 13:05:45.737065 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 30 13:05:45.737072 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 30 13:05:45.737078 kernel: ACPI: Interpreter enabled Jan 30 13:05:45.737084 kernel: ACPI: PM: (supports S0 S1 S5) Jan 30 13:05:45.737090 kernel: ACPI: Using IOAPIC for interrupt routing Jan 30 13:05:45.737096 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 30 13:05:45.737102 kernel: PCI: Using E820 reservations for host bridge windows Jan 30 13:05:45.737108 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 30 13:05:45.737115 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 30 13:05:45.737205 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 30 13:05:45.737267 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 30 13:05:45.737317 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 30 13:05:45.737326 kernel: PCI host bridge to bus 0000:00 Jan 30 13:05:45.737376 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 30 13:05:45.737422 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 30 13:05:45.737466 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 30 13:05:45.737512 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 30 13:05:45.737556 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 30 13:05:45.737600 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 30 13:05:45.737711 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 30 13:05:45.737768 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 30 13:05:45.737823 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 30 13:05:45.737881 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 30 13:05:45.737931 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 30 13:05:45.737981 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 30 13:05:45.738030 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 30 13:05:45.738079 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 30 13:05:45.738128 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 30 13:05:45.738182 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 30 13:05:45.738234 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 30 13:05:45.738283 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 30 13:05:45.738338 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 30 13:05:45.738388 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 30 13:05:45.738438 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 30 13:05:45.738491 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 30 13:05:45.738543 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 30 13:05:45.738592 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 30 13:05:45.738675 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 30 13:05:45.738726 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 30 13:05:45.738774 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 30 13:05:45.738829 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 30 13:05:45.738886 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.738939 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.738996 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739048 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739102 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739183 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739243 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739297 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739351 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739402 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739456 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739510 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 30 13:05:45.739565 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739684 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739743 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739793 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739846 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739898 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739954 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740008 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740061 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740110 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740162 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740212 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740264 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740316 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740369 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740418 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740470 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740520 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740573 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740635 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740690 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740739 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740793 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740843 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740899 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740952 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741005 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741055 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741109 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741162 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741217 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741268 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741324 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741374 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741501 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741554 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741607 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742064 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742125 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742177 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742236 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742288 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742342 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742393 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742448 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742500 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742554 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 30 
13:05:45.742604 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742673 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742725 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.744686 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.744749 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.744805 kernel: pci_bus 0000:01: extended config space not accessible Jan 30 13:05:45.744859 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 30 13:05:45.744912 kernel: pci_bus 0000:02: extended config space not accessible Jan 30 13:05:45.744921 kernel: acpiphp: Slot [32] registered Jan 30 13:05:45.744927 kernel: acpiphp: Slot [33] registered Jan 30 13:05:45.744936 kernel: acpiphp: Slot [34] registered Jan 30 13:05:45.744942 kernel: acpiphp: Slot [35] registered Jan 30 13:05:45.744947 kernel: acpiphp: Slot [36] registered Jan 30 13:05:45.744953 kernel: acpiphp: Slot [37] registered Jan 30 13:05:45.744959 kernel: acpiphp: Slot [38] registered Jan 30 13:05:45.744965 kernel: acpiphp: Slot [39] registered Jan 30 13:05:45.744971 kernel: acpiphp: Slot [40] registered Jan 30 13:05:45.744977 kernel: acpiphp: Slot [41] registered Jan 30 13:05:45.744982 kernel: acpiphp: Slot [42] registered Jan 30 13:05:45.744988 kernel: acpiphp: Slot [43] registered Jan 30 13:05:45.744996 kernel: acpiphp: Slot [44] registered Jan 30 13:05:45.745001 kernel: acpiphp: Slot [45] registered Jan 30 13:05:45.745007 kernel: acpiphp: Slot [46] registered Jan 30 13:05:45.745013 kernel: acpiphp: Slot [47] registered Jan 30 13:05:45.745018 kernel: acpiphp: Slot [48] registered Jan 30 13:05:45.745024 kernel: acpiphp: Slot [49] registered Jan 30 13:05:45.745030 kernel: acpiphp: Slot [50] registered Jan 30 13:05:45.745036 kernel: acpiphp: Slot [51] registered Jan 30 13:05:45.745042 kernel: acpiphp: Slot [52] registered Jan 30 13:05:45.745049 kernel: acpiphp: Slot [53] registered 
Jan 30 13:05:45.745055 kernel: acpiphp: Slot [54] registered Jan 30 13:05:45.745061 kernel: acpiphp: Slot [55] registered Jan 30 13:05:45.745066 kernel: acpiphp: Slot [56] registered Jan 30 13:05:45.745072 kernel: acpiphp: Slot [57] registered Jan 30 13:05:45.745078 kernel: acpiphp: Slot [58] registered Jan 30 13:05:45.745084 kernel: acpiphp: Slot [59] registered Jan 30 13:05:45.745090 kernel: acpiphp: Slot [60] registered Jan 30 13:05:45.745096 kernel: acpiphp: Slot [61] registered Jan 30 13:05:45.745101 kernel: acpiphp: Slot [62] registered Jan 30 13:05:45.745108 kernel: acpiphp: Slot [63] registered Jan 30 13:05:45.745159 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 30 13:05:45.745210 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 30 13:05:45.745260 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 30 13:05:45.745309 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 30 13:05:45.745359 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 30 13:05:45.745409 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 30 13:05:45.745461 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 30 13:05:45.745511 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 30 13:05:45.745561 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 30 13:05:45.745641 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 30 13:05:45.745695 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 30 13:05:45.745746 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 30 13:05:45.745796 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 30 13:05:45.745847 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 30 
13:05:45.745900 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 30 13:05:45.745953 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 30 13:05:45.746004 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 30 13:05:45.746054 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 30 13:05:45.746105 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 30 13:05:45.746156 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 30 13:05:45.746205 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 30 13:05:45.746258 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 30 13:05:45.746311 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 30 13:05:45.746360 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 30 13:05:45.746410 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 30 13:05:45.746460 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 30 13:05:45.746512 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 30 13:05:45.746563 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 30 13:05:45.746612 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 30 13:05:45.746982 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 30 13:05:45.747034 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 30 13:05:45.747084 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 30 13:05:45.747137 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 30 13:05:45.747194 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 30 13:05:45.747244 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 30 13:05:45.747295 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 30 13:05:45.747345 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 30 13:05:45.747394 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 30 13:05:45.747446 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 30 13:05:45.747494 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 30 13:05:45.747583 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 30 13:05:45.748708 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 30 13:05:45.748769 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 30 13:05:45.748824 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 30 13:05:45.748875 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 30 13:05:45.748926 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 30 13:05:45.748977 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 30 13:05:45.749028 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 30 13:05:45.749085 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 30 13:05:45.749136 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 30 13:05:45.749216 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 30 13:05:45.749268 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 30 13:05:45.749319 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 30 13:05:45.749370 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 30 13:05:45.749422 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 30 13:05:45.749471 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 30 13:05:45.749525 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 30 13:05:45.749579 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 30 13:05:45.749653 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 30 13:05:45.749752 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 30 13:05:45.753753 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 30 13:05:45.753822 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 30 13:05:45.753874 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 30 13:05:45.753926 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 30 13:05:45.753983 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 30 13:05:45.754033 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 30 13:05:45.754082 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 30 13:05:45.754134 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 30 13:05:45.754184 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 30 13:05:45.754234 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 30 13:05:45.754285 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 30 13:05:45.754334 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 30 13:05:45.754386 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 30 13:05:45.754438 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 30 13:05:45.754487 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 30 13:05:45.754536 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 30 13:05:45.754589 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 30 13:05:45.754655 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 30 13:05:45.754706 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 30 13:05:45.754755 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 30 13:05:45.754812 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 30 13:05:45.754861 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 30 13:05:45.754909 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 30 13:05:45.754960 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 30 13:05:45.755012 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 30 13:05:45.755062 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 30 13:05:45.755112 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 30 13:05:45.755169 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 30 13:05:45.755222 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 30 13:05:45.755272 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 30 13:05:45.755321 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 30 13:05:45.755373 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 30 13:05:45.755423 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 30 13:05:45.755472 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 30 13:05:45.755524 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 30 13:05:45.755577 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 30 13:05:45.758491 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 30 13:05:45.758559 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 30 13:05:45.758611 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 30 13:05:45.758673 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 30 13:05:45.758726 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 30 13:05:45.758776 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 30 13:05:45.758827 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 30 13:05:45.758883 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 30 13:05:45.758933 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 30 13:05:45.758983 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 30 13:05:45.759033 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 30 13:05:45.759085 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 30 13:05:45.759135 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 30 13:05:45.759197 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 30 13:05:45.759247 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 30 13:05:45.759302 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 30 13:05:45.759353 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 30 13:05:45.759403 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 30 13:05:45.759455 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 30 13:05:45.759505 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 30 13:05:45.759555 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 30 13:05:45.759606 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 30 
13:05:45.759663 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Jan 30 13:05:45.759716 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Jan 30 13:05:45.759768 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Jan 30 13:05:45.759819 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Jan 30 13:05:45.759868 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Jan 30 13:05:45.759920 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Jan 30 13:05:45.759969 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Jan 30 13:05:45.760019 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Jan 30 13:05:45.760071 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Jan 30 13:05:45.760125 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Jan 30 13:05:45.760175 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Jan 30 13:05:45.760183 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Jan 30 13:05:45.760190 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Jan 30 13:05:45.760196 kernel: ACPI: PCI: Interrupt link LNKB disabled
Jan 30 13:05:45.760202 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 30 13:05:45.760208 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Jan 30 13:05:45.760214 kernel: iommu: Default domain type: Translated
Jan 30 13:05:45.760222 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 30 13:05:45.760228 kernel: PCI: Using ACPI for IRQ routing
Jan 30 13:05:45.760234 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 30 13:05:45.760240 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Jan 30 13:05:45.760246 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Jan 30 13:05:45.760295 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Jan 30 13:05:45.760345 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Jan 30 13:05:45.760394 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 30 13:05:45.760403 kernel: vgaarb: loaded
Jan 30 13:05:45.760411 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Jan 30 13:05:45.760417 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Jan 30 13:05:45.760422 kernel: clocksource: Switched to clocksource tsc-early
Jan 30 13:05:45.760429 kernel: VFS: Disk quotas dquot_6.6.0
Jan 30 13:05:45.760435 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 30 13:05:45.760440 kernel: pnp: PnP ACPI init
Jan 30 13:05:45.760494 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Jan 30 13:05:45.760542 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Jan 30 13:05:45.760590 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Jan 30 13:05:45.760948 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Jan 30 13:05:45.761002 kernel: pnp 00:06: [dma 2]
Jan 30 13:05:45.761053 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Jan 30 13:05:45.761100 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Jan 30 13:05:45.761146 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Jan 30 13:05:45.761155 kernel: pnp: PnP ACPI: found 8 devices
Jan 30 13:05:45.761166 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 30 13:05:45.761172 kernel: NET: Registered PF_INET protocol family
Jan 30 13:05:45.761178 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 30 13:05:45.761184 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 30 13:05:45.761190 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 30 13:05:45.761196 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 30 13:05:45.761202 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 30 13:05:45.761208 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 30 13:05:45.761214 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 30 13:05:45.761221 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 30 13:05:45.761227 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 30 13:05:45.761233 kernel: NET: Registered PF_XDP protocol family
Jan 30 13:05:45.761287 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Jan 30 13:05:45.761340 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 30 13:05:45.761394 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 30 13:05:45.761447 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 30 13:05:45.761502 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 30 13:05:45.761554 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 30 13:05:45.761606 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 30 13:05:45.763673 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 30 13:05:45.763740 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 30 13:05:45.763802 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Jan 30 13:05:45.763860 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Jan 30 13:05:45.763913 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Jan 30 13:05:45.763966 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Jan 30 13:05:45.764019 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Jan 30 13:05:45.764070 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Jan 30 13:05:45.764124 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Jan 30 13:05:45.764177 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Jan 30 13:05:45.764228 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Jan 30 13:05:45.764332 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Jan 30 13:05:45.764386 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Jan 30 13:05:45.764437 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Jan 30 13:05:45.764491 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Jan 30 13:05:45.764543 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Jan 30 13:05:45.764595 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 30 13:05:45.764716 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 30 13:05:45.764769 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.764819 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.764871 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.764924 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.764976 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765026 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765077 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765128 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765185 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765236 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765285 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765339 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765390 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765440 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765491 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765541 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765592 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765650 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765701 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765753 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765804 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765854 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.765905 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.765954 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.766006 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.766056 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.766106 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.766159 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.766209 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.766260 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.766311 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.766361 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.766412 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.766462 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.766512 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.766562 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.767654 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.767724 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.767780 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.767831 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.767883 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.767934 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.767984 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768033 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768087 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768135 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768190 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768239 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768289 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768338 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768388 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768436 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768487 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768539 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768589 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768645 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768696 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768746 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768795 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768846 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.768896 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.768946 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.769012 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.769067 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.769118 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.769174 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.769224 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.769272 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.769323 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.769372 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.769423 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.769472 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.769526 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.769575 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.771140 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.771218 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.771281 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.771335 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.771388 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.771439 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.771491 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.771540 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.771596 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Jan 30 13:05:45.771670 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Jan 30 13:05:45.771724 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 30 13:05:45.771776 kernel: pci 0000:00:11.0: PCI bridge to [bus 02]
Jan 30 13:05:45.771828 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Jan 30 13:05:45.771878 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Jan 30 13:05:45.771928 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jan 30 13:05:45.771983 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref]
Jan 30 13:05:45.772039 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Jan 30 13:05:45.772091 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Jan 30 13:05:45.772142 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Jan 30 13:05:45.772192 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 30 13:05:45.772244 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Jan 30 13:05:45.772295 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Jan 30 13:05:45.772346 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Jan 30 13:05:45.772396 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Jan 30 13:05:45.772450 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Jan 30 13:05:45.772503 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Jan 30 13:05:45.772553 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Jan 30 13:05:45.772603 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Jan 30 13:05:45.772726 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Jan 30 13:05:45.772778 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Jan 30 13:05:45.772828 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Jan 30 13:05:45.772878 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Jan 30 13:05:45.772928 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Jan 30 13:05:45.772978 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Jan 30 13:05:45.773032 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Jan 30 13:05:45.773082 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Jan 30 13:05:45.773132 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Jan 30 13:05:45.773182 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Jan 30 13:05:45.773232 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Jan 30 13:05:45.773285 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Jan 30 13:05:45.773338 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Jan 30 13:05:45.773388 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Jan 30 13:05:45.773438 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Jan 30 13:05:45.773493 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref]
Jan 30 13:05:45.773544 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Jan 30 13:05:45.773595 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Jan 30 13:05:45.775185 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Jan 30 13:05:45.775244 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 30 13:05:45.775299 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Jan 30 13:05:45.775352 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Jan 30 13:05:45.775408 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Jan 30 13:05:45.775459 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Jan 30 13:05:45.775512 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Jan 30 13:05:45.775563 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Jan 30 13:05:45.775630 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Jan 30 13:05:45.775689 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Jan 30 13:05:45.775743 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Jan 30 13:05:45.775793 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Jan 30 13:05:45.775843 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Jan 30 13:05:45.775899 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Jan 30 13:05:45.775950 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Jan 30 13:05:45.776000 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Jan 30 13:05:45.776052 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Jan 30 13:05:45.776102 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Jan 30 13:05:45.776153 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Jan 30 13:05:45.776205 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Jan 30 13:05:45.776255 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Jan 30 13:05:45.776305 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Jan 30 13:05:45.776357 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Jan 30 13:05:45.776410 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Jan 30 13:05:45.776461 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Jan 30 13:05:45.776512 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Jan 30 13:05:45.776563 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Jan 30 13:05:45.776612 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Jan 30 13:05:45.777678 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Jan 30 13:05:45.777733 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Jan 30 13:05:45.777784 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Jan 30 13:05:45.777847 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Jan 30 13:05:45.777902 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Jan 30 13:05:45.777955 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Jan 30 13:05:45.778005 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Jan 30 13:05:45.778056 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Jan 30 13:05:45.778105 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Jan 30 13:05:45.778157 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Jan 30 13:05:45.778207 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Jan 30 13:05:45.778257 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Jan 30 13:05:45.778310 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Jan 30 13:05:45.778364 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Jan 30 13:05:45.778414 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Jan 30 13:05:45.778465 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Jan 30 13:05:45.778516 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Jan 30 13:05:45.778566 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Jan 30 13:05:45.778628 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Jan 30 13:05:45.778683 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Jan 30 13:05:45.778734 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Jan 30 13:05:45.778787 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Jan 30 13:05:45.778838 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Jan 30 13:05:45.778892 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Jan 30 13:05:45.778944 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Jan 30 13:05:45.778994 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Jan 30 13:05:45.779044 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Jan 30 13:05:45.779109 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Jan 30 13:05:45.779180 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Jan 30 13:05:45.779232 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Jan 30 13:05:45.779283 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Jan 30 13:05:45.779333 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Jan 30 13:05:45.779389 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Jan 30 13:05:45.779440 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Jan 30 13:05:45.779491 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Jan 30 13:05:45.779544 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Jan 30 13:05:45.779595 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Jan 30 13:05:45.779662 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Jan 30 13:05:45.779717 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Jan 30 13:05:45.779767 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Jan 30 13:05:45.779818 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Jan 30 13:05:45.779870 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Jan 30 13:05:45.779925 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Jan 30 13:05:45.779976 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Jan 30 13:05:45.780029 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Jan 30 13:05:45.780080 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Jan 30 13:05:45.780130 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Jan 30 13:05:45.780182 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Jan 30 13:05:45.780232 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Jan 30 13:05:45.780283 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Jan 30 13:05:45.780334 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window]
Jan 30 13:05:45.780383 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window]
Jan 30 13:05:45.780428 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window]
Jan 30 13:05:45.780472 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window]
Jan 30 13:05:45.780517 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window]
Jan 30 13:05:45.780567 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff]
Jan 30 13:05:45.780657 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff]
Jan 30 13:05:45.780711 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jan 30 13:05:45.780757 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window]
Jan 30 13:05:45.780805 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window]
Jan 30 13:05:45.780851 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window]
Jan 30 13:05:45.780896 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window]
Jan 30 13:05:45.780941 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window]
Jan 30 13:05:45.780993 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff]
Jan 30 13:05:45.781039 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff]
Jan 30 13:05:45.781085 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 30 13:05:45.781138 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff]
Jan 30 13:05:45.781471 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff]
Jan 30 13:05:45.781525 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref]
Jan 30 13:05:45.781577 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff]
Jan 30 13:05:45.782770 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff]
Jan 30 13:05:45.782824 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref]
Jan 30 13:05:45.782877 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff]
Jan 30 13:05:45.782929 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref]
Jan 30 13:05:45.782980 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff]
Jan 30 13:05:45.783026 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref]
Jan 30 13:05:45.783076 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff]
Jan 30 13:05:45.783122 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref]
Jan 30 13:05:45.783172 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff]
Jan 30 13:05:45.783220 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref]
Jan 30 13:05:45.783273 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff]
Jan 30 13:05:45.783329 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref]
Jan 30 13:05:45.783381 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff]
Jan 30 13:05:45.783427 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff]
Jan 30 13:05:45.783475 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 30 13:05:45.783527 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff]
Jan 30 13:05:45.783573 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff]
Jan 30 13:05:45.783628 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref]
Jan 30 13:05:45.784057 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff]
Jan 30 13:05:45.784108 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff]
Jan 30 13:05:45.784160 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref]
Jan 30 13:05:45.784217 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff]
Jan 30 13:05:45.784266 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref]
Jan 30 13:05:45.784316 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff]
Jan 30 13:05:45.784365 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref]
Jan 30 13:05:45.784415 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff]
Jan 30 13:05:45.784462 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref]
Jan 30 13:05:45.784513 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff]
Jan 30 13:05:45.784563 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref]
Jan 30 13:05:45.784886 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff]
Jan 30 13:05:45.784951 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref]
Jan 30 13:05:45.785008 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff]
Jan 30 13:05:45.785057 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff]
Jan 30 13:05:45.785104 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref]
Jan 30 13:05:45.785173 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff]
Jan 30 13:05:45.785452 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff]
Jan 30 13:05:45.785506 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref]
Jan 30 13:05:45.785559 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff]
Jan 30 13:05:45.785608 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff]
Jan 30 13:05:45.785678 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref]
Jan 30 13:05:45.785744 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff]
Jan 30 13:05:45.786030 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref]
Jan 30 13:05:45.786090 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff]
Jan 30 13:05:45.786138 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref]
Jan 30 13:05:45.786191 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff]
Jan 30 13:05:45.786239 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref]
Jan 30 13:05:45.786291 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff]
Jan 30 13:05:45.786341 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref]
Jan 30 13:05:45.786392 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff]
Jan 30 13:05:45.786439 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref]
Jan 30 13:05:45.786494 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff]
Jan 30 13:05:45.786541 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff]
Jan 30 13:05:45.786591 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref]
Jan 30 13:05:45.786690 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff]
Jan 30 13:05:45.786738 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff]
Jan 30 13:05:45.786784 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref]
Jan 30 13:05:45.786834 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff]
Jan 30 13:05:45.786883 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref]
Jan 30 13:05:45.786934 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff]
Jan 30 13:05:45.786984 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref]
Jan 30 13:05:45.787038 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff]
Jan 30 13:05:45.787085 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref]
Jan 30 13:05:45.787361 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff]
Jan 30 13:05:45.787413 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref]
Jan 30 13:05:45.787466 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff]
Jan 30 13:05:45.787516 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref]
Jan 30 13:05:45.787569 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff]
Jan 30 13:05:45.787632 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref]
Jan 30 13:05:45.787697 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 30 13:05:45.787708 kernel: PCI: CLS 32 bytes, default 64
Jan 30 13:05:45.787715 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Jan 30 13:05:45.787722 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Jan 30 13:05:45.787731 kernel: clocksource: Switched to clocksource tsc
Jan 30 13:05:45.787737 kernel: Initialise system trusted keyrings
Jan 30 13:05:45.787743 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Jan 30 13:05:45.787749 kernel: Key type asymmetric registered
Jan 30 13:05:45.787755 kernel: Asymmetric key parser 'x509' registered
Jan 30 13:05:45.787761 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Jan 30 13:05:45.787768 kernel: io scheduler mq-deadline registered
Jan 30 13:05:45.787774 kernel: io scheduler kyber registered
Jan 30 13:05:45.787781 kernel: io scheduler bfq registered
Jan 30 13:05:45.787838 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24
Jan 30 13:05:45.787896 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.787950 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25
Jan 30 13:05:45.788002 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788057 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26
Jan 30 13:05:45.788114 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788170 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27
Jan 30 13:05:45.788222 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788277 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28
Jan 30 13:05:45.788329 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788381 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29
Jan 30 13:05:45.788434 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788488 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30
Jan 30 13:05:45.788542 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788595 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31
Jan 30 13:05:45.788712 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788767 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32
Jan 30 13:05:45.788820 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788874 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33
Jan 30 13:05:45.788930 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.788985 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34
Jan 30 13:05:45.789039 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.789092 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35
Jan 30 13:05:45.789145 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.789210 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36
Jan 30 13:05:45.789266 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.789320 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37
Jan 30 13:05:45.789373 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.789427 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38
Jan 30 13:05:45.789479 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.789536 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39
Jan 30 13:05:45.789588 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.790145 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40
Jan 30 13:05:45.790207 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.790262 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41
Jan 30 13:05:45.790315 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.790368 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42
Jan 30 13:05:45.790424 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.790477 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43
Jan 30 13:05:45.790529 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.790582 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44
Jan 30 13:05:45.790656 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.790713 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45
Jan 30 13:05:45.790769 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.790823 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46
Jan 30 13:05:45.790876 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.790929 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47
Jan 30 13:05:45.790981 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.791037 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48
Jan 30 13:05:45.791089 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.791143 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49
Jan 30 13:05:45.791196 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.791249 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50
Jan 30 13:05:45.791302 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.791358 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51
Jan 30 13:05:45.791410 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.791464 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52
Jan 30 13:05:45.791517 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.791570 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53
Jan 30 13:05:45.791665 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.791722 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54
Jan 30 13:05:45.793657 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.793725 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55
Jan 30 13:05:45.793782 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Jan 30 13:05:45.793792 kernel: ioatdma: Intel(R) QuickData Technology Driver
5.00 Jan 30 13:05:45.793801 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 30 13:05:45.793808 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 30 13:05:45.793814 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 30 13:05:45.793820 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 30 13:05:45.793827 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 30 13:05:45.793880 kernel: rtc_cmos 00:01: registered as rtc0 Jan 30 13:05:45.793929 kernel: rtc_cmos 00:01: setting system clock to 2025-01-30T13:05:45 UTC (1738242345) Jan 30 13:05:45.793975 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 30 13:05:45.793987 kernel: intel_pstate: CPU model not supported Jan 30 13:05:45.793993 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 30 13:05:45.794000 kernel: NET: Registered PF_INET6 protocol family Jan 30 13:05:45.794006 kernel: Segment Routing with IPv6 Jan 30 13:05:45.794013 kernel: In-situ OAM (IOAM) with IPv6 Jan 30 13:05:45.794019 kernel: NET: Registered PF_PACKET protocol family Jan 30 13:05:45.794025 kernel: Key type dns_resolver registered Jan 30 13:05:45.794031 kernel: IPI shorthand broadcast: enabled Jan 30 13:05:45.794038 kernel: sched_clock: Marking stable (908067637, 228752517)->(1196418366, -59598212) Jan 30 13:05:45.794045 kernel: registered taskstats version 1 Jan 30 13:05:45.794051 kernel: Loading compiled-in X.509 certificates Jan 30 13:05:45.794058 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: 7f0738935740330d55027faa5877e7155d5f24f4' Jan 30 13:05:45.794064 kernel: Key type .fscrypt registered Jan 30 13:05:45.794070 kernel: Key type fscrypt-provisioning registered Jan 30 13:05:45.794076 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 30 13:05:45.794082 kernel: ima: Allocated hash algorithm: sha1 Jan 30 13:05:45.794089 kernel: ima: No architecture policies found Jan 30 13:05:45.794095 kernel: clk: Disabling unused clocks Jan 30 13:05:45.794102 kernel: Freeing unused kernel image (initmem) memory: 43320K Jan 30 13:05:45.794109 kernel: Write protecting the kernel read-only data: 38912k Jan 30 13:05:45.794117 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K Jan 30 13:05:45.794123 kernel: Run /init as init process Jan 30 13:05:45.794130 kernel: with arguments: Jan 30 13:05:45.794136 kernel: /init Jan 30 13:05:45.794142 kernel: with environment: Jan 30 13:05:45.794148 kernel: HOME=/ Jan 30 13:05:45.794154 kernel: TERM=linux Jan 30 13:05:45.794162 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 30 13:05:45.794170 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 30 13:05:45.794177 systemd[1]: Detected virtualization vmware. Jan 30 13:05:45.794184 systemd[1]: Detected architecture x86-64. Jan 30 13:05:45.794190 systemd[1]: Running in initrd. Jan 30 13:05:45.794196 systemd[1]: No hostname configured, using default hostname. Jan 30 13:05:45.794203 systemd[1]: Hostname set to . Jan 30 13:05:45.794210 systemd[1]: Initializing machine ID from random generator. Jan 30 13:05:45.794217 systemd[1]: Queued start job for default target initrd.target. Jan 30 13:05:45.794223 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 30 13:05:45.794230 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 30 13:05:45.794237 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 30 13:05:45.794244 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 30 13:05:45.794251 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 30 13:05:45.794257 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 30 13:05:45.794266 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 30 13:05:45.794273 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 30 13:05:45.794279 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 30 13:05:45.794286 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 30 13:05:45.794292 systemd[1]: Reached target paths.target - Path Units. Jan 30 13:05:45.794299 systemd[1]: Reached target slices.target - Slice Units. Jan 30 13:05:45.794305 systemd[1]: Reached target swap.target - Swaps. Jan 30 13:05:45.794313 systemd[1]: Reached target timers.target - Timer Units. Jan 30 13:05:45.794320 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 30 13:05:45.794326 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 30 13:05:45.794333 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 30 13:05:45.794339 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 30 13:05:45.794345 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 30 13:05:45.794352 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 30 13:05:45.794359 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 30 13:05:45.794365 systemd[1]: Reached target sockets.target - Socket Units. Jan 30 13:05:45.794373 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 30 13:05:45.794380 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 30 13:05:45.794386 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 30 13:05:45.794392 systemd[1]: Starting systemd-fsck-usr.service... Jan 30 13:05:45.794399 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 30 13:05:45.794405 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 30 13:05:45.794412 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 30 13:05:45.794418 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 30 13:05:45.794437 systemd-journald[217]: Collecting audit messages is disabled. Jan 30 13:05:45.794456 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 30 13:05:45.794463 systemd[1]: Finished systemd-fsck-usr.service. Jan 30 13:05:45.794471 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 30 13:05:45.794478 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 13:05:45.794485 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 30 13:05:45.794491 kernel: Bridge firewalling registered Jan 30 13:05:45.794498 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 30 13:05:45.794504 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 30 13:05:45.794513 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 30 13:05:45.794519 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 30 13:05:45.794526 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 30 13:05:45.794532 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 13:05:45.794540 systemd-journald[217]: Journal started Jan 30 13:05:45.794554 systemd-journald[217]: Runtime Journal (/run/log/journal/50d88166ddb3455f8851ee758043dcca) is 4.8M, max 38.6M, 33.8M free. Jan 30 13:05:45.751245 systemd-modules-load[218]: Inserted module 'overlay' Jan 30 13:05:45.797147 systemd[1]: Started systemd-journald.service - Journal Service. Jan 30 13:05:45.770817 systemd-modules-load[218]: Inserted module 'br_netfilter' Jan 30 13:05:45.799889 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 30 13:05:45.800103 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 30 13:05:45.802046 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 30 13:05:45.802791 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 30 13:05:45.809534 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 30 13:05:45.811190 dracut-cmdline[246]: dracut-dracut-053 Jan 30 13:05:45.814955 dracut-cmdline[246]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=fe60919b0c6f6abb7495678f87f7024e97a038fc343fa31a123a43ef5f489466 Jan 30 13:05:45.814101 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 30 13:05:45.832625 systemd-resolved[258]: Positive Trust Anchors: Jan 30 13:05:45.832841 systemd-resolved[258]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 30 13:05:45.832865 systemd-resolved[258]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 30 13:05:45.835849 systemd-resolved[258]: Defaulting to hostname 'linux'. Jan 30 13:05:45.836436 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 30 13:05:45.836588 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 30 13:05:45.866644 kernel: SCSI subsystem initialized Jan 30 13:05:45.873632 kernel: Loading iSCSI transport class v2.0-870. Jan 30 13:05:45.879628 kernel: iscsi: registered transport (tcp) Jan 30 13:05:45.894999 kernel: iscsi: registered transport (qla4xxx) Jan 30 13:05:45.895048 kernel: QLogic iSCSI HBA Driver Jan 30 13:05:45.915319 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 30 13:05:45.919715 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 30 13:05:45.934880 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 30 13:05:45.934948 kernel: device-mapper: uevent: version 1.0.3 Jan 30 13:05:45.935996 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 30 13:05:45.970641 kernel: raid6: avx2x4 gen() 46804 MB/s Jan 30 13:05:45.984636 kernel: raid6: avx2x2 gen() 53018 MB/s Jan 30 13:05:46.001856 kernel: raid6: avx2x1 gen() 44538 MB/s Jan 30 13:05:46.001894 kernel: raid6: using algorithm avx2x2 gen() 53018 MB/s Jan 30 13:05:46.020890 kernel: raid6: .... xor() 32167 MB/s, rmw enabled Jan 30 13:05:46.020938 kernel: raid6: using avx2x2 recovery algorithm Jan 30 13:05:46.034635 kernel: xor: automatically using best checksumming function avx Jan 30 13:05:46.125640 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 30 13:05:46.130599 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 30 13:05:46.135730 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 30 13:05:46.143436 systemd-udevd[435]: Using default interface naming scheme 'v255'. Jan 30 13:05:46.145948 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 30 13:05:46.151770 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 30 13:05:46.158791 dracut-pre-trigger[440]: rd.md=0: removing MD RAID activation Jan 30 13:05:46.174472 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 30 13:05:46.178782 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 30 13:05:46.248938 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 30 13:05:46.251736 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 30 13:05:46.264262 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 30 13:05:46.265049 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 30 13:05:46.265166 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 30 13:05:46.265284 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 30 13:05:46.269743 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 30 13:05:46.280285 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 30 13:05:46.318977 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 30 13:05:46.319013 kernel: vmw_pvscsi: using 64bit dma Jan 30 13:05:46.323631 kernel: vmw_pvscsi: max_id: 16 Jan 30 13:05:46.323661 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 30 13:05:46.325943 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 30 13:05:46.325969 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 30 13:05:46.325982 kernel: vmw_pvscsi: using MSI-X Jan 30 13:05:46.329634 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 30 13:05:46.334676 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 30 13:05:46.342629 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 30 13:05:46.345607 kernel: cryptd: max_cpu_qlen set to 1000 Jan 30 13:05:46.345626 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 30 13:05:46.345649 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 30 13:05:46.351922 kernel: libata version 3.00 loaded. 
Jan 30 13:05:46.351933 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 30 13:05:46.354679 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 30 13:05:46.361625 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 30 13:05:46.367612 kernel: scsi host1: ata_piix Jan 30 13:05:46.367722 kernel: scsi host2: ata_piix Jan 30 13:05:46.367791 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 30 13:05:46.367801 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 30 13:05:46.367812 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 30 13:05:46.383510 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 30 13:05:46.383592 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 30 13:05:46.383674 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 30 13:05:46.383737 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 30 13:05:46.383800 kernel: AVX2 version of gcm_enc/dec engaged. Jan 30 13:05:46.383809 kernel: AES CTR mode by8 optimization enabled Jan 30 13:05:46.383817 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 30 13:05:46.383828 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 30 13:05:46.360804 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 30 13:05:46.360874 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 13:05:46.361053 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 30 13:05:46.361145 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 30 13:05:46.361211 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 13:05:46.361314 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 30 13:05:46.367797 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 30 13:05:46.387828 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 13:05:46.396700 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 30 13:05:46.406849 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 13:05:46.536690 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 30 13:05:46.539659 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 30 13:05:46.561631 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 30 13:05:46.575387 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 30 13:05:46.575398 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 30 13:05:46.605351 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 30 13:05:46.606173 kernel: BTRFS: device fsid f8084233-4a6f-4e67-af0b-519e43b19e58 devid 1 transid 41 /dev/sda3 scanned by (udev-worker) (489) Jan 30 13:05:46.609637 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (490) Jan 30 13:05:46.609513 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 30 13:05:46.613049 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 30 13:05:46.613327 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 30 13:05:46.616142 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 30 13:05:46.622699 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 30 13:05:46.645633 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 30 13:05:46.649634 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 30 13:05:47.651643 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 30 13:05:47.652602 disk-uuid[595]: The operation has completed successfully. Jan 30 13:05:47.737806 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 30 13:05:47.737877 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 30 13:05:47.742728 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 30 13:05:47.747167 sh[613]: Success Jan 30 13:05:47.757693 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 30 13:05:47.804102 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 30 13:05:47.809604 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 30 13:05:47.810057 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 30 13:05:47.827079 kernel: BTRFS info (device dm-0): first mount of filesystem f8084233-4a6f-4e67-af0b-519e43b19e58 Jan 30 13:05:47.827124 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 30 13:05:47.827133 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 30 13:05:47.829018 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 30 13:05:47.829041 kernel: BTRFS info (device dm-0): using free space tree Jan 30 13:05:47.837632 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 30 13:05:47.839029 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 30 13:05:47.852782 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 30 13:05:47.854263 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 30 13:05:47.873340 kernel: BTRFS info (device sda6): first mount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:47.873376 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 30 13:05:47.873385 kernel: BTRFS info (device sda6): using free space tree Jan 30 13:05:47.878631 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 30 13:05:47.889111 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 30 13:05:47.889636 kernel: BTRFS info (device sda6): last unmount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:47.895365 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 30 13:05:47.901699 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 30 13:05:47.924756 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 30 13:05:47.929774 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Jan 30 13:05:47.976780 ignition[672]: Ignition 2.20.0 Jan 30 13:05:47.976803 ignition[672]: Stage: fetch-offline Jan 30 13:05:47.976823 ignition[672]: no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:47.976828 ignition[672]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:47.976883 ignition[672]: parsed url from cmdline: "" Jan 30 13:05:47.976884 ignition[672]: no config URL provided Jan 30 13:05:47.976887 ignition[672]: reading system config file "/usr/lib/ignition/user.ign" Jan 30 13:05:47.976891 ignition[672]: no config at "/usr/lib/ignition/user.ign" Jan 30 13:05:47.977253 ignition[672]: config successfully fetched Jan 30 13:05:47.977270 ignition[672]: parsing config with SHA512: 97edc499e6ba7c4b478a3e418684fc44dcbf50f8eabd58698f6592e97a2a21e21bbb280b9fbd36866ff7a96673efd2d0b7370b0d2d67ceb5638e6d261b72e8b7 Jan 30 13:05:47.982082 unknown[672]: fetched base config from "system" Jan 30 13:05:47.982089 unknown[672]: fetched user config from "vmware" Jan 30 13:05:47.982586 ignition[672]: fetch-offline: fetch-offline passed Jan 30 13:05:47.982670 ignition[672]: Ignition finished successfully Jan 30 13:05:47.983511 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 30 13:05:47.994302 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 30 13:05:47.998730 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 30 13:05:48.010451 systemd-networkd[806]: lo: Link UP Jan 30 13:05:48.010742 systemd-networkd[806]: lo: Gained carrier Jan 30 13:05:48.011461 systemd-networkd[806]: Enumeration completed Jan 30 13:05:48.011669 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 30 13:05:48.011897 systemd[1]: Reached target network.target - Network. Jan 30 13:05:48.012095 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). 
Jan 30 13:05:48.015273 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 30 13:05:48.015394 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 30 13:05:48.012355 systemd-networkd[806]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jan 30 13:05:48.015884 systemd-networkd[806]: ens192: Link UP Jan 30 13:05:48.015887 systemd-networkd[806]: ens192: Gained carrier Jan 30 13:05:48.022777 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 30 13:05:48.030375 ignition[808]: Ignition 2.20.0 Jan 30 13:05:48.030383 ignition[808]: Stage: kargs Jan 30 13:05:48.030485 ignition[808]: no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:48.030491 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:48.031009 ignition[808]: kargs: kargs passed Jan 30 13:05:48.031034 ignition[808]: Ignition finished successfully Jan 30 13:05:48.032116 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 30 13:05:48.035733 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 30 13:05:48.042728 ignition[816]: Ignition 2.20.0 Jan 30 13:05:48.042739 ignition[816]: Stage: disks Jan 30 13:05:48.042875 ignition[816]: no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:48.042881 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:48.043946 ignition[816]: disks: disks passed Jan 30 13:05:48.043989 ignition[816]: Ignition finished successfully Jan 30 13:05:48.044755 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 30 13:05:48.045106 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 30 13:05:48.045351 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 30 13:05:48.045602 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 30 13:05:48.045836 systemd[1]: Reached target sysinit.target - System Initialization. 
Jan 30 13:05:48.046060 systemd[1]: Reached target basic.target - Basic System. Jan 30 13:05:48.049695 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 30 13:05:48.337836 systemd-fsck[824]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 30 13:05:48.341108 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 30 13:05:48.351715 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 30 13:05:48.497673 kernel: EXT4-fs (sda9): mounted filesystem cdc615db-d057-439f-af25-aa57b1c399e2 r/w with ordered data mode. Quota mode: none. Jan 30 13:05:48.498488 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 30 13:05:48.499056 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 30 13:05:48.515708 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 30 13:05:48.524195 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 30 13:05:48.524643 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 30 13:05:48.524828 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 30 13:05:48.524845 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 30 13:05:48.528386 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 30 13:05:48.531728 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 30 13:05:48.600637 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (832) Jan 30 13:05:48.619701 kernel: BTRFS info (device sda6): first mount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:48.619745 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 30 13:05:48.619754 kernel: BTRFS info (device sda6): using free space tree Jan 30 13:05:48.688635 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 30 13:05:48.693709 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 30 13:05:48.717196 initrd-setup-root[856]: cut: /sysroot/etc/passwd: No such file or directory Jan 30 13:05:48.722413 initrd-setup-root[863]: cut: /sysroot/etc/group: No such file or directory Jan 30 13:05:48.726391 initrd-setup-root[870]: cut: /sysroot/etc/shadow: No such file or directory Jan 30 13:05:48.728678 initrd-setup-root[877]: cut: /sysroot/etc/gshadow: No such file or directory Jan 30 13:05:48.831304 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 30 13:05:48.835709 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 30 13:05:48.838139 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 30 13:05:48.841183 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 30 13:05:48.841625 kernel: BTRFS info (device sda6): last unmount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:48.857870 ignition[945]: INFO : Ignition 2.20.0 Jan 30 13:05:48.857870 ignition[945]: INFO : Stage: mount Jan 30 13:05:48.858453 ignition[945]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:48.858453 ignition[945]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:48.859198 ignition[945]: INFO : mount: mount passed Jan 30 13:05:48.859348 ignition[945]: INFO : Ignition finished successfully Jan 30 13:05:48.859321 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jan 30 13:05:48.860183 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 30 13:05:48.863812 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 30 13:05:48.868275 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 30 13:05:48.975640 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (957)
Jan 30 13:05:48.986812 kernel: BTRFS info (device sda6): first mount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c
Jan 30 13:05:48.986845 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 30 13:05:48.986854 kernel: BTRFS info (device sda6): using free space tree
Jan 30 13:05:49.042719 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 30 13:05:49.047414 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 30 13:05:49.065527 ignition[974]: INFO : Ignition 2.20.0
Jan 30 13:05:49.065527 ignition[974]: INFO : Stage: files
Jan 30 13:05:49.066069 ignition[974]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 30 13:05:49.066069 ignition[974]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 30 13:05:49.066283 ignition[974]: DEBUG : files: compiled without relabeling support, skipping
Jan 30 13:05:49.075878 ignition[974]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 30 13:05:49.075878 ignition[974]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 30 13:05:49.114157 ignition[974]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 30 13:05:49.114514 ignition[974]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 30 13:05:49.114911 unknown[974]: wrote ssh authorized keys file for user: core
Jan 30 13:05:49.115260 ignition[974]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 30 13:05:49.140637 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 30 13:05:49.141206 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jan 30 13:05:49.175740 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 30 13:05:49.209976 systemd-networkd[806]: ens192: Gained IPv6LL
Jan 30 13:05:49.262744 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 30 13:05:49.262744 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 30 13:05:49.269670 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 30 13:05:49.269886 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 30 13:05:49.269886 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Jan 30 13:05:49.269886 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Jan 30 13:05:49.269886 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Jan 30 13:05:49.270813 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1
Jan 30 13:05:49.793237 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 30 13:05:50.143493 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Jan 30 13:05:50.143927 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 30 13:05:50.143927 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 30 13:05:50.143927 ignition[974]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Jan 30 13:05:50.213030 ignition[974]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Jan 30 13:05:50.215525 ignition[974]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jan 30 13:05:50.215525 ignition[974]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Jan 30 13:05:50.215525 ignition[974]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Jan 30 13:05:50.215525 ignition[974]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Jan 30 13:05:50.216863 ignition[974]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 30 13:05:50.216863 ignition[974]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 30 13:05:50.216863 ignition[974]: INFO : files: files passed
Jan 30 13:05:50.216863 ignition[974]: INFO : Ignition finished successfully
Jan 30 13:05:50.217516 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 30 13:05:50.220750 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 30 13:05:50.222318 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 30 13:05:50.224497 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 30 13:05:50.224708 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 30 13:05:50.229074 initrd-setup-root-after-ignition[1004]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 30 13:05:50.229074 initrd-setup-root-after-ignition[1004]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 30 13:05:50.230043 initrd-setup-root-after-ignition[1008]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 30 13:05:50.231038 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 30 13:05:50.231425 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 30 13:05:50.234743 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 30 13:05:50.248884 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 30 13:05:50.248953 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 30 13:05:50.249416 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 30 13:05:50.249550 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 30 13:05:50.249763 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 30 13:05:50.250265 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 30 13:05:50.260134 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 30 13:05:50.263728 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 30 13:05:50.269802 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 30 13:05:50.270171 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 30 13:05:50.270335 systemd[1]: Stopped target timers.target - Timer Units.
Jan 30 13:05:50.270474 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 30 13:05:50.270555 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 30 13:05:50.270961 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 30 13:05:50.271184 systemd[1]: Stopped target basic.target - Basic System.
Jan 30 13:05:50.271377 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 30 13:05:50.271568 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 30 13:05:50.271772 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 30 13:05:50.271976 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 30 13:05:50.272323 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 30 13:05:50.272543 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 30 13:05:50.272775 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 30 13:05:50.272973 systemd[1]: Stopped target swap.target - Swaps.
Jan 30 13:05:50.273137 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 30 13:05:50.273207 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 30 13:05:50.273464 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 30 13:05:50.273702 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 30 13:05:50.273878 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 30 13:05:50.273927 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 30 13:05:50.274088 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 30 13:05:50.274147 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 30 13:05:50.274389 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 30 13:05:50.274451 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 30 13:05:50.274719 systemd[1]: Stopped target paths.target - Path Units.
Jan 30 13:05:50.274863 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 30 13:05:50.276646 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 30 13:05:50.276803 systemd[1]: Stopped target slices.target - Slice Units.
Jan 30 13:05:50.277000 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 30 13:05:50.277184 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 30 13:05:50.277260 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 30 13:05:50.277469 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 30 13:05:50.277515 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 30 13:05:50.277762 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 30 13:05:50.277826 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 30 13:05:50.278063 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 30 13:05:50.278121 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 30 13:05:50.285800 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 30 13:05:50.285921 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 30 13:05:50.286018 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 30 13:05:50.288494 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 30 13:05:50.288611 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 30 13:05:50.288718 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 30 13:05:50.289016 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 30 13:05:50.289104 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 30 13:05:50.291928 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 30 13:05:50.292006 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 30 13:05:50.295632 ignition[1028]: INFO : Ignition 2.20.0
Jan 30 13:05:50.295632 ignition[1028]: INFO : Stage: umount
Jan 30 13:05:50.295632 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 30 13:05:50.295632 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 30 13:05:50.297326 ignition[1028]: INFO : umount: umount passed
Jan 30 13:05:50.297326 ignition[1028]: INFO : Ignition finished successfully
Jan 30 13:05:50.297329 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 30 13:05:50.298285 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 30 13:05:50.298515 systemd[1]: Stopped target network.target - Network.
Jan 30 13:05:50.298600 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 30 13:05:50.298671 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 30 13:05:50.298774 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 30 13:05:50.298797 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 30 13:05:50.298897 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 30 13:05:50.298918 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 30 13:05:50.299016 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 30 13:05:50.299038 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 30 13:05:50.299221 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 30 13:05:50.299397 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 30 13:05:50.303964 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 30 13:05:50.304072 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 30 13:05:50.306334 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 30 13:05:50.306735 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 30 13:05:50.306802 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 30 13:05:50.307253 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 30 13:05:50.307282 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 30 13:05:50.311671 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 30 13:05:50.311770 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 30 13:05:50.311800 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 30 13:05:50.311936 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jan 30 13:05:50.311960 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 30 13:05:50.312077 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 30 13:05:50.312098 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 30 13:05:50.312203 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 30 13:05:50.312223 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 30 13:05:50.312329 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 30 13:05:50.312349 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 30 13:05:50.312505 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 30 13:05:50.320196 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 30 13:05:50.320267 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 30 13:05:50.325027 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 30 13:05:50.325106 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 30 13:05:50.325410 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 30 13:05:50.325437 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 30 13:05:50.325693 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 30 13:05:50.325717 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 30 13:05:50.325835 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 30 13:05:50.325859 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 30 13:05:50.326131 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 30 13:05:50.326153 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 30 13:05:50.326437 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 30 13:05:50.326459 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 30 13:05:50.332849 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 30 13:05:50.333170 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 30 13:05:50.333209 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 30 13:05:50.333336 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 30 13:05:50.333359 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 30 13:05:50.336834 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 30 13:05:50.336900 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 30 13:05:50.426150 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 30 13:05:50.426235 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 30 13:05:50.426612 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 30 13:05:50.426771 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 30 13:05:50.426806 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 30 13:05:50.430721 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 30 13:05:50.435944 systemd[1]: Switching root.
Jan 30 13:05:50.481960 systemd-journald[217]: Journal stopped
Jan 30 13:05:45.734271 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Wed Jan 29 09:29:54 -00 2025
Jan 30 13:05:45.734289 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=fe60919b0c6f6abb7495678f87f7024e97a038fc343fa31a123a43ef5f489466
Jan 30 13:05:45.734295 kernel: Disabled fast string operations
Jan 30 13:05:45.734299 kernel: BIOS-provided physical RAM map:
Jan 30 13:05:45.734303 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 30 13:05:45.734307 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 30 13:05:45.734313 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 30 13:05:45.734318 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 30 13:05:45.734322 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 30 13:05:45.734326 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 30 13:05:45.734330 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 30 13:05:45.734334 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 30 13:05:45.734339 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 30 13:05:45.734343 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 30 13:05:45.734349 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 30 13:05:45.734354 kernel: NX (Execute Disable) protection: active
Jan 30 13:05:45.734359 kernel: APIC: Static calls initialized
Jan 30 13:05:45.734364 kernel: SMBIOS 2.7 present.
Jan 30 13:05:45.734369 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 30 13:05:45.734374 kernel: vmware: hypercall mode: 0x00
Jan 30 13:05:45.734378 kernel: Hypervisor detected: VMware
Jan 30 13:05:45.734383 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 30 13:05:45.734389 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 30 13:05:45.734394 kernel: vmware: using clock offset of 2579831776 ns
Jan 30 13:05:45.734399 kernel: tsc: Detected 3408.000 MHz processor
Jan 30 13:05:45.734404 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 30 13:05:45.734410 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 30 13:05:45.734415 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 30 13:05:45.734420 kernel: total RAM covered: 3072M
Jan 30 13:05:45.734425 kernel: Found optimal setting for mtrr clean up
Jan 30 13:05:45.734430 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 30 13:05:45.734435 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 30 13:05:45.734441 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 30 13:05:45.734446 kernel: Using GB pages for direct mapping
Jan 30 13:05:45.734451 kernel: ACPI: Early table checksum verification disabled
Jan 30 13:05:45.734456 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 30 13:05:45.734461 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 30 13:05:45.734466 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 30 13:05:45.734471 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 30 13:05:45.734476 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 30 13:05:45.734484 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 30 13:05:45.734489 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 30 13:05:45.734494 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 30 13:05:45.734500 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 30 13:05:45.734505 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 30 13:05:45.734510 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 30 13:05:45.734517 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 30 13:05:45.734522 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 30 13:05:45.734527 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 30 13:05:45.734532 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 30 13:05:45.734537 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 30 13:05:45.734542 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 30 13:05:45.734548 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 30 13:05:45.734553 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 30 13:05:45.734558 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 30 13:05:45.734564 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 30 13:05:45.734569 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 30 13:05:45.734574 kernel: system APIC only can use physical flat
Jan 30 13:05:45.734579 kernel: APIC: Switched APIC routing to: physical flat
Jan 30 13:05:45.734584 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 30 13:05:45.734589 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 30 13:05:45.734595 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 30 13:05:45.734600 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 30 13:05:45.734605 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 30 13:05:45.734610 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 30 13:05:45.734623 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 30 13:05:45.734628 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 30 13:05:45.734633 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 30 13:05:45.734638 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 30 13:05:45.734643 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 30 13:05:45.734648 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 30 13:05:45.734653 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 30 13:05:45.734658 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 30 13:05:45.734663 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 30 13:05:45.734668 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 30 13:05:45.734674 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 30 13:05:45.734680 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 30 13:05:45.734684 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 30 13:05:45.734689 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 30 13:05:45.734694 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 30 13:05:45.734699 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 30 13:05:45.734704 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 30 13:05:45.734709 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 30 13:05:45.734715 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 30 13:05:45.734719 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 30 13:05:45.734726 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 30 13:05:45.734731 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 30 13:05:45.734736 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 30 13:05:45.734741 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 30 13:05:45.734746 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 30 13:05:45.734750 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 30 13:05:45.734756 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 30 13:05:45.734761 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 30 13:05:45.734766 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 30 13:05:45.734771 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 30 13:05:45.734776 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 30 13:05:45.734783 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 30 13:05:45.734787 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 30 13:05:45.734793 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 30 13:05:45.734798 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 30 13:05:45.734803 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 30 13:05:45.734808 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 30 13:05:45.734813 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 30 13:05:45.734818 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 30 13:05:45.734823 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 30 13:05:45.734828 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 30 13:05:45.734834 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 30 13:05:45.734839 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 30 13:05:45.734844 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 30 13:05:45.734849 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 30 13:05:45.734854 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 30 13:05:45.734859 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 30 13:05:45.734864 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 30 13:05:45.734869 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 30 13:05:45.734873 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 30 13:05:45.734879 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 30 13:05:45.734885 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 30 13:05:45.734890 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 30 13:05:45.734899 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 30 13:05:45.734905 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 30 13:05:45.734910 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 30 13:05:45.734916 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 30 13:05:45.734921 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 30 13:05:45.734927 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 30 13:05:45.734932 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 30 13:05:45.734938 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 30 13:05:45.734944 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 30 13:05:45.734949 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 30 13:05:45.734954 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 30 13:05:45.734960 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 30 13:05:45.734965 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 30 13:05:45.734970 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 30 13:05:45.734976 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 30 13:05:45.734981 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 30 13:05:45.734986 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 30 13:05:45.734993 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 30 13:05:45.734998 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 30 13:05:45.735004 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 30 13:05:45.735009 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 30 13:05:45.735014 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 30 13:05:45.735020 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 30 13:05:45.735025 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 30 13:05:45.735030 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 30 13:05:45.735036 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 30 13:05:45.735041 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 30 13:05:45.735048 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 30 13:05:45.735053 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 30 13:05:45.735059 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 30 13:05:45.735064 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 30 13:05:45.735069 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 30 13:05:45.735074 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 30 13:05:45.735080 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 30 13:05:45.735085 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 30 13:05:45.735090 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 30 13:05:45.735096 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 30 13:05:45.735101 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 30 13:05:45.735107 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 30 13:05:45.735113 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 30 13:05:45.735118 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 30 13:05:45.735123 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 30 13:05:45.735129 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 30 13:05:45.735134 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 30 13:05:45.735139 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 30 13:05:45.735145 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 30 13:05:45.735150 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 30 13:05:45.735155 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 30 13:05:45.735162 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 30 13:05:45.735167 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 30 13:05:45.735173 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 30 13:05:45.735178 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 30 13:05:45.735183 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 30 13:05:45.735189 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 30 13:05:45.735194 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 30 13:05:45.735199 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 30 13:05:45.735204 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 30 13:05:45.735210 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 30 13:05:45.735216 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 30 13:05:45.735221 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 30 13:05:45.735227 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 30 13:05:45.735233 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 30 13:05:45.735238 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 30 13:05:45.735243 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 30 13:05:45.735248 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 30 13:05:45.735254 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 30 13:05:45.735259 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 30 13:05:45.735265 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 30 13:05:45.735272 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 30 13:05:45.735277 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 30 13:05:45.735283 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 30 13:05:45.735288 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 30 13:05:45.735294 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 30 13:05:45.735299 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 30 13:05:45.735305 kernel: Zone ranges:
Jan 30 13:05:45.735311 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 30 13:05:45.735316 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 30 13:05:45.735321 kernel: Normal empty
Jan 30 13:05:45.735328 kernel: Movable zone start for each node
Jan 30 13:05:45.735333 kernel: Early memory node ranges
Jan 30 13:05:45.735339 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 30 13:05:45.735344 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 30 13:05:45.735350 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 30 13:05:45.735355 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 30 13:05:45.735360 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 30 13:05:45.735366 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 30 13:05:45.735372 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 30 13:05:45.735378 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 30 13:05:45.735384 kernel: system APIC only can use physical flat
Jan 30 13:05:45.735389 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 30 13:05:45.735395 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 30 13:05:45.735400 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jan 30 13:05:45.735406 kernel: ACPI: LAPIC_NMI
(acpi_id[0x03] high edge lint[0x1]) Jan 30 13:05:45.735411 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Jan 30 13:05:45.735416 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Jan 30 13:05:45.735422 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Jan 30 13:05:45.735427 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Jan 30 13:05:45.735434 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Jan 30 13:05:45.735439 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Jan 30 13:05:45.735445 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Jan 30 13:05:45.735450 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Jan 30 13:05:45.735456 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Jan 30 13:05:45.735461 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Jan 30 13:05:45.735466 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Jan 30 13:05:45.735472 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Jan 30 13:05:45.735477 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Jan 30 13:05:45.735483 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Jan 30 13:05:45.735490 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Jan 30 13:05:45.735495 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Jan 30 13:05:45.735501 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Jan 30 13:05:45.735506 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Jan 30 13:05:45.735511 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Jan 30 13:05:45.735517 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Jan 30 13:05:45.735522 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Jan 30 13:05:45.735527 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Jan 30 13:05:45.735533 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Jan 30 13:05:45.735540 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x1b] high edge lint[0x1]) Jan 30 13:05:45.735545 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Jan 30 13:05:45.735550 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Jan 30 13:05:45.735556 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Jan 30 13:05:45.735561 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Jan 30 13:05:45.735566 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Jan 30 13:05:45.735572 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Jan 30 13:05:45.735577 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Jan 30 13:05:45.735582 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Jan 30 13:05:45.735588 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Jan 30 13:05:45.735595 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Jan 30 13:05:45.735600 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Jan 30 13:05:45.735606 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Jan 30 13:05:45.735611 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Jan 30 13:05:45.735622 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Jan 30 13:05:45.735628 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Jan 30 13:05:45.735633 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Jan 30 13:05:45.735639 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Jan 30 13:05:45.735644 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Jan 30 13:05:45.735649 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Jan 30 13:05:45.735656 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Jan 30 13:05:45.735662 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Jan 30 13:05:45.735667 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Jan 30 13:05:45.735673 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Jan 30 13:05:45.735678 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x33] high edge lint[0x1]) Jan 30 13:05:45.735684 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Jan 30 13:05:45.735690 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Jan 30 13:05:45.735695 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Jan 30 13:05:45.735701 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Jan 30 13:05:45.735706 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Jan 30 13:05:45.735713 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Jan 30 13:05:45.735718 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Jan 30 13:05:45.735723 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Jan 30 13:05:45.735729 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Jan 30 13:05:45.735734 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Jan 30 13:05:45.735739 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Jan 30 13:05:45.735745 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Jan 30 13:05:45.735750 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Jan 30 13:05:45.735755 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Jan 30 13:05:45.735761 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Jan 30 13:05:45.735767 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Jan 30 13:05:45.735773 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Jan 30 13:05:45.735778 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Jan 30 13:05:45.735783 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Jan 30 13:05:45.735789 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Jan 30 13:05:45.735794 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Jan 30 13:05:45.735800 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Jan 30 13:05:45.735805 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Jan 30 13:05:45.735810 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x4b] high edge lint[0x1]) Jan 30 13:05:45.735817 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Jan 30 13:05:45.735823 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Jan 30 13:05:45.735828 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Jan 30 13:05:45.735833 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Jan 30 13:05:45.735839 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Jan 30 13:05:45.735844 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Jan 30 13:05:45.735850 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Jan 30 13:05:45.735855 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Jan 30 13:05:45.735860 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Jan 30 13:05:45.735866 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Jan 30 13:05:45.735872 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Jan 30 13:05:45.735878 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Jan 30 13:05:45.735883 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Jan 30 13:05:45.735888 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Jan 30 13:05:45.735894 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Jan 30 13:05:45.735899 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Jan 30 13:05:45.735905 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Jan 30 13:05:45.735910 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Jan 30 13:05:45.735915 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Jan 30 13:05:45.735921 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Jan 30 13:05:45.735927 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Jan 30 13:05:45.735933 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Jan 30 13:05:45.735938 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Jan 30 13:05:45.735943 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x63] high edge lint[0x1]) Jan 30 13:05:45.735949 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Jan 30 13:05:45.735954 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Jan 30 13:05:45.735959 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Jan 30 13:05:45.735965 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Jan 30 13:05:45.735970 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Jan 30 13:05:45.735976 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Jan 30 13:05:45.735982 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Jan 30 13:05:45.735988 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Jan 30 13:05:45.735993 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Jan 30 13:05:45.735999 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Jan 30 13:05:45.736004 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Jan 30 13:05:45.736009 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Jan 30 13:05:45.736015 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Jan 30 13:05:45.736020 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Jan 30 13:05:45.736025 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Jan 30 13:05:45.736032 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Jan 30 13:05:45.736038 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Jan 30 13:05:45.736043 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Jan 30 13:05:45.736048 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Jan 30 13:05:45.736054 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Jan 30 13:05:45.736059 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Jan 30 13:05:45.736064 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Jan 30 13:05:45.736070 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Jan 30 13:05:45.736075 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x7b] high edge lint[0x1]) Jan 30 13:05:45.736081 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Jan 30 13:05:45.736087 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Jan 30 13:05:45.736093 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Jan 30 13:05:45.736098 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Jan 30 13:05:45.736104 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Jan 30 13:05:45.736109 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Jan 30 13:05:45.736115 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 30 13:05:45.736120 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Jan 30 13:05:45.736126 kernel: TSC deadline timer available Jan 30 13:05:45.736132 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Jan 30 13:05:45.736137 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Jan 30 13:05:45.736144 kernel: Booting paravirtualized kernel on VMware hypervisor Jan 30 13:05:45.736155 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 30 13:05:45.736161 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Jan 30 13:05:45.736167 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Jan 30 13:05:45.736172 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Jan 30 13:05:45.736178 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Jan 30 13:05:45.736183 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Jan 30 13:05:45.736189 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Jan 30 13:05:45.736196 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Jan 30 13:05:45.736201 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Jan 30 13:05:45.736214 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Jan 30 13:05:45.736222 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Jan 30 
13:05:45.736228 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Jan 30 13:05:45.736233 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Jan 30 13:05:45.736239 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Jan 30 13:05:45.736245 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Jan 30 13:05:45.736250 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Jan 30 13:05:45.736257 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Jan 30 13:05:45.736263 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Jan 30 13:05:45.736269 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Jan 30 13:05:45.736275 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Jan 30 13:05:45.736281 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=fe60919b0c6f6abb7495678f87f7024e97a038fc343fa31a123a43ef5f489466 Jan 30 13:05:45.736288 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Jan 30 13:05:45.736293 kernel: random: crng init done Jan 30 13:05:45.736299 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Jan 30 13:05:45.736306 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Jan 30 13:05:45.736312 kernel: printk: log_buf_len min size: 262144 bytes Jan 30 13:05:45.736318 kernel: printk: log_buf_len: 1048576 bytes Jan 30 13:05:45.736324 kernel: printk: early log buf free: 239648(91%) Jan 30 13:05:45.736330 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 30 13:05:45.736336 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 30 13:05:45.736341 kernel: Fallback order for Node 0: 0 Jan 30 13:05:45.736347 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Jan 30 13:05:45.736354 kernel: Policy zone: DMA32 Jan 30 13:05:45.736361 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 30 13:05:45.736367 kernel: Memory: 1934304K/2096628K available (14336K kernel code, 2301K rwdata, 22800K rodata, 43320K init, 1752K bss, 162064K reserved, 0K cma-reserved) Jan 30 13:05:45.736375 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Jan 30 13:05:45.736381 kernel: ftrace: allocating 37893 entries in 149 pages Jan 30 13:05:45.736387 kernel: ftrace: allocated 149 pages with 4 groups Jan 30 13:05:45.736394 kernel: Dynamic Preempt: voluntary Jan 30 13:05:45.736400 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 30 13:05:45.736406 kernel: rcu: RCU event tracing is enabled. Jan 30 13:05:45.736412 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Jan 30 13:05:45.736418 kernel: Trampoline variant of Tasks RCU enabled. Jan 30 13:05:45.736424 kernel: Rude variant of Tasks RCU enabled. Jan 30 13:05:45.736429 kernel: Tracing variant of Tasks RCU enabled. Jan 30 13:05:45.736436 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Jan 30 13:05:45.736441 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Jan 30 13:05:45.736447 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Jan 30 13:05:45.736454 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Jan 30 13:05:45.736460 kernel: Console: colour VGA+ 80x25 Jan 30 13:05:45.736466 kernel: printk: console [tty0] enabled Jan 30 13:05:45.736472 kernel: printk: console [ttyS0] enabled Jan 30 13:05:45.736478 kernel: ACPI: Core revision 20230628 Jan 30 13:05:45.736485 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Jan 30 13:05:45.736491 kernel: APIC: Switch to symmetric I/O mode setup Jan 30 13:05:45.736497 kernel: x2apic enabled Jan 30 13:05:45.736503 kernel: APIC: Switched APIC routing to: physical x2apic Jan 30 13:05:45.736510 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 30 13:05:45.736516 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 30 13:05:45.736521 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Jan 30 13:05:45.736527 kernel: Disabled fast string operations Jan 30 13:05:45.736533 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Jan 30 13:05:45.736539 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Jan 30 13:05:45.736545 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 30 13:05:45.736552 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Jan 30 13:05:45.736558 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Jan 30 13:05:45.736565 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 30 13:05:45.736571 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Jan 30 13:05:45.736577 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 30 13:05:45.736583 kernel: RETBleed: Mitigation: Enhanced IBRS Jan 30 13:05:45.736589 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 30 13:05:45.736596 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 30 13:05:45.736602 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Jan 30 13:05:45.736608 kernel: SRBDS: Unknown: Dependent on hypervisor status Jan 30 13:05:45.736613 kernel: GDS: Unknown: Dependent on hypervisor status Jan 30 13:05:45.736633 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 30 13:05:45.736640 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 30 13:05:45.736646 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 30 13:05:45.736652 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 30 13:05:45.736658 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
Jan 30 13:05:45.736663 kernel: Freeing SMP alternatives memory: 32K Jan 30 13:05:45.736669 kernel: pid_max: default: 131072 minimum: 1024 Jan 30 13:05:45.736675 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Jan 30 13:05:45.736681 kernel: landlock: Up and running. Jan 30 13:05:45.736688 kernel: SELinux: Initializing. Jan 30 13:05:45.736694 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 30 13:05:45.736700 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 30 13:05:45.736706 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Jan 30 13:05:45.736712 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 30 13:05:45.736718 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 30 13:05:45.736724 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 30 13:05:45.736730 kernel: Performance Events: Skylake events, core PMU driver. Jan 30 13:05:45.736736 kernel: core: CPUID marked event: 'cpu cycles' unavailable Jan 30 13:05:45.736743 kernel: core: CPUID marked event: 'instructions' unavailable Jan 30 13:05:45.736749 kernel: core: CPUID marked event: 'bus cycles' unavailable Jan 30 13:05:45.736755 kernel: core: CPUID marked event: 'cache references' unavailable Jan 30 13:05:45.736760 kernel: core: CPUID marked event: 'cache misses' unavailable Jan 30 13:05:45.736766 kernel: core: CPUID marked event: 'branch instructions' unavailable Jan 30 13:05:45.736772 kernel: core: CPUID marked event: 'branch misses' unavailable Jan 30 13:05:45.736777 kernel: ... version: 1 Jan 30 13:05:45.736783 kernel: ... bit width: 48 Jan 30 13:05:45.736790 kernel: ... generic registers: 4 Jan 30 13:05:45.736797 kernel: ... value mask: 0000ffffffffffff Jan 30 13:05:45.736802 kernel: ... 
max period: 000000007fffffff Jan 30 13:05:45.736808 kernel: ... fixed-purpose events: 0 Jan 30 13:05:45.736814 kernel: ... event mask: 000000000000000f Jan 30 13:05:45.736820 kernel: signal: max sigframe size: 1776 Jan 30 13:05:45.736826 kernel: rcu: Hierarchical SRCU implementation. Jan 30 13:05:45.736832 kernel: rcu: Max phase no-delay instances is 400. Jan 30 13:05:45.736838 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 30 13:05:45.736846 kernel: smp: Bringing up secondary CPUs ... Jan 30 13:05:45.736852 kernel: smpboot: x86: Booting SMP configuration: Jan 30 13:05:45.736858 kernel: .... node #0, CPUs: #1 Jan 30 13:05:45.736864 kernel: Disabled fast string operations Jan 30 13:05:45.736870 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 30 13:05:45.736876 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 30 13:05:45.736882 kernel: smp: Brought up 1 node, 2 CPUs Jan 30 13:05:45.736888 kernel: smpboot: Max logical packages: 128 Jan 30 13:05:45.736894 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 30 13:05:45.736900 kernel: devtmpfs: initialized Jan 30 13:05:45.736907 kernel: x86/mm: Memory block size: 128MB Jan 30 13:05:45.736913 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 30 13:05:45.736919 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 30 13:05:45.736925 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 30 13:05:45.736931 kernel: pinctrl core: initialized pinctrl subsystem Jan 30 13:05:45.736937 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 30 13:05:45.736943 kernel: audit: initializing netlink subsys (disabled) Jan 30 13:05:45.736949 kernel: audit: type=2000 audit(1738242344.068:1): state=initialized audit_enabled=0 res=1 Jan 30 13:05:45.736954 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 30 13:05:45.736962 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 30 13:05:45.736968 kernel: cpuidle: using governor menu Jan 30 13:05:45.736975 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 30 13:05:45.736981 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 30 13:05:45.736987 kernel: dca service started, version 1.12.1 Jan 30 13:05:45.736993 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 30 13:05:45.736999 kernel: PCI: Using configuration type 1 for base access Jan 30 13:05:45.737005 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 30 13:05:45.737011 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 30 13:05:45.737018 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 30 13:05:45.737024 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 30 13:05:45.737029 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 30 13:05:45.737035 kernel: ACPI: Added _OSI(Module Device) Jan 30 13:05:45.737041 kernel: ACPI: Added _OSI(Processor Device) Jan 30 13:05:45.737047 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 30 13:05:45.737053 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 30 13:05:45.737059 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 30 13:05:45.737065 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 30 13:05:45.737072 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 30 13:05:45.737078 kernel: ACPI: Interpreter enabled Jan 30 13:05:45.737084 kernel: ACPI: PM: (supports S0 S1 S5) Jan 30 13:05:45.737090 kernel: ACPI: Using IOAPIC for interrupt routing Jan 30 13:05:45.737096 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 30 13:05:45.737102 kernel: PCI: Using E820 reservations for host bridge windows Jan 30 13:05:45.737108 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 30 13:05:45.737115 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 30 13:05:45.737205 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 30 13:05:45.737267 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 30 13:05:45.737317 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 30 13:05:45.737326 kernel: PCI host bridge to bus 0000:00 Jan 30 13:05:45.737376 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 30 13:05:45.737422 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 30 13:05:45.737466 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 30 13:05:45.737512 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 30 13:05:45.737556 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 30 13:05:45.737600 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 30 13:05:45.737711 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 30 13:05:45.737768 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 30 13:05:45.737823 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 30 13:05:45.737881 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 30 13:05:45.737931 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 30 13:05:45.737981 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 30 13:05:45.738030 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 30 13:05:45.738079 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 30 13:05:45.738128 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 30 13:05:45.738182 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 30 13:05:45.738234 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 30 13:05:45.738283 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 30 13:05:45.738338 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 30 13:05:45.738388 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 30 13:05:45.738438 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 30 13:05:45.738491 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 30 13:05:45.738543 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 30 13:05:45.738592 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 30 13:05:45.738675 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 30 13:05:45.738726 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 30 13:05:45.738774 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 30 13:05:45.738829 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 30 13:05:45.738886 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.738939 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.738996 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739048 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739102 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739183 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739243 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739297 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739351 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739402 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739456 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739510 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 30 13:05:45.739565 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739684 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739743 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739793 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739846 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.739898 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.739954 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740008 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740061 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740110 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740162 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740212 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740264 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740316 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740369 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740418 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740470 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740520 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740573 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740635 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740690 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740739 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740793 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740843 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 30 13:05:45.740899 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.740952 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741005 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741055 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741109 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741162 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741217 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741268 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741324 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741374 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741501 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.741554 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.741607 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742064 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742125 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742177 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742236 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742288 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742342 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742393 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742448 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742500 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742554 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 30 
13:05:45.742604 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.742673 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.742725 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.744686 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 30 13:05:45.744749 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 30 13:05:45.744805 kernel: pci_bus 0000:01: extended config space not accessible Jan 30 13:05:45.744859 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 30 13:05:45.744912 kernel: pci_bus 0000:02: extended config space not accessible Jan 30 13:05:45.744921 kernel: acpiphp: Slot [32] registered Jan 30 13:05:45.744927 kernel: acpiphp: Slot [33] registered Jan 30 13:05:45.744936 kernel: acpiphp: Slot [34] registered Jan 30 13:05:45.744942 kernel: acpiphp: Slot [35] registered Jan 30 13:05:45.744947 kernel: acpiphp: Slot [36] registered Jan 30 13:05:45.744953 kernel: acpiphp: Slot [37] registered Jan 30 13:05:45.744959 kernel: acpiphp: Slot [38] registered Jan 30 13:05:45.744965 kernel: acpiphp: Slot [39] registered Jan 30 13:05:45.744971 kernel: acpiphp: Slot [40] registered Jan 30 13:05:45.744977 kernel: acpiphp: Slot [41] registered Jan 30 13:05:45.744982 kernel: acpiphp: Slot [42] registered Jan 30 13:05:45.744988 kernel: acpiphp: Slot [43] registered Jan 30 13:05:45.744996 kernel: acpiphp: Slot [44] registered Jan 30 13:05:45.745001 kernel: acpiphp: Slot [45] registered Jan 30 13:05:45.745007 kernel: acpiphp: Slot [46] registered Jan 30 13:05:45.745013 kernel: acpiphp: Slot [47] registered Jan 30 13:05:45.745018 kernel: acpiphp: Slot [48] registered Jan 30 13:05:45.745024 kernel: acpiphp: Slot [49] registered Jan 30 13:05:45.745030 kernel: acpiphp: Slot [50] registered Jan 30 13:05:45.745036 kernel: acpiphp: Slot [51] registered Jan 30 13:05:45.745042 kernel: acpiphp: Slot [52] registered Jan 30 13:05:45.745049 kernel: acpiphp: Slot [53] registered 
Jan 30 13:05:45.745055 kernel: acpiphp: Slot [54] registered Jan 30 13:05:45.745061 kernel: acpiphp: Slot [55] registered Jan 30 13:05:45.745066 kernel: acpiphp: Slot [56] registered Jan 30 13:05:45.745072 kernel: acpiphp: Slot [57] registered Jan 30 13:05:45.745078 kernel: acpiphp: Slot [58] registered Jan 30 13:05:45.745084 kernel: acpiphp: Slot [59] registered Jan 30 13:05:45.745090 kernel: acpiphp: Slot [60] registered Jan 30 13:05:45.745096 kernel: acpiphp: Slot [61] registered Jan 30 13:05:45.745101 kernel: acpiphp: Slot [62] registered Jan 30 13:05:45.745108 kernel: acpiphp: Slot [63] registered Jan 30 13:05:45.745159 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 30 13:05:45.745210 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 30 13:05:45.745260 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 30 13:05:45.745309 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 30 13:05:45.745359 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 30 13:05:45.745409 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 30 13:05:45.745461 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 30 13:05:45.745511 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 30 13:05:45.745561 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 30 13:05:45.745641 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 30 13:05:45.745695 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 30 13:05:45.745746 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 30 13:05:45.745796 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 30 13:05:45.745847 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 30 
13:05:45.745900 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 30 13:05:45.745953 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 30 13:05:45.746004 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 30 13:05:45.746054 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 30 13:05:45.746105 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 30 13:05:45.746156 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 30 13:05:45.746205 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 30 13:05:45.746258 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 30 13:05:45.746311 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 30 13:05:45.746360 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 30 13:05:45.746410 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 30 13:05:45.746460 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 30 13:05:45.746512 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 30 13:05:45.746563 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 30 13:05:45.746612 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 30 13:05:45.746982 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 30 13:05:45.747034 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 30 13:05:45.747084 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 30 13:05:45.747137 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 30 13:05:45.747194 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 30 13:05:45.747244 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 30 13:05:45.747295 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 30 13:05:45.747345 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 30 13:05:45.747394 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 30 13:05:45.747446 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 30 13:05:45.747494 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 30 13:05:45.747583 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 30 13:05:45.748708 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 30 13:05:45.748769 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 30 13:05:45.748824 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 30 13:05:45.748875 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 30 13:05:45.748926 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 30 13:05:45.748977 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 30 13:05:45.749028 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 30 13:05:45.749085 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 30 13:05:45.749136 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 30 13:05:45.749216 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 30 13:05:45.749268 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 30 13:05:45.749319 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 30 13:05:45.749370 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 30 13:05:45.749422 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 30 13:05:45.749471 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 30 13:05:45.749525 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 30 13:05:45.749579 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 30 13:05:45.749653 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 30 13:05:45.749752 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 30 13:05:45.753753 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 30 13:05:45.753822 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 30 13:05:45.753874 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 30 13:05:45.753926 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 30 13:05:45.753983 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 30 13:05:45.754033 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 30 13:05:45.754082 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 30 13:05:45.754134 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 30 13:05:45.754184 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 30 13:05:45.754234 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 30 13:05:45.754285 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 30 13:05:45.754334 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 30 13:05:45.754386 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 30 13:05:45.754438 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 30 13:05:45.754487 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 30 13:05:45.754536 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 30 13:05:45.754589 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 30 13:05:45.754655 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 30 13:05:45.754706 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 30 13:05:45.754755 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 30 13:05:45.754812 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 30 13:05:45.754861 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 30 13:05:45.754909 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 30 13:05:45.754960 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 30 13:05:45.755012 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 30 13:05:45.755062 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 30 13:05:45.755112 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 30 13:05:45.755169 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 30 13:05:45.755222 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 30 13:05:45.755272 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 30 13:05:45.755321 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 30 13:05:45.755373 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 30 13:05:45.755423 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 30 13:05:45.755472 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 30 13:05:45.755524 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 30 13:05:45.755577 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 30 13:05:45.758491 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 30 13:05:45.758559 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 30 13:05:45.758611 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 30 13:05:45.758673 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 30 13:05:45.758726 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 30 13:05:45.758776 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 30 13:05:45.758827 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 30 13:05:45.758883 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 30 13:05:45.758933 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 30 13:05:45.758983 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 30 13:05:45.759033 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 30 13:05:45.759085 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 30 13:05:45.759135 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 30 13:05:45.759197 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 30 13:05:45.759247 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 30 13:05:45.759302 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 30 13:05:45.759353 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 30 13:05:45.759403 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 30 13:05:45.759455 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 30 13:05:45.759505 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 30 13:05:45.759555 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 30 13:05:45.759606 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 30 
13:05:45.759663 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 30 13:05:45.759716 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jan 30 13:05:45.759768 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 30 13:05:45.759819 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 30 13:05:45.759868 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 30 13:05:45.759920 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 30 13:05:45.759969 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 30 13:05:45.760019 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 30 13:05:45.760071 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 30 13:05:45.760125 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 30 13:05:45.760175 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 30 13:05:45.760183 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jan 30 13:05:45.760190 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jan 30 13:05:45.760196 kernel: ACPI: PCI: Interrupt link LNKB disabled Jan 30 13:05:45.760202 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 30 13:05:45.760208 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jan 30 13:05:45.760214 kernel: iommu: Default domain type: Translated Jan 30 13:05:45.760222 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 30 13:05:45.760228 kernel: PCI: Using ACPI for IRQ routing Jan 30 13:05:45.760234 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 30 13:05:45.760240 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jan 30 13:05:45.760246 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jan 30 13:05:45.760295 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jan 30 13:05:45.760345 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Jan 30 13:05:45.760394 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 30 13:05:45.760403 kernel: vgaarb: loaded Jan 30 13:05:45.760411 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jan 30 13:05:45.760417 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jan 30 13:05:45.760422 kernel: clocksource: Switched to clocksource tsc-early Jan 30 13:05:45.760429 kernel: VFS: Disk quotas dquot_6.6.0 Jan 30 13:05:45.760435 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 30 13:05:45.760440 kernel: pnp: PnP ACPI init Jan 30 13:05:45.760494 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jan 30 13:05:45.760542 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jan 30 13:05:45.760590 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jan 30 13:05:45.760948 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jan 30 13:05:45.761002 kernel: pnp 00:06: [dma 2] Jan 30 13:05:45.761053 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jan 30 13:05:45.761100 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jan 30 13:05:45.761146 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jan 30 13:05:45.761155 kernel: pnp: PnP ACPI: found 8 devices Jan 30 13:05:45.761166 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 30 13:05:45.761172 kernel: NET: Registered PF_INET protocol family Jan 30 13:05:45.761178 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 30 13:05:45.761184 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jan 30 13:05:45.761190 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 30 13:05:45.761196 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 30 13:05:45.761202 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jan 30 13:05:45.761208 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jan 30 13:05:45.761214 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 30 13:05:45.761221 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 30 13:05:45.761227 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 30 13:05:45.761233 kernel: NET: Registered PF_XDP protocol family Jan 30 13:05:45.761287 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jan 30 13:05:45.761340 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 30 13:05:45.761394 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 30 13:05:45.761447 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 30 13:05:45.761502 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 30 13:05:45.761554 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 30 13:05:45.761606 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 30 13:05:45.763673 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 30 13:05:45.763740 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 30 13:05:45.763802 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 30 13:05:45.763860 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 30 13:05:45.763913 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 30 13:05:45.763966 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 30 
13:05:45.764019 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 30 13:05:45.764070 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 30 13:05:45.764124 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 30 13:05:45.764177 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 30 13:05:45.764228 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 30 13:05:45.764332 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 30 13:05:45.764386 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 30 13:05:45.764437 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 30 13:05:45.764491 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 30 13:05:45.764543 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jan 30 13:05:45.764595 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Jan 30 13:05:45.764716 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Jan 30 13:05:45.764769 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.764819 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.764871 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.764924 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.764976 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765026 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.765077 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765128 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 
30 13:05:45.765185 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765236 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.765285 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765339 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.765390 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765440 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.765491 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765541 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.765592 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765650 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.765701 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765753 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.765804 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765854 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.765905 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.765954 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.766006 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.766056 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.766106 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.766159 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.766209 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.766260 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.766311 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.766361 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.766412 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.766462 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.766512 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.766562 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.767654 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.767724 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.767780 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.767831 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.767883 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.767934 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.767984 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768033 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768087 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768135 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768190 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768239 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768289 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768338 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768388 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768436 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768487 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Jan 30 13:05:45.768539 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768589 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768645 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768696 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768746 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768795 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768846 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.768896 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.768946 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.769012 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.769067 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.769118 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.769174 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.769224 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.769272 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.769323 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.769372 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.769423 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.769472 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.769526 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.769575 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.771140 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.771218 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.771281 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.771335 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.771388 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.771439 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.771491 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.771540 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.771596 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 30 13:05:45.771670 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 30 13:05:45.771724 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 30 13:05:45.771776 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jan 30 13:05:45.771828 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 30 13:05:45.771878 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 30 13:05:45.771928 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 30 13:05:45.771983 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Jan 30 13:05:45.772039 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 30 13:05:45.772091 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 30 13:05:45.772142 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 30 13:05:45.772192 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jan 30 13:05:45.772244 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 30 13:05:45.772295 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 30 13:05:45.772346 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 30 13:05:45.772396 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 30 
13:05:45.772450 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 30 13:05:45.772503 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 30 13:05:45.772553 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 30 13:05:45.772603 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 30 13:05:45.772726 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 30 13:05:45.772778 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 30 13:05:45.772828 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 30 13:05:45.772878 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 30 13:05:45.772928 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 30 13:05:45.772978 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 30 13:05:45.773032 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 30 13:05:45.773082 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 30 13:05:45.773132 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 30 13:05:45.773182 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 30 13:05:45.773232 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 30 13:05:45.773285 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 30 13:05:45.773338 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 30 13:05:45.773388 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 30 13:05:45.773438 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 30 13:05:45.773493 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 30 13:05:45.773544 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 30 13:05:45.773595 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 30 13:05:45.775185 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 30 13:05:45.775244 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 30 13:05:45.775299 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 30 13:05:45.775352 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 30 13:05:45.775408 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 30 13:05:45.775459 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 30 13:05:45.775512 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 30 13:05:45.775563 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 30 13:05:45.775630 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 30 13:05:45.775689 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 30 13:05:45.775743 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 30 13:05:45.775793 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 30 13:05:45.775843 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 30 13:05:45.775899 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 30 13:05:45.775950 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 30 13:05:45.776000 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 30 13:05:45.776052 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 30 13:05:45.776102 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 30 13:05:45.776153 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 30 13:05:45.776205 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 30 13:05:45.776255 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 30 13:05:45.776305 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 30 13:05:45.776357 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 30 13:05:45.776410 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 30 13:05:45.776461 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 30 13:05:45.776512 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 30 13:05:45.776563 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 30 13:05:45.776612 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 30 13:05:45.777678 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 30 13:05:45.777733 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 30 13:05:45.777784 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 30 13:05:45.777847 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 30 13:05:45.777902 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 30 13:05:45.777955 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 30 13:05:45.778005 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 30 13:05:45.778056 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 30 13:05:45.778105 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 30 13:05:45.778157 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 30 13:05:45.778207 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 30 13:05:45.778257 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 30 13:05:45.778310 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 30 13:05:45.778364 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 30 13:05:45.778414 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 30 13:05:45.778465 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 30 13:05:45.778516 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 30 13:05:45.778566 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 30 
13:05:45.778628 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 30 13:05:45.778683 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 30 13:05:45.778734 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 30 13:05:45.778787 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 30 13:05:45.778838 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 30 13:05:45.778892 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 30 13:05:45.778944 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 30 13:05:45.778994 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 30 13:05:45.779044 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 30 13:05:45.779109 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 30 13:05:45.779180 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 30 13:05:45.779232 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 30 13:05:45.779283 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 30 13:05:45.779333 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 30 13:05:45.779389 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 30 13:05:45.779440 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 30 13:05:45.779491 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 30 13:05:45.779544 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 30 13:05:45.779595 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 30 13:05:45.779662 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 30 13:05:45.779717 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 30 13:05:45.779767 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 30 13:05:45.779818 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 30 13:05:45.779870 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 30 13:05:45.779925 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 30 13:05:45.779976 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 30 13:05:45.780029 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 30 13:05:45.780080 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 30 13:05:45.780130 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 30 13:05:45.780182 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 30 13:05:45.780232 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 30 13:05:45.780283 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 30 13:05:45.780334 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 30 13:05:45.780383 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 30 13:05:45.780428 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 30 13:05:45.780472 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 30 13:05:45.780517 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 30 13:05:45.780567 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 30 13:05:45.780657 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 30 13:05:45.780711 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 30 13:05:45.780757 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 30 13:05:45.780805 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 30 13:05:45.780851 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 30 13:05:45.780896 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 30 13:05:45.780941 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 30 13:05:45.780993 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 30 13:05:45.781039 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 30 13:05:45.781085 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 30 13:05:45.781138 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 30 13:05:45.781471 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 30 13:05:45.781525 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 30 13:05:45.781577 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 30 13:05:45.782770 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 30 13:05:45.782824 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 30 13:05:45.782877 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 30 13:05:45.782929 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 30 13:05:45.782980 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 30 13:05:45.783026 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 30 13:05:45.783076 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 30 13:05:45.783122 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 30 13:05:45.783172 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 30 13:05:45.783220 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 30 13:05:45.783273 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 30 13:05:45.783329 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 30 13:05:45.783381 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 30 13:05:45.783427 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 30 13:05:45.783475 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 30 13:05:45.783527 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 30 13:05:45.783573 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 30 13:05:45.783628 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 30 13:05:45.784057 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 30 13:05:45.784108 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 30 13:05:45.784160 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 30 13:05:45.784217 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 30 13:05:45.784266 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 30 13:05:45.784316 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 30 13:05:45.784365 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 30 13:05:45.784415 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 30 13:05:45.784462 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 30 13:05:45.784513 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 30 13:05:45.784563 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 30 13:05:45.784886 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 30 13:05:45.784951 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 30 13:05:45.785008 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 30 13:05:45.785057 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 30 13:05:45.785104 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 30 13:05:45.785173 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 30 13:05:45.785452 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 30 13:05:45.785506 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 30 13:05:45.785559 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 30 13:05:45.785608 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 30 13:05:45.785678 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 30 13:05:45.785744 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 30 13:05:45.786030 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 30 13:05:45.786090 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 30 13:05:45.786138 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 30 13:05:45.786191 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 30 13:05:45.786239 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 30 13:05:45.786291 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 30 13:05:45.786341 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 30 13:05:45.786392 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 30 13:05:45.786439 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 30 13:05:45.786494 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 30 13:05:45.786541 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 30 13:05:45.786591 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 30 13:05:45.786690 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 30 13:05:45.786738 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 30 13:05:45.786784 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 30 13:05:45.786834 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 30 13:05:45.786883 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 30 13:05:45.786934 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 30 13:05:45.786984 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 30 13:05:45.787038 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 30 13:05:45.787085 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 30 13:05:45.787361 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 30 13:05:45.787413 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 30 13:05:45.787466 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 30 13:05:45.787516 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 30 13:05:45.787569 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 30 13:05:45.787632 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 30 13:05:45.787697 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 30 13:05:45.787708 kernel: PCI: CLS 32 bytes, default 64 Jan 30 13:05:45.787715 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 30 13:05:45.787722 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 30 13:05:45.787731 kernel: clocksource: Switched to clocksource tsc Jan 30 13:05:45.787737 kernel: Initialise system trusted keyrings Jan 30 13:05:45.787743 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 30 13:05:45.787749 kernel: Key type asymmetric registered Jan 30 13:05:45.787755 kernel: Asymmetric key parser 'x509' registered Jan 30 13:05:45.787761 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 30 13:05:45.787768 kernel: io scheduler mq-deadline registered Jan 30 13:05:45.787774 kernel: io scheduler kyber registered Jan 30 13:05:45.787781 kernel: io scheduler bfq registered Jan 30 13:05:45.787838 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 30 13:05:45.787896 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.787950 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 30 13:05:45.788002 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788057 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 30 13:05:45.788114 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788170 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 30 13:05:45.788222 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788277 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 30 13:05:45.788329 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788381 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 30 13:05:45.788434 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788488 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 30 13:05:45.788542 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788595 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 30 13:05:45.788712 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788767 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 30 13:05:45.788820 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788874 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 30 13:05:45.788930 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.788985 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 30 13:05:45.789039 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.789092 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 30 13:05:45.789145 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.789210 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 30 13:05:45.789266 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.789320 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 30 13:05:45.789373 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.789427 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 30 13:05:45.789479 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.789536 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 30 13:05:45.789588 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.790145 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 30 13:05:45.790207 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.790262 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 30 13:05:45.790315 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.790368 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 30 13:05:45.790424 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.790477 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 30 13:05:45.790529 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.790582 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 30 13:05:45.790656 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.790713 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 30 13:05:45.790769 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.790823 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 30 13:05:45.790876 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.790929 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 30 13:05:45.790981 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.791037 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 30 13:05:45.791089 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.791143 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 30 13:05:45.791196 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.791249 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 30 13:05:45.791302 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.791358 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 30 13:05:45.791410 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.791464 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 30 13:05:45.791517 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.791570 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 30 13:05:45.791665 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.791722 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 30 13:05:45.793657 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.793725 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 30 13:05:45.793782 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 30 13:05:45.793792 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 30 13:05:45.793801 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 30 13:05:45.793808 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 30 13:05:45.793814 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 30 13:05:45.793820 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 30 13:05:45.793827 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 30 13:05:45.793880 kernel: rtc_cmos 00:01: registered as rtc0 Jan 30 13:05:45.793929 kernel: rtc_cmos 00:01: setting system clock to 2025-01-30T13:05:45 UTC (1738242345) Jan 30 13:05:45.793975 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 30 13:05:45.793987 kernel: intel_pstate: CPU model not supported Jan 30 13:05:45.793993 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 30 13:05:45.794000 kernel: NET: Registered PF_INET6 protocol family Jan 30 13:05:45.794006 kernel: Segment Routing with IPv6 Jan 30 13:05:45.794013 kernel: In-situ OAM (IOAM) with IPv6 Jan 30 13:05:45.794019 kernel: NET: Registered PF_PACKET protocol family Jan 30 13:05:45.794025 kernel: Key type dns_resolver registered Jan 30 13:05:45.794031 kernel: IPI shorthand broadcast: enabled Jan 30 13:05:45.794038 kernel: sched_clock: Marking stable (908067637, 228752517)->(1196418366, -59598212) Jan 30 13:05:45.794045 kernel: registered taskstats version 1 Jan 30 13:05:45.794051 kernel: Loading compiled-in X.509 certificates Jan 30 13:05:45.794058 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: 7f0738935740330d55027faa5877e7155d5f24f4' Jan 30 13:05:45.794064 kernel: Key type .fscrypt registered Jan 30 13:05:45.794070 kernel: Key type fscrypt-provisioning registered Jan 30 13:05:45.794076 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 30 13:05:45.794082 kernel: ima: Allocated hash algorithm: sha1 Jan 30 13:05:45.794089 kernel: ima: No architecture policies found Jan 30 13:05:45.794095 kernel: clk: Disabling unused clocks Jan 30 13:05:45.794102 kernel: Freeing unused kernel image (initmem) memory: 43320K Jan 30 13:05:45.794109 kernel: Write protecting the kernel read-only data: 38912k Jan 30 13:05:45.794117 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K Jan 30 13:05:45.794123 kernel: Run /init as init process Jan 30 13:05:45.794130 kernel: with arguments: Jan 30 13:05:45.794136 kernel: /init Jan 30 13:05:45.794142 kernel: with environment: Jan 30 13:05:45.794148 kernel: HOME=/ Jan 30 13:05:45.794154 kernel: TERM=linux Jan 30 13:05:45.794162 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 30 13:05:45.794170 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 30 13:05:45.794177 systemd[1]: Detected virtualization vmware. Jan 30 13:05:45.794184 systemd[1]: Detected architecture x86-64. Jan 30 13:05:45.794190 systemd[1]: Running in initrd. Jan 30 13:05:45.794196 systemd[1]: No hostname configured, using default hostname. Jan 30 13:05:45.794203 systemd[1]: Hostname set to . Jan 30 13:05:45.794210 systemd[1]: Initializing machine ID from random generator. Jan 30 13:05:45.794217 systemd[1]: Queued start job for default target initrd.target. Jan 30 13:05:45.794223 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 30 13:05:45.794230 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 30 13:05:45.794237 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 30 13:05:45.794244 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 30 13:05:45.794251 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 30 13:05:45.794257 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 30 13:05:45.794266 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 30 13:05:45.794273 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 30 13:05:45.794279 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 30 13:05:45.794286 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 30 13:05:45.794292 systemd[1]: Reached target paths.target - Path Units. Jan 30 13:05:45.794299 systemd[1]: Reached target slices.target - Slice Units. Jan 30 13:05:45.794305 systemd[1]: Reached target swap.target - Swaps. Jan 30 13:05:45.794313 systemd[1]: Reached target timers.target - Timer Units. Jan 30 13:05:45.794320 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 30 13:05:45.794326 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 30 13:05:45.794333 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 30 13:05:45.794339 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 30 13:05:45.794345 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 30 13:05:45.794352 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 30 13:05:45.794359 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 30 13:05:45.794365 systemd[1]: Reached target sockets.target - Socket Units. Jan 30 13:05:45.794373 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 30 13:05:45.794380 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 30 13:05:45.794386 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 30 13:05:45.794392 systemd[1]: Starting systemd-fsck-usr.service... Jan 30 13:05:45.794399 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 30 13:05:45.794405 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 30 13:05:45.794412 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 30 13:05:45.794418 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 30 13:05:45.794437 systemd-journald[217]: Collecting audit messages is disabled. Jan 30 13:05:45.794456 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 30 13:05:45.794463 systemd[1]: Finished systemd-fsck-usr.service. Jan 30 13:05:45.794471 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 30 13:05:45.794478 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 13:05:45.794485 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 30 13:05:45.794491 kernel: Bridge firewalling registered Jan 30 13:05:45.794498 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 30 13:05:45.794504 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 30 13:05:45.794513 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 30 13:05:45.794519 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 30 13:05:45.794526 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 30 13:05:45.794532 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 13:05:45.794540 systemd-journald[217]: Journal started Jan 30 13:05:45.794554 systemd-journald[217]: Runtime Journal (/run/log/journal/50d88166ddb3455f8851ee758043dcca) is 4.8M, max 38.6M, 33.8M free. Jan 30 13:05:45.751245 systemd-modules-load[218]: Inserted module 'overlay' Jan 30 13:05:45.797147 systemd[1]: Started systemd-journald.service - Journal Service. Jan 30 13:05:45.770817 systemd-modules-load[218]: Inserted module 'br_netfilter' Jan 30 13:05:45.799889 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 30 13:05:45.800103 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 30 13:05:45.802046 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 30 13:05:45.802791 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 30 13:05:45.809534 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 30 13:05:45.811190 dracut-cmdline[246]: dracut-dracut-053 Jan 30 13:05:45.814955 dracut-cmdline[246]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=fe60919b0c6f6abb7495678f87f7024e97a038fc343fa31a123a43ef5f489466 Jan 30 13:05:45.814101 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 30 13:05:45.832625 systemd-resolved[258]: Positive Trust Anchors: Jan 30 13:05:45.832841 systemd-resolved[258]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 30 13:05:45.832865 systemd-resolved[258]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 30 13:05:45.835849 systemd-resolved[258]: Defaulting to hostname 'linux'. Jan 30 13:05:45.836436 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 30 13:05:45.836588 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 30 13:05:45.866644 kernel: SCSI subsystem initialized Jan 30 13:05:45.873632 kernel: Loading iSCSI transport class v2.0-870. Jan 30 13:05:45.879628 kernel: iscsi: registered transport (tcp) Jan 30 13:05:45.894999 kernel: iscsi: registered transport (qla4xxx) Jan 30 13:05:45.895048 kernel: QLogic iSCSI HBA Driver Jan 30 13:05:45.915319 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 30 13:05:45.919715 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 30 13:05:45.934880 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 30 13:05:45.934948 kernel: device-mapper: uevent: version 1.0.3 Jan 30 13:05:45.935996 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 30 13:05:45.970641 kernel: raid6: avx2x4 gen() 46804 MB/s Jan 30 13:05:45.984636 kernel: raid6: avx2x2 gen() 53018 MB/s Jan 30 13:05:46.001856 kernel: raid6: avx2x1 gen() 44538 MB/s Jan 30 13:05:46.001894 kernel: raid6: using algorithm avx2x2 gen() 53018 MB/s Jan 30 13:05:46.020890 kernel: raid6: .... xor() 32167 MB/s, rmw enabled Jan 30 13:05:46.020938 kernel: raid6: using avx2x2 recovery algorithm Jan 30 13:05:46.034635 kernel: xor: automatically using best checksumming function avx Jan 30 13:05:46.125640 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 30 13:05:46.130599 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 30 13:05:46.135730 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 30 13:05:46.143436 systemd-udevd[435]: Using default interface naming scheme 'v255'. Jan 30 13:05:46.145948 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 30 13:05:46.151770 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 30 13:05:46.158791 dracut-pre-trigger[440]: rd.md=0: removing MD RAID activation Jan 30 13:05:46.174472 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 30 13:05:46.178782 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 30 13:05:46.248938 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 30 13:05:46.251736 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 30 13:05:46.264262 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 30 13:05:46.265049 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 30 13:05:46.265166 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 30 13:05:46.265284 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 30 13:05:46.269743 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 30 13:05:46.280285 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 30 13:05:46.318977 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 30 13:05:46.319013 kernel: vmw_pvscsi: using 64bit dma Jan 30 13:05:46.323631 kernel: vmw_pvscsi: max_id: 16 Jan 30 13:05:46.323661 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 30 13:05:46.325943 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 30 13:05:46.325969 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 30 13:05:46.325982 kernel: vmw_pvscsi: using MSI-X Jan 30 13:05:46.329634 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 30 13:05:46.334676 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 30 13:05:46.342629 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 30 13:05:46.345607 kernel: cryptd: max_cpu_qlen set to 1000 Jan 30 13:05:46.345626 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 30 13:05:46.345649 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 30 13:05:46.351922 kernel: libata version 3.00 loaded. 
Jan 30 13:05:46.351933 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 30 13:05:46.354679 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 30 13:05:46.361625 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 30 13:05:46.367612 kernel: scsi host1: ata_piix Jan 30 13:05:46.367722 kernel: scsi host2: ata_piix Jan 30 13:05:46.367791 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 30 13:05:46.367801 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 30 13:05:46.367812 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 30 13:05:46.383510 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 30 13:05:46.383592 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 30 13:05:46.383674 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 30 13:05:46.383737 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 30 13:05:46.383800 kernel: AVX2 version of gcm_enc/dec engaged. Jan 30 13:05:46.383809 kernel: AES CTR mode by8 optimization enabled Jan 30 13:05:46.383817 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 30 13:05:46.383828 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 30 13:05:46.360804 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 30 13:05:46.360874 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 13:05:46.361053 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 30 13:05:46.361145 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 30 13:05:46.361211 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 13:05:46.361314 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 30 13:05:46.367797 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 30 13:05:46.387828 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 13:05:46.396700 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 30 13:05:46.406849 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 13:05:46.536690 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 30 13:05:46.539659 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 30 13:05:46.561631 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 30 13:05:46.575387 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 30 13:05:46.575398 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 30 13:05:46.605351 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 30 13:05:46.606173 kernel: BTRFS: device fsid f8084233-4a6f-4e67-af0b-519e43b19e58 devid 1 transid 41 /dev/sda3 scanned by (udev-worker) (489) Jan 30 13:05:46.609637 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (490) Jan 30 13:05:46.609513 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 30 13:05:46.613049 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 30 13:05:46.613327 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 30 13:05:46.616142 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 30 13:05:46.622699 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 30 13:05:46.645633 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 30 13:05:46.649634 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 30 13:05:47.651643 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 30 13:05:47.652602 disk-uuid[595]: The operation has completed successfully. Jan 30 13:05:47.737806 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 30 13:05:47.737877 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 30 13:05:47.742728 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 30 13:05:47.747167 sh[613]: Success Jan 30 13:05:47.757693 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 30 13:05:47.804102 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 30 13:05:47.809604 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 30 13:05:47.810057 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 30 13:05:47.827079 kernel: BTRFS info (device dm-0): first mount of filesystem f8084233-4a6f-4e67-af0b-519e43b19e58 Jan 30 13:05:47.827124 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 30 13:05:47.827133 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 30 13:05:47.829018 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 30 13:05:47.829041 kernel: BTRFS info (device dm-0): using free space tree Jan 30 13:05:47.837632 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 30 13:05:47.839029 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 30 13:05:47.852782 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 30 13:05:47.854263 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 30 13:05:47.873340 kernel: BTRFS info (device sda6): first mount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:47.873376 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 30 13:05:47.873385 kernel: BTRFS info (device sda6): using free space tree Jan 30 13:05:47.878631 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 30 13:05:47.889111 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 30 13:05:47.889636 kernel: BTRFS info (device sda6): last unmount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:47.895365 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 30 13:05:47.901699 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 30 13:05:47.924756 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 30 13:05:47.929774 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Jan 30 13:05:47.976780 ignition[672]: Ignition 2.20.0 Jan 30 13:05:47.976803 ignition[672]: Stage: fetch-offline Jan 30 13:05:47.976823 ignition[672]: no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:47.976828 ignition[672]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:47.976883 ignition[672]: parsed url from cmdline: "" Jan 30 13:05:47.976884 ignition[672]: no config URL provided Jan 30 13:05:47.976887 ignition[672]: reading system config file "/usr/lib/ignition/user.ign" Jan 30 13:05:47.976891 ignition[672]: no config at "/usr/lib/ignition/user.ign" Jan 30 13:05:47.977253 ignition[672]: config successfully fetched Jan 30 13:05:47.977270 ignition[672]: parsing config with SHA512: 97edc499e6ba7c4b478a3e418684fc44dcbf50f8eabd58698f6592e97a2a21e21bbb280b9fbd36866ff7a96673efd2d0b7370b0d2d67ceb5638e6d261b72e8b7 Jan 30 13:05:47.982082 unknown[672]: fetched base config from "system" Jan 30 13:05:47.982089 unknown[672]: fetched user config from "vmware" Jan 30 13:05:47.982586 ignition[672]: fetch-offline: fetch-offline passed Jan 30 13:05:47.982670 ignition[672]: Ignition finished successfully Jan 30 13:05:47.983511 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 30 13:05:47.994302 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 30 13:05:47.998730 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 30 13:05:48.010451 systemd-networkd[806]: lo: Link UP Jan 30 13:05:48.010742 systemd-networkd[806]: lo: Gained carrier Jan 30 13:05:48.011461 systemd-networkd[806]: Enumeration completed Jan 30 13:05:48.011669 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 30 13:05:48.011897 systemd[1]: Reached target network.target - Network. Jan 30 13:05:48.012095 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). 
Jan 30 13:05:48.015273 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 30 13:05:48.015394 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 30 13:05:48.012355 systemd-networkd[806]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jan 30 13:05:48.015884 systemd-networkd[806]: ens192: Link UP Jan 30 13:05:48.015887 systemd-networkd[806]: ens192: Gained carrier Jan 30 13:05:48.022777 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 30 13:05:48.030375 ignition[808]: Ignition 2.20.0 Jan 30 13:05:48.030383 ignition[808]: Stage: kargs Jan 30 13:05:48.030485 ignition[808]: no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:48.030491 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:48.031009 ignition[808]: kargs: kargs passed Jan 30 13:05:48.031034 ignition[808]: Ignition finished successfully Jan 30 13:05:48.032116 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 30 13:05:48.035733 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 30 13:05:48.042728 ignition[816]: Ignition 2.20.0 Jan 30 13:05:48.042739 ignition[816]: Stage: disks Jan 30 13:05:48.042875 ignition[816]: no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:48.042881 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:48.043946 ignition[816]: disks: disks passed Jan 30 13:05:48.043989 ignition[816]: Ignition finished successfully Jan 30 13:05:48.044755 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 30 13:05:48.045106 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 30 13:05:48.045351 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 30 13:05:48.045602 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 30 13:05:48.045836 systemd[1]: Reached target sysinit.target - System Initialization. 
Jan 30 13:05:48.046060 systemd[1]: Reached target basic.target - Basic System. Jan 30 13:05:48.049695 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 30 13:05:48.337836 systemd-fsck[824]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 30 13:05:48.341108 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 30 13:05:48.351715 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 30 13:05:48.497673 kernel: EXT4-fs (sda9): mounted filesystem cdc615db-d057-439f-af25-aa57b1c399e2 r/w with ordered data mode. Quota mode: none. Jan 30 13:05:48.498488 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 30 13:05:48.499056 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 30 13:05:48.515708 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 30 13:05:48.524195 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 30 13:05:48.524643 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 30 13:05:48.524828 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 30 13:05:48.524845 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 30 13:05:48.528386 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 30 13:05:48.531728 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 30 13:05:48.600637 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (832) Jan 30 13:05:48.619701 kernel: BTRFS info (device sda6): first mount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:48.619745 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 30 13:05:48.619754 kernel: BTRFS info (device sda6): using free space tree Jan 30 13:05:48.688635 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 30 13:05:48.693709 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 30 13:05:48.717196 initrd-setup-root[856]: cut: /sysroot/etc/passwd: No such file or directory Jan 30 13:05:48.722413 initrd-setup-root[863]: cut: /sysroot/etc/group: No such file or directory Jan 30 13:05:48.726391 initrd-setup-root[870]: cut: /sysroot/etc/shadow: No such file or directory Jan 30 13:05:48.728678 initrd-setup-root[877]: cut: /sysroot/etc/gshadow: No such file or directory Jan 30 13:05:48.831304 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 30 13:05:48.835709 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 30 13:05:48.838139 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 30 13:05:48.841183 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 30 13:05:48.841625 kernel: BTRFS info (device sda6): last unmount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:48.857870 ignition[945]: INFO : Ignition 2.20.0 Jan 30 13:05:48.857870 ignition[945]: INFO : Stage: mount Jan 30 13:05:48.858453 ignition[945]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:48.858453 ignition[945]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:48.859198 ignition[945]: INFO : mount: mount passed Jan 30 13:05:48.859348 ignition[945]: INFO : Ignition finished successfully Jan 30 13:05:48.859321 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jan 30 13:05:48.860183 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 30 13:05:48.863812 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 30 13:05:48.868275 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 30 13:05:48.975640 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (957) Jan 30 13:05:48.986812 kernel: BTRFS info (device sda6): first mount of filesystem 8f723f8b-dc93-4eaf-8b2c-0038aa5af52c Jan 30 13:05:48.986845 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 30 13:05:48.986854 kernel: BTRFS info (device sda6): using free space tree Jan 30 13:05:49.042719 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 30 13:05:49.047414 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 30 13:05:49.065527 ignition[974]: INFO : Ignition 2.20.0 Jan 30 13:05:49.065527 ignition[974]: INFO : Stage: files Jan 30 13:05:49.066069 ignition[974]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:49.066069 ignition[974]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:49.066283 ignition[974]: DEBUG : files: compiled without relabeling support, skipping Jan 30 13:05:49.075878 ignition[974]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 30 13:05:49.075878 ignition[974]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 30 13:05:49.114157 ignition[974]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 30 13:05:49.114514 ignition[974]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 30 13:05:49.114911 unknown[974]: wrote ssh authorized keys file for user: core Jan 30 13:05:49.115260 ignition[974]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 30 13:05:49.140637 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): 
[started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 30 13:05:49.141206 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jan 30 13:05:49.175740 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 30 13:05:49.209976 systemd-networkd[806]: ens192: Gained IPv6LL Jan 30 13:05:49.262744 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 30 13:05:49.262744 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 30 13:05:49.263273 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 30 13:05:49.269670 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 30 13:05:49.269886 ignition[974]: INFO : 
files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 30 13:05:49.269886 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 30 13:05:49.269886 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 30 13:05:49.269886 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 30 13:05:49.270813 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 Jan 30 13:05:49.793237 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 30 13:05:50.143493 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 30 13:05:50.143927 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 30 13:05:50.143927 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 30 13:05:50.143927 ignition[974]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at 
"/sysroot/etc/systemd/system/prepare-helm.service" Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Jan 30 13:05:50.144412 ignition[974]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Jan 30 13:05:50.213030 ignition[974]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Jan 30 13:05:50.215525 ignition[974]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Jan 30 13:05:50.215525 ignition[974]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Jan 30 13:05:50.215525 ignition[974]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Jan 30 13:05:50.215525 ignition[974]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Jan 30 13:05:50.216863 ignition[974]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 30 13:05:50.216863 ignition[974]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 30 13:05:50.216863 ignition[974]: INFO : files: files passed Jan 30 13:05:50.216863 ignition[974]: INFO : Ignition finished successfully Jan 30 13:05:50.217516 systemd[1]: Finished 
ignition-files.service - Ignition (files). Jan 30 13:05:50.220750 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 30 13:05:50.222318 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 30 13:05:50.224497 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 30 13:05:50.224708 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 30 13:05:50.229074 initrd-setup-root-after-ignition[1004]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 30 13:05:50.229074 initrd-setup-root-after-ignition[1004]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 30 13:05:50.230043 initrd-setup-root-after-ignition[1008]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 30 13:05:50.231038 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 30 13:05:50.231425 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 30 13:05:50.234743 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 30 13:05:50.248884 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 30 13:05:50.248953 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 30 13:05:50.249416 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 30 13:05:50.249550 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 30 13:05:50.249763 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 30 13:05:50.250265 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 30 13:05:50.260134 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. 
Jan 30 13:05:50.263728 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 30 13:05:50.269802 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 30 13:05:50.270171 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 30 13:05:50.270335 systemd[1]: Stopped target timers.target - Timer Units. Jan 30 13:05:50.270474 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 30 13:05:50.270555 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 30 13:05:50.270961 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 30 13:05:50.271184 systemd[1]: Stopped target basic.target - Basic System. Jan 30 13:05:50.271377 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 30 13:05:50.271568 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 30 13:05:50.271772 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 30 13:05:50.271976 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 30 13:05:50.272323 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 30 13:05:50.272543 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 30 13:05:50.272775 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 30 13:05:50.272973 systemd[1]: Stopped target swap.target - Swaps. Jan 30 13:05:50.273137 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 30 13:05:50.273207 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 30 13:05:50.273464 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 30 13:05:50.273702 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 30 13:05:50.273878 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Jan 30 13:05:50.273927 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 30 13:05:50.274088 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 30 13:05:50.274147 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 30 13:05:50.274389 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 30 13:05:50.274451 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 30 13:05:50.274719 systemd[1]: Stopped target paths.target - Path Units. Jan 30 13:05:50.274863 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 30 13:05:50.276646 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 30 13:05:50.276803 systemd[1]: Stopped target slices.target - Slice Units. Jan 30 13:05:50.277000 systemd[1]: Stopped target sockets.target - Socket Units. Jan 30 13:05:50.277184 systemd[1]: iscsid.socket: Deactivated successfully. Jan 30 13:05:50.277260 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 30 13:05:50.277469 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 30 13:05:50.277515 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 30 13:05:50.277762 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 30 13:05:50.277826 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 30 13:05:50.278063 systemd[1]: ignition-files.service: Deactivated successfully. Jan 30 13:05:50.278121 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 30 13:05:50.285800 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 30 13:05:50.285921 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 30 13:05:50.286018 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. 
Jan 30 13:05:50.288494 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 30 13:05:50.288611 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 30 13:05:50.288718 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 30 13:05:50.289016 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 30 13:05:50.289104 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 30 13:05:50.291928 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 30 13:05:50.292006 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 30 13:05:50.295632 ignition[1028]: INFO : Ignition 2.20.0 Jan 30 13:05:50.295632 ignition[1028]: INFO : Stage: umount Jan 30 13:05:50.295632 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 30 13:05:50.295632 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 30 13:05:50.297326 ignition[1028]: INFO : umount: umount passed Jan 30 13:05:50.297326 ignition[1028]: INFO : Ignition finished successfully Jan 30 13:05:50.297329 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 30 13:05:50.298285 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 30 13:05:50.298515 systemd[1]: Stopped target network.target - Network. Jan 30 13:05:50.298600 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 30 13:05:50.298671 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 30 13:05:50.298774 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 30 13:05:50.298797 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 30 13:05:50.298897 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 30 13:05:50.298918 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 30 13:05:50.299016 systemd[1]: ignition-setup-pre.service: Deactivated successfully. 
Jan 30 13:05:50.299038 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 30 13:05:50.299221 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 30 13:05:50.299397 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 30 13:05:50.303964 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 30 13:05:50.304072 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 30 13:05:50.306334 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 30 13:05:50.306735 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 30 13:05:50.306802 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 30 13:05:50.307253 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 30 13:05:50.307282 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 30 13:05:50.311671 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 30 13:05:50.311770 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 30 13:05:50.311800 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 30 13:05:50.311936 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Jan 30 13:05:50.311960 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 30 13:05:50.312077 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 30 13:05:50.312098 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 30 13:05:50.312203 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 30 13:05:50.312223 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 30 13:05:50.312329 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Jan 30 13:05:50.312349 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 30 13:05:50.312505 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 30 13:05:50.320196 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 30 13:05:50.320267 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 30 13:05:50.325027 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 30 13:05:50.325106 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 30 13:05:50.325410 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 30 13:05:50.325437 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 30 13:05:50.325693 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 30 13:05:50.325717 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 30 13:05:50.325835 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 30 13:05:50.325859 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 30 13:05:50.326131 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 30 13:05:50.326153 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 30 13:05:50.326437 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 30 13:05:50.326459 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 13:05:50.332849 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 30 13:05:50.333170 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 30 13:05:50.333209 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 30 13:05:50.333336 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Jan 30 13:05:50.333359 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 13:05:50.336834 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 30 13:05:50.336900 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 30 13:05:50.426150 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 30 13:05:50.426235 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 30 13:05:50.426612 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 30 13:05:50.426771 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 30 13:05:50.426806 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 30 13:05:50.430721 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 30 13:05:50.435944 systemd[1]: Switching root. Jan 30 13:05:50.481960 systemd-journald[217]: Journal stopped Jan 30 13:05:51.864434 systemd-journald[217]: Received SIGTERM from PID 1 (systemd). Jan 30 13:05:51.864468 kernel: SELinux: policy capability network_peer_controls=1 Jan 30 13:05:51.864478 kernel: SELinux: policy capability open_perms=1 Jan 30 13:05:51.864484 kernel: SELinux: policy capability extended_socket_class=1 Jan 30 13:05:51.864490 kernel: SELinux: policy capability always_check_network=0 Jan 30 13:05:51.864496 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 30 13:05:51.864504 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 30 13:05:51.864510 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 30 13:05:51.864516 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 30 13:05:51.864523 systemd[1]: Successfully loaded SELinux policy in 36.233ms. Jan 30 13:05:51.864530 kernel: audit: type=1403 audit(1738242351.160:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 30 13:05:51.864537 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 6.994ms. 
Jan 30 13:05:51.864545 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 30 13:05:51.864553 systemd[1]: Detected virtualization vmware. Jan 30 13:05:51.864561 systemd[1]: Detected architecture x86-64. Jan 30 13:05:51.864567 systemd[1]: Detected first boot. Jan 30 13:05:51.864574 systemd[1]: Initializing machine ID from random generator. Jan 30 13:05:51.864583 zram_generator::config[1070]: No configuration found. Jan 30 13:05:51.864591 systemd[1]: Populated /etc with preset unit settings. Jan 30 13:05:51.864599 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 30 13:05:51.864607 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Jan 30 13:05:51.871185 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 30 13:05:51.871214 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 30 13:05:51.871223 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 30 13:05:51.871231 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 30 13:05:51.871243 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 30 13:05:51.871250 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 30 13:05:51.871258 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 30 13:05:51.871265 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. 
Jan 30 13:05:51.871272 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 30 13:05:51.871281 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 30 13:05:51.871288 systemd[1]: Created slice user.slice - User and Session Slice. Jan 30 13:05:51.871297 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 30 13:05:51.871304 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 30 13:05:51.871311 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 30 13:05:51.871318 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 30 13:05:51.871325 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 30 13:05:51.871332 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 30 13:05:51.871340 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 30 13:05:51.871347 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 30 13:05:51.871356 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 30 13:05:51.871365 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 30 13:05:51.871374 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 30 13:05:51.871381 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 30 13:05:51.871389 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 30 13:05:51.871396 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 30 13:05:51.871403 systemd[1]: Reached target slices.target - Slice Units. Jan 30 13:05:51.871412 systemd[1]: Reached target swap.target - Swaps. 
Jan 30 13:05:51.871419 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 30 13:05:51.871426 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 30 13:05:51.871433 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 30 13:05:51.871441 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 30 13:05:51.871450 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 30 13:05:51.871457 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 30 13:05:51.871465 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 30 13:05:51.871474 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 30 13:05:51.871481 systemd[1]: Mounting media.mount - External Media Directory... Jan 30 13:05:51.871488 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 13:05:51.871496 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 30 13:05:51.871504 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 30 13:05:51.871512 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 30 13:05:51.871520 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 30 13:05:51.871528 systemd[1]: Reached target machines.target - Containers. Jan 30 13:05:51.871535 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 30 13:05:51.871543 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Jan 30 13:05:51.871550 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jan 30 13:05:51.871557 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 30 13:05:51.871565 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 30 13:05:51.871574 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 30 13:05:51.871583 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 30 13:05:51.871590 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 30 13:05:51.871598 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 30 13:05:51.871605 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 30 13:05:51.871613 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 30 13:05:51.871629 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 30 13:05:51.871637 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 30 13:05:51.871644 systemd[1]: Stopped systemd-fsck-usr.service. Jan 30 13:05:51.871653 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 30 13:05:51.871661 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 30 13:05:51.871668 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 30 13:05:51.871677 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 30 13:05:51.871685 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 30 13:05:51.871693 kernel: fuse: init (API version 7.39) Jan 30 13:05:51.871700 systemd[1]: verity-setup.service: Deactivated successfully. Jan 30 13:05:51.871707 systemd[1]: Stopped verity-setup.service. 
Jan 30 13:05:51.871715 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 13:05:51.871723 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 30 13:05:51.871732 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 30 13:05:51.871739 systemd[1]: Mounted media.mount - External Media Directory. Jan 30 13:05:51.871747 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 30 13:05:51.871756 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 30 13:05:51.871763 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 30 13:05:51.871770 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 30 13:05:51.871778 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 30 13:05:51.871787 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 30 13:05:51.871795 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 30 13:05:51.871802 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 30 13:05:51.871809 kernel: loop: module loaded Jan 30 13:05:51.871816 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 30 13:05:51.871825 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 30 13:05:51.871833 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 30 13:05:51.871840 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 30 13:05:51.871847 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 30 13:05:51.871856 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 30 13:05:51.871864 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Jan 30 13:05:51.871871 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 30 13:05:51.871878 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 30 13:05:51.871886 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 30 13:05:51.871894 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 30 13:05:51.871920 systemd-journald[1160]: Collecting audit messages is disabled. Jan 30 13:05:51.871940 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 30 13:05:51.871948 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 30 13:05:51.871958 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 30 13:05:51.871969 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 30 13:05:51.871980 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 30 13:05:51.871990 systemd-journald[1160]: Journal started Jan 30 13:05:51.872006 systemd-journald[1160]: Runtime Journal (/run/log/journal/1f830a5a43024c739a91fe8a9245cd26) is 4.8M, max 38.6M, 33.8M free. Jan 30 13:05:51.641057 systemd[1]: Queued start job for default target multi-user.target. Jan 30 13:05:51.659183 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 30 13:05:51.659443 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 30 13:05:51.872555 jq[1137]: true Jan 30 13:05:51.874981 jq[1169]: true Jan 30 13:05:51.881772 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 30 13:05:51.881818 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Jan 30 13:05:51.884638 kernel: ACPI: bus type drm_connector registered Jan 30 13:05:51.900841 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 30 13:05:51.900885 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 30 13:05:51.903628 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 30 13:05:51.907630 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 30 13:05:51.912636 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 30 13:05:51.917594 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 30 13:05:51.917662 systemd[1]: Started systemd-journald.service - Journal Service. Jan 30 13:05:51.919400 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 30 13:05:51.919694 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 30 13:05:51.919956 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 30 13:05:51.922863 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 30 13:05:51.923060 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 30 13:05:51.923313 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 30 13:05:51.939951 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 30 13:05:51.944800 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 30 13:05:51.949947 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 30 13:05:51.955336 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... 
Jan 30 13:05:51.960086 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 30 13:05:51.969210 kernel: loop0: detected capacity change from 0 to 138184 Jan 30 13:05:51.971438 systemd-journald[1160]: Time spent on flushing to /var/log/journal/1f830a5a43024c739a91fe8a9245cd26 is 73.576ms for 1837 entries. Jan 30 13:05:51.971438 systemd-journald[1160]: System Journal (/var/log/journal/1f830a5a43024c739a91fe8a9245cd26) is 8.0M, max 584.8M, 576.8M free. Jan 30 13:05:52.074018 systemd-journald[1160]: Received client request to flush runtime journal. Jan 30 13:05:52.074056 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 30 13:05:52.074069 kernel: loop1: detected capacity change from 0 to 205544 Jan 30 13:05:51.981629 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 30 13:05:51.989191 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 30 13:05:51.993817 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 30 13:05:52.007543 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 30 13:05:52.017542 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 30 13:05:52.062990 udevadm[1220]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Jan 30 13:05:52.080908 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 30 13:05:52.081875 ignition[1188]: Ignition 2.20.0 Jan 30 13:05:52.082105 ignition[1188]: deleting config from guestinfo properties Jan 30 13:05:52.086644 ignition[1188]: Successfully deleted config Jan 30 13:05:52.088133 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Jan 30 13:05:52.148823 systemd[1]: Finished systemd-sysusers.service - Create System Users. 
Jan 30 13:05:52.157311 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 30 13:05:52.181635 kernel: loop2: detected capacity change from 0 to 2960 Jan 30 13:05:52.191527 systemd-tmpfiles[1234]: ACLs are not supported, ignoring. Jan 30 13:05:52.191540 systemd-tmpfiles[1234]: ACLs are not supported, ignoring. Jan 30 13:05:52.196703 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 30 13:05:52.230637 kernel: loop3: detected capacity change from 0 to 141000 Jan 30 13:05:52.280631 kernel: loop4: detected capacity change from 0 to 138184 Jan 30 13:05:52.484639 kernel: loop5: detected capacity change from 0 to 205544 Jan 30 13:05:52.508675 kernel: loop6: detected capacity change from 0 to 2960 Jan 30 13:05:52.544641 kernel: loop7: detected capacity change from 0 to 141000 Jan 30 13:05:52.560289 (sd-merge)[1239]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Jan 30 13:05:52.560645 (sd-merge)[1239]: Merged extensions into '/usr'. Jan 30 13:05:52.568669 systemd[1]: Reloading requested from client PID 1187 ('systemd-sysext') (unit systemd-sysext.service)... Jan 30 13:05:52.568681 systemd[1]: Reloading... Jan 30 13:05:52.599556 zram_generator::config[1261]: No configuration found. Jan 30 13:05:52.730762 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 30 13:05:52.748031 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 30 13:05:52.778472 systemd[1]: Reloading finished in 209 ms. Jan 30 13:05:52.812681 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. 
Jan 30 13:05:52.819744 systemd[1]: Starting ensure-sysext.service... Jan 30 13:05:52.820791 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 30 13:05:52.852830 systemd[1]: Reloading requested from client PID 1320 ('systemctl') (unit ensure-sysext.service)... Jan 30 13:05:52.852847 systemd[1]: Reloading... Jan 30 13:05:52.862710 systemd-tmpfiles[1321]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 30 13:05:52.862890 systemd-tmpfiles[1321]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 30 13:05:52.863405 systemd-tmpfiles[1321]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 30 13:05:52.863580 systemd-tmpfiles[1321]: ACLs are not supported, ignoring. Jan 30 13:05:52.863624 systemd-tmpfiles[1321]: ACLs are not supported, ignoring. Jan 30 13:05:52.871537 systemd-tmpfiles[1321]: Detected autofs mount point /boot during canonicalization of boot. Jan 30 13:05:52.871544 systemd-tmpfiles[1321]: Skipping /boot Jan 30 13:05:52.878792 systemd-tmpfiles[1321]: Detected autofs mount point /boot during canonicalization of boot. Jan 30 13:05:52.878800 systemd-tmpfiles[1321]: Skipping /boot Jan 30 13:05:52.909095 zram_generator::config[1345]: No configuration found. Jan 30 13:05:52.982365 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 30 13:05:52.997692 ldconfig[1180]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 30 13:05:52.999195 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Jan 30 13:05:53.029427 systemd[1]: Reloading finished in 176 ms. Jan 30 13:05:53.043115 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 30 13:05:53.043423 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 30 13:05:53.047897 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 30 13:05:53.052738 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 30 13:05:53.070728 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 30 13:05:53.071976 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 30 13:05:53.075713 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 30 13:05:53.078670 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 30 13:05:53.080722 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 30 13:05:53.089922 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 30 13:05:53.091466 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 13:05:53.093017 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 30 13:05:53.097803 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 30 13:05:53.104803 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 30 13:05:53.104985 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 30 13:05:53.105062 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Jan 30 13:05:53.108587 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 13:05:53.108730 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 30 13:05:53.108824 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 13:05:53.113105 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 30 13:05:53.116287 systemd-udevd[1413]: Using default interface naming scheme 'v255'. Jan 30 13:05:53.118221 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 13:05:53.123835 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 30 13:05:53.124048 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 30 13:05:53.124159 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 13:05:53.124530 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 30 13:05:53.124809 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 30 13:05:53.125162 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 30 13:05:53.125327 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 30 13:05:53.126234 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 30 13:05:53.128504 systemd[1]: Finished ensure-sysext.service. Jan 30 13:05:53.136795 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... 
Jan 30 13:05:53.137068 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 30 13:05:53.137162 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 30 13:05:53.137446 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 30 13:05:53.140454 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 30 13:05:53.140545 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 30 13:05:53.142822 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jan 30 13:05:53.144718 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jan 30 13:05:53.147393 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jan 30 13:05:53.160065 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 30 13:05:53.161624 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 30 13:05:53.181087 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jan 30 13:05:53.182400 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jan 30 13:05:53.186140 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jan 30 13:05:53.191023 augenrules[1471]: No rules
Jan 30 13:05:53.191954 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 30 13:05:53.192090 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 30 13:05:53.221504 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jan 30 13:05:53.222001 systemd[1]: Reached target time-set.target - System Time Set.
Jan 30 13:05:53.231246 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Jan 30 13:05:53.272137 systemd-resolved[1412]: Positive Trust Anchors:
Jan 30 13:05:53.272146 systemd-resolved[1412]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 30 13:05:53.272170 systemd-resolved[1412]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 30 13:05:53.275645 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Jan 30 13:05:53.282690 kernel: ACPI: button: Power Button [PWRF]
Jan 30 13:05:53.286005 systemd-resolved[1412]: Defaulting to hostname 'linux'.
Jan 30 13:05:53.287410 systemd-networkd[1451]: lo: Link UP
Jan 30 13:05:53.287414 systemd-networkd[1451]: lo: Gained carrier
Jan 30 13:05:53.288841 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 30 13:05:53.289016 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 30 13:05:53.289530 systemd-networkd[1451]: Enumeration completed
Jan 30 13:05:53.289573 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 30 13:05:53.290339 systemd[1]: Reached target network.target - Network.
Jan 30 13:05:53.291790 systemd-networkd[1451]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Jan 30 13:05:53.293819 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Jan 30 13:05:53.293959 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Jan 30 13:05:53.295192 systemd-networkd[1451]: ens192: Link UP
Jan 30 13:05:53.295327 systemd-networkd[1451]: ens192: Gained carrier
Jan 30 13:05:53.297745 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jan 30 13:05:53.299308 systemd-timesyncd[1434]: Network configuration changed, trying to establish connection.
Jan 30 13:05:53.306636 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1453)
Jan 30 13:05:53.354373 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Jan 30 13:05:53.362830 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jan 30 13:05:53.371946 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jan 30 13:05:53.375644 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Jan 30 13:05:53.387986 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Jan 30 13:05:53.396540 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3
Jan 30 13:05:53.396563 kernel: Guest personality initialized and is active
Jan 30 13:05:53.399630 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jan 30 13:05:53.399652 kernel: Initialized host personality
Jan 30 13:05:53.410633 kernel: mousedev: PS/2 mouse device common for all mice
Jan 30 13:05:53.411557 (udev-worker)[1462]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Jan 30 13:05:53.418737 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 30 13:05:53.427804 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Jan 30 13:05:53.434821 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Jan 30 13:05:53.443839 lvm[1504]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 30 13:05:53.462407 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Jan 30 13:05:53.464361 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 30 13:05:53.468841 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Jan 30 13:05:53.469081 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 30 13:05:53.469497 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 30 13:05:53.469667 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jan 30 13:05:53.469794 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jan 30 13:05:53.469993 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jan 30 13:05:53.470142 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jan 30 13:05:53.470257 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jan 30 13:05:53.470368 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jan 30 13:05:53.470388 systemd[1]: Reached target paths.target - Path Units.
Jan 30 13:05:53.470472 systemd[1]: Reached target timers.target - Timer Units.
Jan 30 13:05:53.471673 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jan 30 13:05:53.472538 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jan 30 13:05:53.474102 lvm[1508]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 30 13:05:53.475892 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jan 30 13:05:53.476397 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jan 30 13:05:53.476566 systemd[1]: Reached target sockets.target - Socket Units.
Jan 30 13:05:53.476669 systemd[1]: Reached target basic.target - Basic System.
Jan 30 13:05:53.476846 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jan 30 13:05:53.476862 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jan 30 13:05:53.477547 systemd[1]: Starting containerd.service - containerd container runtime...
Jan 30 13:05:53.478697 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jan 30 13:05:53.481232 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jan 30 13:05:53.482003 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jan 30 13:05:53.482221 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jan 30 13:05:53.483842 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jan 30 13:05:53.494705 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jan 30 13:05:53.497462 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jan 30 13:05:53.499722 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jan 30 13:05:53.511634 jq[1514]: false
Jan 30 13:05:53.503692 systemd[1]: Starting systemd-logind.service - User Login Management...
Jan 30 13:05:53.516294 extend-filesystems[1515]: Found loop4
Jan 30 13:05:53.516294 extend-filesystems[1515]: Found loop5
Jan 30 13:05:53.516294 extend-filesystems[1515]: Found loop6
Jan 30 13:05:53.516294 extend-filesystems[1515]: Found loop7
Jan 30 13:05:53.516294 extend-filesystems[1515]: Found sda
Jan 30 13:05:53.503976 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jan 30 13:05:53.517878 extend-filesystems[1515]: Found sda1
Jan 30 13:05:53.517878 extend-filesystems[1515]: Found sda2
Jan 30 13:05:53.517878 extend-filesystems[1515]: Found sda3
Jan 30 13:05:53.517878 extend-filesystems[1515]: Found usr
Jan 30 13:05:53.517878 extend-filesystems[1515]: Found sda4
Jan 30 13:05:53.517878 extend-filesystems[1515]: Found sda6
Jan 30 13:05:53.517878 extend-filesystems[1515]: Found sda7
Jan 30 13:05:53.517878 extend-filesystems[1515]: Found sda9
Jan 30 13:05:53.517878 extend-filesystems[1515]: Checking size of /dev/sda9
Jan 30 13:05:53.504375 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jan 30 13:05:53.505790 systemd[1]: Starting update-engine.service - Update Engine...
Jan 30 13:05:53.508094 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jan 30 13:05:53.514688 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Jan 30 13:05:53.515174 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Jan 30 13:05:53.515810 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jan 30 13:05:53.515900 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jan 30 13:05:53.516599 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jan 30 13:05:53.517003 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jan 30 13:05:53.534017 jq[1524]: true
Jan 30 13:05:53.539062 (ntainerd)[1543]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Jan 30 13:05:53.541796 systemd[1]: motdgen.service: Deactivated successfully.
Jan 30 13:05:53.542658 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jan 30 13:05:53.543553 jq[1542]: true
Jan 30 13:05:53.549775 dbus-daemon[1513]: [system] SELinux support is enabled
Jan 30 13:05:53.549989 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jan 30 13:05:53.553126 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jan 30 13:05:53.553324 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jan 30 13:05:53.553460 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jan 30 13:05:53.553477 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jan 30 13:05:53.558696 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Jan 30 13:05:53.560933 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Jan 30 13:05:53.570218 update_engine[1522]: I20250130 13:05:53.570173 1522 main.cc:92] Flatcar Update Engine starting
Jan 30 13:05:53.570775 tar[1528]: linux-amd64/helm
Jan 30 13:05:53.575735 systemd[1]: Started update-engine.service - Update Engine.
Jan 30 13:05:53.578199 update_engine[1522]: I20250130 13:05:53.578173 1522 update_check_scheduler.cc:74] Next update check in 4m13s
Jan 30 13:05:53.580878 extend-filesystems[1515]: Old size kept for /dev/sda9
Jan 30 13:05:53.580878 extend-filesystems[1515]: Found sr0
Jan 30 13:05:53.580725 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jan 30 13:05:53.582124 systemd[1]: extend-filesystems.service: Deactivated successfully.
Jan 30 13:05:53.582223 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Jan 30 13:05:53.583695 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Jan 30 13:05:53.591510 systemd-logind[1521]: Watching system buttons on /dev/input/event1 (Power Button)
Jan 30 13:05:53.591525 systemd-logind[1521]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Jan 30 13:05:53.592058 systemd-logind[1521]: New seat seat0.
Jan 30 13:05:53.593690 systemd[1]: Started systemd-logind.service - User Login Management.
Jan 30 13:05:53.600131 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1460)
Jan 30 13:05:53.624316 unknown[1554]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Jan 30 13:05:53.632311 unknown[1554]: Core dump limit set to -1
Jan 30 13:05:53.666172 kernel: NET: Registered PF_VSOCK protocol family
Jan 30 13:05:53.699035 locksmithd[1559]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jan 30 13:05:53.720913 bash[1579]: Updated "/home/core/.ssh/authorized_keys"
Jan 30 13:05:53.723657 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jan 30 13:05:53.727109 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Jan 30 13:05:53.763249 sshd_keygen[1550]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Jan 30 13:05:53.786674 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Jan 30 13:05:53.790962 systemd[1]: Starting issuegen.service - Generate /run/issue...
Jan 30 13:05:53.804909 systemd[1]: issuegen.service: Deactivated successfully.
Jan 30 13:05:53.805032 systemd[1]: Finished issuegen.service - Generate /run/issue.
Jan 30 13:05:53.810734 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Jan 30 13:05:53.830256 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Jan 30 13:05:53.834399 systemd[1]: Started getty@tty1.service - Getty on tty1.
Jan 30 13:05:53.838866 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Jan 30 13:05:53.839173 systemd[1]: Reached target getty.target - Login Prompts.
Jan 30 13:05:53.873333 containerd[1543]: time="2025-01-30T13:05:53.872774726Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Jan 30 13:05:53.894165 containerd[1543]: time="2025-01-30T13:05:53.894102332Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Jan 30 13:05:53.895532 containerd[1543]: time="2025-01-30T13:05:53.895514675Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Jan 30 13:05:53.895579 containerd[1543]: time="2025-01-30T13:05:53.895571543Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Jan 30 13:05:53.895626 containerd[1543]: time="2025-01-30T13:05:53.895608099Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Jan 30 13:05:53.895886 containerd[1543]: time="2025-01-30T13:05:53.895876128Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Jan 30 13:05:53.895925 containerd[1543]: time="2025-01-30T13:05:53.895918492Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Jan 30 13:05:53.895994 containerd[1543]: time="2025-01-30T13:05:53.895984322Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896635135Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896728612Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896738112Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896746073Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896751223Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896793255Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896902607Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896954536Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.896963710Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.897002725Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Jan 30 13:05:53.897140 containerd[1543]: time="2025-01-30T13:05:53.897029477Z" level=info msg="metadata content store policy set" policy=shared
Jan 30 13:05:53.899096 containerd[1543]: time="2025-01-30T13:05:53.899084518Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Jan 30 13:05:53.899159 containerd[1543]: time="2025-01-30T13:05:53.899144702Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Jan 30 13:05:53.899216 containerd[1543]: time="2025-01-30T13:05:53.899208291Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Jan 30 13:05:53.899361 containerd[1543]: time="2025-01-30T13:05:53.899253116Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Jan 30 13:05:53.899361 containerd[1543]: time="2025-01-30T13:05:53.899265455Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Jan 30 13:05:53.899361 containerd[1543]: time="2025-01-30T13:05:53.899335868Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Jan 30 13:05:53.899538 containerd[1543]: time="2025-01-30T13:05:53.899528960Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Jan 30 13:05:53.899718 containerd[1543]: time="2025-01-30T13:05:53.899708803Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Jan 30 13:05:53.899756 containerd[1543]: time="2025-01-30T13:05:53.899749212Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Jan 30 13:05:53.899790 containerd[1543]: time="2025-01-30T13:05:53.899783030Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Jan 30 13:05:53.899843 containerd[1543]: time="2025-01-30T13:05:53.899835858Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Jan 30 13:05:53.899878 containerd[1543]: time="2025-01-30T13:05:53.899870658Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Jan 30 13:05:53.899909 containerd[1543]: time="2025-01-30T13:05:53.899903350Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Jan 30 13:05:53.899942 containerd[1543]: time="2025-01-30T13:05:53.899935485Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900331449Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900344967Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900353173Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900359152Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900370528Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900378791Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900385682Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900392888Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900399349Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900406162Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900412433Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900419318Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900425969Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900568 containerd[1543]: time="2025-01-30T13:05:53.900433985Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900782 containerd[1543]: time="2025-01-30T13:05:53.900440063Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900782 containerd[1543]: time="2025-01-30T13:05:53.900446092Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900782 containerd[1543]: time="2025-01-30T13:05:53.900452564Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900782 containerd[1543]: time="2025-01-30T13:05:53.900460547Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Jan 30 13:05:53.900782 containerd[1543]: time="2025-01-30T13:05:53.900474134Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900782 containerd[1543]: time="2025-01-30T13:05:53.900481692Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.900782 containerd[1543]: time="2025-01-30T13:05:53.900491097Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Jan 30 13:05:53.902629 containerd[1543]: time="2025-01-30T13:05:53.901383430Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Jan 30 13:05:53.902629 containerd[1543]: time="2025-01-30T13:05:53.901401219Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Jan 30 13:05:53.902629 containerd[1543]: time="2025-01-30T13:05:53.901408379Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Jan 30 13:05:53.902629 containerd[1543]: time="2025-01-30T13:05:53.901426230Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Jan 30 13:05:53.902629 containerd[1543]: time="2025-01-30T13:05:53.901471478Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.902629 containerd[1543]: time="2025-01-30T13:05:53.901479881Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Jan 30 13:05:53.902629 containerd[1543]: time="2025-01-30T13:05:53.901491557Z" level=info msg="NRI interface is disabled by configuration."
Jan 30 13:05:53.902629 containerd[1543]: time="2025-01-30T13:05:53.901497888Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Jan 30 13:05:53.902456 systemd[1]: Started containerd.service - containerd container runtime.
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.901669260Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.901713894Z" level=info msg="Connect containerd service"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.901729210Z" level=info msg="using legacy CRI server"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.901733171Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.901792398Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902099153Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902250698Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902274436Z" level=info msg=serving... address=/run/containerd/containerd.sock
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902300525Z" level=info msg="Start subscribing containerd event"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902321266Z" level=info msg="Start recovering state"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902350641Z" level=info msg="Start event monitor"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902360474Z" level=info msg="Start snapshots syncer"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902365091Z" level=info msg="Start cni network conf syncer for default"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902368912Z" level=info msg="Start streaming server"
Jan 30 13:05:53.902791 containerd[1543]: time="2025-01-30T13:05:53.902404172Z" level=info msg="containerd successfully booted in 0.030631s"
Jan 30 13:05:54.005651 tar[1528]: linux-amd64/LICENSE
Jan 30 13:05:54.005651 tar[1528]: linux-amd64/README.md
Jan 30 13:05:54.017438 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Jan 30 13:05:55.289824 systemd-networkd[1451]: ens192: Gained IPv6LL
Jan 30 13:05:55.290256 systemd-timesyncd[1434]: Network configuration changed, trying to establish connection.
Jan 30 13:05:55.291527 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Jan 30 13:05:55.292427 systemd[1]: Reached target network-online.target - Network is Online.
Jan 30 13:05:55.298811 systemd[1]: Starting coreos-metadata.service - VMware metadata agent...
Jan 30 13:05:55.301776 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 30 13:05:55.305710 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Jan 30 13:05:55.332682 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Jan 30 13:05:55.340004 systemd[1]: coreos-metadata.service: Deactivated successfully.
Jan 30 13:05:55.340143 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Jan 30 13:05:55.341228 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 30 13:05:56.250165 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 13:05:56.250518 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 30 13:05:56.252756 systemd[1]: Startup finished in 990ms (kernel) + 5.529s (initrd) + 5.127s (userspace) = 11.647s. Jan 30 13:05:56.255750 (kubelet)[1693]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 13:05:56.258604 agetty[1615]: failed to open credentials directory Jan 30 13:05:56.258632 agetty[1619]: failed to open credentials directory Jan 30 13:05:56.282124 login[1615]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 30 13:05:56.283651 login[1619]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 30 13:05:56.290221 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 30 13:05:56.294877 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 30 13:05:56.298083 systemd-logind[1521]: New session 2 of user core. Jan 30 13:05:56.300728 systemd-logind[1521]: New session 1 of user core. Jan 30 13:05:56.304443 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 30 13:05:56.310764 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 30 13:05:56.312437 (systemd)[1700]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 30 13:05:56.376158 systemd[1700]: Queued start job for default target default.target. Jan 30 13:05:56.379480 systemd[1700]: Created slice app.slice - User Application Slice. 
Jan 30 13:05:56.379500 systemd[1700]: Reached target paths.target - Paths. Jan 30 13:05:56.379510 systemd[1700]: Reached target timers.target - Timers. Jan 30 13:05:56.381701 systemd[1700]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 30 13:05:56.387688 systemd[1700]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 30 13:05:56.387722 systemd[1700]: Reached target sockets.target - Sockets. Jan 30 13:05:56.387732 systemd[1700]: Reached target basic.target - Basic System. Jan 30 13:05:56.387757 systemd[1700]: Reached target default.target - Main User Target. Jan 30 13:05:56.387775 systemd[1700]: Startup finished in 71ms. Jan 30 13:05:56.387793 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 30 13:05:56.388870 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 30 13:05:56.389903 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 30 13:05:56.923788 kubelet[1693]: E0130 13:05:56.923748 1693 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 13:05:56.925405 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 13:05:56.925501 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 13:06:07.175886 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 30 13:06:07.184837 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 13:06:07.247787 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 30 13:06:07.250211 (kubelet)[1745]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 13:06:07.307379 kubelet[1745]: E0130 13:06:07.307341 1745 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 13:06:07.309655 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 13:06:07.309800 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 13:06:17.433979 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 30 13:06:17.449831 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 13:06:17.683766 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 13:06:17.686510 (kubelet)[1759]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 13:06:17.738765 kubelet[1759]: E0130 13:06:17.738712 1759 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 13:06:17.740117 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 13:06:17.740214 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 13:07:38.459896 systemd-resolved[1412]: Clock change detected. Flushing caches. Jan 30 13:07:38.459918 systemd-timesyncd[1434]: Contacted time server 104.234.61.117:123 (2.flatcar.pool.ntp.org). 
Jan 30 13:07:38.459951 systemd-timesyncd[1434]: Initial clock synchronization to Thu 2025-01-30 13:07:38.459815 UTC. Jan 30 13:07:40.887204 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 30 13:07:40.897907 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 13:07:41.137318 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 13:07:41.140463 (kubelet)[1774]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 13:07:41.162605 kubelet[1774]: E0130 13:07:41.162571 1774 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 13:07:41.163551 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 13:07:41.163625 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 13:07:46.743897 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 30 13:07:46.744608 systemd[1]: Started sshd@0-139.178.70.106:22-139.178.89.65:41822.service - OpenSSH per-connection server daemon (139.178.89.65:41822). Jan 30 13:07:46.784601 sshd[1782]: Accepted publickey for core from 139.178.89.65 port 41822 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk Jan 30 13:07:46.785413 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 13:07:46.789028 systemd-logind[1521]: New session 3 of user core. Jan 30 13:07:46.795801 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 30 13:07:46.850873 systemd[1]: Started sshd@1-139.178.70.106:22-139.178.89.65:41828.service - OpenSSH per-connection server daemon (139.178.89.65:41828). 
Jan 30 13:07:46.884282 sshd[1787]: Accepted publickey for core from 139.178.89.65 port 41828 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk Jan 30 13:07:46.885215 sshd-session[1787]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 13:07:46.888120 systemd-logind[1521]: New session 4 of user core. Jan 30 13:07:46.898826 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 30 13:07:46.946614 sshd[1789]: Connection closed by 139.178.89.65 port 41828 Jan 30 13:07:46.947009 sshd-session[1787]: pam_unix(sshd:session): session closed for user core Jan 30 13:07:46.959102 systemd[1]: sshd@1-139.178.70.106:22-139.178.89.65:41828.service: Deactivated successfully. Jan 30 13:07:46.959855 systemd[1]: session-4.scope: Deactivated successfully. Jan 30 13:07:46.960581 systemd-logind[1521]: Session 4 logged out. Waiting for processes to exit. Jan 30 13:07:46.961262 systemd[1]: Started sshd@2-139.178.70.106:22-139.178.89.65:41836.service - OpenSSH per-connection server daemon (139.178.89.65:41836). Jan 30 13:07:46.962843 systemd-logind[1521]: Removed session 4. Jan 30 13:07:46.999891 sshd[1794]: Accepted publickey for core from 139.178.89.65 port 41836 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk Jan 30 13:07:47.000651 sshd-session[1794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 13:07:47.003309 systemd-logind[1521]: New session 5 of user core. Jan 30 13:07:47.011808 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 30 13:07:47.058192 sshd[1796]: Connection closed by 139.178.89.65 port 41836 Jan 30 13:07:47.057752 sshd-session[1794]: pam_unix(sshd:session): session closed for user core Jan 30 13:07:47.070164 systemd[1]: sshd@2-139.178.70.106:22-139.178.89.65:41836.service: Deactivated successfully. Jan 30 13:07:47.070964 systemd[1]: session-5.scope: Deactivated successfully. Jan 30 13:07:47.071330 systemd-logind[1521]: Session 5 logged out. 
Waiting for processes to exit. Jan 30 13:07:47.072402 systemd[1]: Started sshd@3-139.178.70.106:22-139.178.89.65:41840.service - OpenSSH per-connection server daemon (139.178.89.65:41840). Jan 30 13:07:47.074013 systemd-logind[1521]: Removed session 5. Jan 30 13:07:47.102358 sshd[1801]: Accepted publickey for core from 139.178.89.65 port 41840 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk Jan 30 13:07:47.103107 sshd-session[1801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 13:07:47.105666 systemd-logind[1521]: New session 6 of user core. Jan 30 13:07:47.112759 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 30 13:07:47.161144 sshd[1803]: Connection closed by 139.178.89.65 port 41840 Jan 30 13:07:47.161542 sshd-session[1801]: pam_unix(sshd:session): session closed for user core Jan 30 13:07:47.170800 systemd[1]: sshd@3-139.178.70.106:22-139.178.89.65:41840.service: Deactivated successfully. Jan 30 13:07:47.171743 systemd[1]: session-6.scope: Deactivated successfully. Jan 30 13:07:47.172447 systemd-logind[1521]: Session 6 logged out. Waiting for processes to exit. Jan 30 13:07:47.176973 systemd[1]: Started sshd@4-139.178.70.106:22-139.178.89.65:41850.service - OpenSSH per-connection server daemon (139.178.89.65:41850). Jan 30 13:07:47.177789 systemd-logind[1521]: Removed session 6. Jan 30 13:07:47.204229 sshd[1808]: Accepted publickey for core from 139.178.89.65 port 41850 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk Jan 30 13:07:47.205002 sshd-session[1808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 13:07:47.207422 systemd-logind[1521]: New session 7 of user core. Jan 30 13:07:47.215786 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 30 13:07:47.270695 sudo[1811]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 30 13:07:47.270870 sudo[1811]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 30 13:07:47.280303 sudo[1811]: pam_unix(sudo:session): session closed for user root Jan 30 13:07:47.281367 sshd[1810]: Connection closed by 139.178.89.65 port 41850 Jan 30 13:07:47.281277 sshd-session[1808]: pam_unix(sshd:session): session closed for user core Jan 30 13:07:47.286081 systemd[1]: sshd@4-139.178.70.106:22-139.178.89.65:41850.service: Deactivated successfully. Jan 30 13:07:47.286808 systemd[1]: session-7.scope: Deactivated successfully. Jan 30 13:07:47.287526 systemd-logind[1521]: Session 7 logged out. Waiting for processes to exit. Jan 30 13:07:47.288273 systemd[1]: Started sshd@5-139.178.70.106:22-139.178.89.65:41862.service - OpenSSH per-connection server daemon (139.178.89.65:41862). Jan 30 13:07:47.289942 systemd-logind[1521]: Removed session 7. Jan 30 13:07:47.318194 sshd[1816]: Accepted publickey for core from 139.178.89.65 port 41862 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk Jan 30 13:07:47.319105 sshd-session[1816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 13:07:47.321393 systemd-logind[1521]: New session 8 of user core. Jan 30 13:07:47.329784 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 30 13:07:47.379668 sudo[1820]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 30 13:07:47.379836 sudo[1820]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 30 13:07:47.381947 sudo[1820]: pam_unix(sudo:session): session closed for user root Jan 30 13:07:47.384746 sudo[1819]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 30 13:07:47.385015 sudo[1819]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 30 13:07:47.397869 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 30 13:07:47.413357 augenrules[1842]: No rules Jan 30 13:07:47.414067 systemd[1]: audit-rules.service: Deactivated successfully. Jan 30 13:07:47.414263 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 30 13:07:47.415168 sudo[1819]: pam_unix(sudo:session): session closed for user root Jan 30 13:07:47.416096 sshd[1818]: Connection closed by 139.178.89.65 port 41862 Jan 30 13:07:47.416318 sshd-session[1816]: pam_unix(sshd:session): session closed for user core Jan 30 13:07:47.421079 systemd[1]: sshd@5-139.178.70.106:22-139.178.89.65:41862.service: Deactivated successfully. Jan 30 13:07:47.422020 systemd[1]: session-8.scope: Deactivated successfully. Jan 30 13:07:47.422851 systemd-logind[1521]: Session 8 logged out. Waiting for processes to exit. Jan 30 13:07:47.423957 systemd[1]: Started sshd@6-139.178.70.106:22-139.178.89.65:41872.service - OpenSSH per-connection server daemon (139.178.89.65:41872). Jan 30 13:07:47.424969 systemd-logind[1521]: Removed session 8. 
Jan 30 13:07:47.462793 sshd[1850]: Accepted publickey for core from 139.178.89.65 port 41872 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk Jan 30 13:07:47.463617 sshd-session[1850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 13:07:47.466714 systemd-logind[1521]: New session 9 of user core. Jan 30 13:07:47.473785 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 30 13:07:47.522349 sudo[1853]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 30 13:07:47.522505 sudo[1853]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 30 13:07:47.878845 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 30 13:07:47.878911 (dockerd)[1871]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 30 13:07:48.272277 dockerd[1871]: time="2025-01-30T13:07:48.272245660Z" level=info msg="Starting up" Jan 30 13:07:48.395317 systemd[1]: var-lib-docker-metacopy\x2dcheck2200332845-merged.mount: Deactivated successfully. Jan 30 13:07:48.411955 dockerd[1871]: time="2025-01-30T13:07:48.411805812Z" level=info msg="Loading containers: start." Jan 30 13:07:48.555695 kernel: Initializing XFRM netlink socket Jan 30 13:07:48.690196 systemd-networkd[1451]: docker0: Link UP Jan 30 13:07:48.717669 dockerd[1871]: time="2025-01-30T13:07:48.717636881Z" level=info msg="Loading containers: done." Jan 30 13:07:48.728787 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck951681812-merged.mount: Deactivated successfully. 
Jan 30 13:07:48.730020 dockerd[1871]: time="2025-01-30T13:07:48.729995493Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 30 13:07:48.730124 dockerd[1871]: time="2025-01-30T13:07:48.730063609Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1 Jan 30 13:07:48.730151 dockerd[1871]: time="2025-01-30T13:07:48.730124992Z" level=info msg="Daemon has completed initialization" Jan 30 13:07:48.747192 dockerd[1871]: time="2025-01-30T13:07:48.747079867Z" level=info msg="API listen on /run/docker.sock" Jan 30 13:07:48.747436 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 30 13:07:49.642705 containerd[1543]: time="2025-01-30T13:07:49.642376728Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.5\"" Jan 30 13:07:50.685477 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount813540916.mount: Deactivated successfully. Jan 30 13:07:51.387043 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 30 13:07:51.396830 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 13:07:51.562888 update_engine[1522]: I20250130 13:07:51.562839 1522 update_attempter.cc:509] Updating boot flags... Jan 30 13:07:51.756423 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (2077) Jan 30 13:07:51.867704 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (2076) Jan 30 13:07:52.033515 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (2076) Jan 30 13:07:52.253208 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 30 13:07:52.257299 (kubelet)[2094]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 13:07:52.298325 kubelet[2094]: E0130 13:07:52.298288 2094 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 13:07:52.299699 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 13:07:52.299786 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 13:07:53.676438 containerd[1543]: time="2025-01-30T13:07:53.675709335Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:53.677092 containerd[1543]: time="2025-01-30T13:07:53.677067848Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.5: active requests=0, bytes read=27976721" Jan 30 13:07:53.677648 containerd[1543]: time="2025-01-30T13:07:53.677632191Z" level=info msg="ImageCreate event name:\"sha256:2212e74642e45d72a36f297bea139f607ce4ccc4792966a8e9c4d30e04a4a6fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:53.679691 containerd[1543]: time="2025-01-30T13:07:53.679659790Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:fc4b366c0036b90d147f3b58244cf7d5f1f42b0db539f0fe83a8fc6e25a434ab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:53.680636 containerd[1543]: time="2025-01-30T13:07:53.680617353Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.5\" with image id \"sha256:2212e74642e45d72a36f297bea139f607ce4ccc4792966a8e9c4d30e04a4a6fb\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.5\", repo 
digest \"registry.k8s.io/kube-apiserver@sha256:fc4b366c0036b90d147f3b58244cf7d5f1f42b0db539f0fe83a8fc6e25a434ab\", size \"27973521\" in 4.038206282s" Jan 30 13:07:53.680734 containerd[1543]: time="2025-01-30T13:07:53.680716453Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.5\" returns image reference \"sha256:2212e74642e45d72a36f297bea139f607ce4ccc4792966a8e9c4d30e04a4a6fb\"" Jan 30 13:07:53.682564 containerd[1543]: time="2025-01-30T13:07:53.682537304Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.5\"" Jan 30 13:07:56.413301 containerd[1543]: time="2025-01-30T13:07:56.413259627Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:56.414080 containerd[1543]: time="2025-01-30T13:07:56.414050646Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.5: active requests=0, bytes read=24701143" Jan 30 13:07:56.414695 containerd[1543]: time="2025-01-30T13:07:56.414380828Z" level=info msg="ImageCreate event name:\"sha256:d7fccb640e0edce9c47bd71f2b2ce328b824bea199bfe5838dda3fe2af6372f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:56.416790 containerd[1543]: time="2025-01-30T13:07:56.416765274Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:848cf42bf6c3c5ccac232b76c901c309edb3ebeac4d856885af0fc718798207e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:56.417939 containerd[1543]: time="2025-01-30T13:07:56.417909770Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.5\" with image id \"sha256:d7fccb640e0edce9c47bd71f2b2ce328b824bea199bfe5838dda3fe2af6372f2\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:848cf42bf6c3c5ccac232b76c901c309edb3ebeac4d856885af0fc718798207e\", size \"26147725\" 
in 2.735325801s" Jan 30 13:07:56.417992 containerd[1543]: time="2025-01-30T13:07:56.417940477Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.5\" returns image reference \"sha256:d7fccb640e0edce9c47bd71f2b2ce328b824bea199bfe5838dda3fe2af6372f2\"" Jan 30 13:07:56.418544 containerd[1543]: time="2025-01-30T13:07:56.418501024Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.5\"" Jan 30 13:07:58.032587 containerd[1543]: time="2025-01-30T13:07:58.032547262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:58.033938 containerd[1543]: time="2025-01-30T13:07:58.033880256Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.5: active requests=0, bytes read=18652053" Jan 30 13:07:58.034721 containerd[1543]: time="2025-01-30T13:07:58.034316580Z" level=info msg="ImageCreate event name:\"sha256:4b2fb209f5d1efc0fc980c5acda28886e4eb6ab4820173976bdd441cbd2ee09a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:58.035747 containerd[1543]: time="2025-01-30T13:07:58.035732924Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:0e01fd956ba32a7fa08f6b6da24e8c49015905c8e2cf752978d495e44cd4a8a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:58.036458 containerd[1543]: time="2025-01-30T13:07:58.036444399Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.5\" with image id \"sha256:4b2fb209f5d1efc0fc980c5acda28886e4eb6ab4820173976bdd441cbd2ee09a\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:0e01fd956ba32a7fa08f6b6da24e8c49015905c8e2cf752978d495e44cd4a8a9\", size \"20098653\" in 1.617918355s" Jan 30 13:07:58.036510 containerd[1543]: time="2025-01-30T13:07:58.036501653Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.5\" returns image reference 
\"sha256:4b2fb209f5d1efc0fc980c5acda28886e4eb6ab4820173976bdd441cbd2ee09a\"" Jan 30 13:07:58.037082 containerd[1543]: time="2025-01-30T13:07:58.037056033Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.5\"" Jan 30 13:07:59.016282 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2351268949.mount: Deactivated successfully. Jan 30 13:07:59.350187 containerd[1543]: time="2025-01-30T13:07:59.350105225Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:59.352341 containerd[1543]: time="2025-01-30T13:07:59.352313665Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.5: active requests=0, bytes read=30231128" Jan 30 13:07:59.357556 containerd[1543]: time="2025-01-30T13:07:59.357536355Z" level=info msg="ImageCreate event name:\"sha256:34018aef09a62f8b40bdd1d2e1bf6c48f359cab492d51059a09e20745ab02ce2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:59.367231 containerd[1543]: time="2025-01-30T13:07:59.367207535Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c00685cc45c1fb539c5bbd8d24d2577f96e9399efac1670f688f654b30f8c64c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:07:59.367639 containerd[1543]: time="2025-01-30T13:07:59.367563126Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.5\" with image id \"sha256:34018aef09a62f8b40bdd1d2e1bf6c48f359cab492d51059a09e20745ab02ce2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:c00685cc45c1fb539c5bbd8d24d2577f96e9399efac1670f688f654b30f8c64c\", size \"30230147\" in 1.330431617s" Jan 30 13:07:59.367639 containerd[1543]: time="2025-01-30T13:07:59.367582115Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.5\" returns image reference \"sha256:34018aef09a62f8b40bdd1d2e1bf6c48f359cab492d51059a09e20745ab02ce2\"" Jan 30 13:07:59.368146 
containerd[1543]: time="2025-01-30T13:07:59.367938933Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Jan 30 13:08:00.399611 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount491973197.mount: Deactivated successfully. Jan 30 13:08:01.728703 containerd[1543]: time="2025-01-30T13:08:01.728235311Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:01.732811 containerd[1543]: time="2025-01-30T13:08:01.732780353Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" Jan 30 13:08:01.739447 containerd[1543]: time="2025-01-30T13:08:01.739422491Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:01.748078 containerd[1543]: time="2025-01-30T13:08:01.748065775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:01.748781 containerd[1543]: time="2025-01-30T13:08:01.748648478Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.380694772s" Jan 30 13:08:01.748781 containerd[1543]: time="2025-01-30T13:08:01.748665092Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Jan 30 13:08:01.749238 containerd[1543]: time="2025-01-30T13:08:01.749126254Z" 
level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 30 13:08:02.386995 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 30 13:08:02.394807 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 13:08:02.673914 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 13:08:02.676451 (kubelet)[2218]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 13:08:02.703164 kubelet[2218]: E0130 13:08:02.702981 2218 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 13:08:02.704433 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 13:08:02.704569 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 13:08:03.335789 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3448031800.mount: Deactivated successfully. 
Jan 30 13:08:03.337332 containerd[1543]: time="2025-01-30T13:08:03.337307990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:03.338016 containerd[1543]: time="2025-01-30T13:08:03.337989234Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Jan 30 13:08:03.338369 containerd[1543]: time="2025-01-30T13:08:03.338349413Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:03.339433 containerd[1543]: time="2025-01-30T13:08:03.339410409Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:03.340251 containerd[1543]: time="2025-01-30T13:08:03.339853212Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.590677257s" Jan 30 13:08:03.340251 containerd[1543]: time="2025-01-30T13:08:03.339870438Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 30 13:08:03.340251 containerd[1543]: time="2025-01-30T13:08:03.340127398Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Jan 30 13:08:03.776270 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount451438884.mount: Deactivated successfully. 
Jan 30 13:08:06.291395 containerd[1543]: time="2025-01-30T13:08:06.291010168Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:06.305616 containerd[1543]: time="2025-01-30T13:08:06.305425049Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56779973" Jan 30 13:08:06.309120 containerd[1543]: time="2025-01-30T13:08:06.309073899Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:06.311647 containerd[1543]: time="2025-01-30T13:08:06.311611565Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:06.312568 containerd[1543]: time="2025-01-30T13:08:06.312448935Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.972305645s" Jan 30 13:08:06.312568 containerd[1543]: time="2025-01-30T13:08:06.312481722Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Jan 30 13:08:08.225231 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 13:08:08.232907 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 13:08:08.261510 systemd[1]: Reloading requested from client PID 2311 ('systemctl') (unit session-9.scope)... Jan 30 13:08:08.261525 systemd[1]: Reloading... 
Jan 30 13:08:08.319703 zram_generator::config[2348]: No configuration found. Jan 30 13:08:08.387700 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 30 13:08:08.403898 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 30 13:08:08.451910 systemd[1]: Reloading finished in 189 ms. Jan 30 13:08:08.488979 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 30 13:08:08.489029 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 30 13:08:08.489175 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 13:08:08.494850 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 13:08:08.695066 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 13:08:08.701924 (kubelet)[2416]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 30 13:08:08.736526 kubelet[2416]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 13:08:08.736901 kubelet[2416]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 30 13:08:08.736901 kubelet[2416]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 30 13:08:08.740160 kubelet[2416]: I0130 13:08:08.740088 2416 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 30 13:08:08.899259 kubelet[2416]: I0130 13:08:08.898975 2416 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Jan 30 13:08:08.899259 kubelet[2416]: I0130 13:08:08.898998 2416 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 30 13:08:08.899259 kubelet[2416]: I0130 13:08:08.899155 2416 server.go:929] "Client rotation is on, will bootstrap in background" Jan 30 13:08:09.036749 kubelet[2416]: I0130 13:08:09.036670 2416 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 30 13:08:09.039047 kubelet[2416]: E0130 13:08:09.039018 2416 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.106:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 13:08:09.050215 kubelet[2416]: E0130 13:08:09.050189 2416 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Jan 30 13:08:09.050444 kubelet[2416]: I0130 13:08:09.050342 2416 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Jan 30 13:08:09.057705 kubelet[2416]: I0130 13:08:09.057691 2416 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 30 13:08:09.065603 kubelet[2416]: I0130 13:08:09.065168 2416 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 30 13:08:09.065603 kubelet[2416]: I0130 13:08:09.065288 2416 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 30 13:08:09.065603 kubelet[2416]: I0130 13:08:09.065311 2416 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyO
ptions":null,"CgroupVersion":2} Jan 30 13:08:09.065603 kubelet[2416]: I0130 13:08:09.065419 2416 topology_manager.go:138] "Creating topology manager with none policy" Jan 30 13:08:09.065828 kubelet[2416]: I0130 13:08:09.065426 2416 container_manager_linux.go:300] "Creating device plugin manager" Jan 30 13:08:09.065828 kubelet[2416]: I0130 13:08:09.065489 2416 state_mem.go:36] "Initialized new in-memory state store" Jan 30 13:08:09.078618 kubelet[2416]: I0130 13:08:09.078542 2416 kubelet.go:408] "Attempting to sync node with API server" Jan 30 13:08:09.078618 kubelet[2416]: I0130 13:08:09.078564 2416 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 30 13:08:09.086064 kubelet[2416]: I0130 13:08:09.085946 2416 kubelet.go:314] "Adding apiserver pod source" Jan 30 13:08:09.087866 kubelet[2416]: I0130 13:08:09.087798 2416 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 30 13:08:09.089861 kubelet[2416]: W0130 13:08:09.089823 2416 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 30 13:08:09.089909 kubelet[2416]: E0130 13:08:09.089868 2416 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 13:08:09.108549 kubelet[2416]: W0130 13:08:09.108493 2416 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 30 
13:08:09.108549 kubelet[2416]: E0130 13:08:09.108530 2416 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 13:08:09.111680 kubelet[2416]: I0130 13:08:09.111637 2416 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 30 13:08:09.116371 kubelet[2416]: I0130 13:08:09.116294 2416 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 30 13:08:09.119004 kubelet[2416]: W0130 13:08:09.118983 2416 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 30 13:08:09.119392 kubelet[2416]: I0130 13:08:09.119375 2416 server.go:1269] "Started kubelet" Jan 30 13:08:09.128770 kubelet[2416]: I0130 13:08:09.128741 2416 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 30 13:08:09.142877 kubelet[2416]: I0130 13:08:09.142858 2416 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 13:08:09.143690 kubelet[2416]: I0130 13:08:09.143605 2416 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 30 13:08:09.145449 kubelet[2416]: I0130 13:08:09.145432 2416 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 30 13:08:09.145612 kubelet[2416]: E0130 13:08:09.145598 2416 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 30 13:08:09.146057 kubelet[2416]: I0130 13:08:09.146047 2416 server.go:460] "Adding debug handlers to kubelet server" Jan 30 13:08:09.150158 kubelet[2416]: I0130 13:08:09.150123 2416 
ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 13:08:09.150646 kubelet[2416]: I0130 13:08:09.150255 2416 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 13:08:09.150709 kubelet[2416]: I0130 13:08:09.150703 2416 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 30 13:08:09.150765 kubelet[2416]: I0130 13:08:09.150760 2416 reconciler.go:26] "Reconciler: start to sync state" Jan 30 13:08:09.156251 kubelet[2416]: E0130 13:08:09.156230 2416 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="200ms" Jan 30 13:08:09.164418 kubelet[2416]: E0130 13:08:09.151187 2416 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.106:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.106:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181f7a55917341c3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-30 13:08:09.119359427 +0000 UTC m=+0.415451054,LastTimestamp:2025-01-30 13:08:09.119359427 +0000 UTC m=+0.415451054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 30 13:08:09.168120 kubelet[2416]: I0130 13:08:09.167829 2416 factory.go:221] Registration of the systemd container factory successfully Jan 30 13:08:09.168120 kubelet[2416]: I0130 13:08:09.167903 2416 factory.go:219] Registration of 
the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 30 13:08:09.168401 kubelet[2416]: W0130 13:08:09.168368 2416 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 30 13:08:09.168432 kubelet[2416]: E0130 13:08:09.168408 2416 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 13:08:09.172643 kubelet[2416]: I0130 13:08:09.172524 2416 factory.go:221] Registration of the containerd container factory successfully Jan 30 13:08:09.179981 kubelet[2416]: I0130 13:08:09.179955 2416 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 30 13:08:09.181275 kubelet[2416]: I0130 13:08:09.181028 2416 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 30 13:08:09.181275 kubelet[2416]: I0130 13:08:09.181046 2416 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 30 13:08:09.181275 kubelet[2416]: I0130 13:08:09.181059 2416 kubelet.go:2321] "Starting kubelet main sync loop" Jan 30 13:08:09.181275 kubelet[2416]: E0130 13:08:09.181084 2416 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 30 13:08:09.184546 kubelet[2416]: W0130 13:08:09.184532 2416 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 30 13:08:09.184708 kubelet[2416]: E0130 13:08:09.184612 2416 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 13:08:09.189543 kubelet[2416]: I0130 13:08:09.189416 2416 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 30 13:08:09.189543 kubelet[2416]: I0130 13:08:09.189424 2416 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 30 13:08:09.189543 kubelet[2416]: I0130 13:08:09.189433 2416 state_mem.go:36] "Initialized new in-memory state store" Jan 30 13:08:09.205009 kubelet[2416]: I0130 13:08:09.204996 2416 policy_none.go:49] "None policy: Start" Jan 30 13:08:09.205410 kubelet[2416]: I0130 13:08:09.205397 2416 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 30 13:08:09.205448 kubelet[2416]: I0130 13:08:09.205412 2416 state_mem.go:35] "Initializing new in-memory state store" Jan 30 13:08:09.232836 systemd[1]: Created slice kubepods.slice - 
libcontainer container kubepods.slice. Jan 30 13:08:09.241261 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 30 13:08:09.245010 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 30 13:08:09.245669 kubelet[2416]: E0130 13:08:09.245643 2416 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 30 13:08:09.253620 kubelet[2416]: I0130 13:08:09.253479 2416 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 30 13:08:09.253620 kubelet[2416]: I0130 13:08:09.253606 2416 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 30 13:08:09.253738 kubelet[2416]: I0130 13:08:09.253614 2416 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 30 13:08:09.253976 kubelet[2416]: I0130 13:08:09.253962 2416 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 30 13:08:09.254830 kubelet[2416]: E0130 13:08:09.254802 2416 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jan 30 13:08:09.293250 systemd[1]: Created slice kubepods-burstable-podfa5289f3c0ba7f1736282e713231ffc5.slice - libcontainer container kubepods-burstable-podfa5289f3c0ba7f1736282e713231ffc5.slice. Jan 30 13:08:09.315943 systemd[1]: Created slice kubepods-burstable-podc988230cd0d49eebfaffbefbe8c74a10.slice - libcontainer container kubepods-burstable-podc988230cd0d49eebfaffbefbe8c74a10.slice. Jan 30 13:08:09.323440 systemd[1]: Created slice kubepods-burstable-pod9f56f6f3e316b5edf7ae526d1eecf4c7.slice - libcontainer container kubepods-burstable-pod9f56f6f3e316b5edf7ae526d1eecf4c7.slice. 
Jan 30 13:08:09.354721 kubelet[2416]: I0130 13:08:09.354672 2416 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 30 13:08:09.354955 kubelet[2416]: E0130 13:08:09.354924 2416 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Jan 30 13:08:09.357110 kubelet[2416]: E0130 13:08:09.357093 2416 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="400ms" Jan 30 13:08:09.452735 kubelet[2416]: I0130 13:08:09.452643 2416 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost" Jan 30 13:08:09.452735 kubelet[2416]: I0130 13:08:09.452682 2416 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost" Jan 30 13:08:09.452735 kubelet[2416]: I0130 13:08:09.452695 2416 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost" Jan 30 13:08:09.452735 kubelet[2416]: I0130 
13:08:09.452705 2416 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9f56f6f3e316b5edf7ae526d1eecf4c7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"9f56f6f3e316b5edf7ae526d1eecf4c7\") " pod="kube-system/kube-apiserver-localhost" Jan 30 13:08:09.452735 kubelet[2416]: I0130 13:08:09.452717 2416 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost" Jan 30 13:08:09.452896 kubelet[2416]: I0130 13:08:09.452726 2416 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost" Jan 30 13:08:09.452896 kubelet[2416]: I0130 13:08:09.452735 2416 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c988230cd0d49eebfaffbefbe8c74a10-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"c988230cd0d49eebfaffbefbe8c74a10\") " pod="kube-system/kube-scheduler-localhost" Jan 30 13:08:09.452896 kubelet[2416]: I0130 13:08:09.452744 2416 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9f56f6f3e316b5edf7ae526d1eecf4c7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"9f56f6f3e316b5edf7ae526d1eecf4c7\") " pod="kube-system/kube-apiserver-localhost" Jan 30 13:08:09.452896 kubelet[2416]: I0130 13:08:09.452753 2416 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9f56f6f3e316b5edf7ae526d1eecf4c7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"9f56f6f3e316b5edf7ae526d1eecf4c7\") " pod="kube-system/kube-apiserver-localhost" Jan 30 13:08:09.556795 kubelet[2416]: I0130 13:08:09.556657 2416 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 30 13:08:09.557025 kubelet[2416]: E0130 13:08:09.557006 2416 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Jan 30 13:08:09.614791 containerd[1543]: time="2025-01-30T13:08:09.614698217Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fa5289f3c0ba7f1736282e713231ffc5,Namespace:kube-system,Attempt:0,}" Jan 30 13:08:09.622469 containerd[1543]: time="2025-01-30T13:08:09.622261664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:c988230cd0d49eebfaffbefbe8c74a10,Namespace:kube-system,Attempt:0,}" Jan 30 13:08:09.624930 containerd[1543]: time="2025-01-30T13:08:09.624912909Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:9f56f6f3e316b5edf7ae526d1eecf4c7,Namespace:kube-system,Attempt:0,}" Jan 30 13:08:09.709515 kubelet[2416]: E0130 13:08:09.709425 2416 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.106:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.106:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181f7a55917341c3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-30 13:08:09.119359427 +0000 UTC m=+0.415451054,LastTimestamp:2025-01-30 13:08:09.119359427 +0000 UTC m=+0.415451054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 30 13:08:09.757606 kubelet[2416]: E0130 13:08:09.757566 2416 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="800ms" Jan 30 13:08:09.948594 kubelet[2416]: W0130 13:08:09.948553 2416 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 30 13:08:09.948731 kubelet[2416]: E0130 13:08:09.948607 2416 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 13:08:09.958616 kubelet[2416]: I0130 13:08:09.958598 2416 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jan 30 13:08:09.958828 kubelet[2416]: E0130 13:08:09.958813 2416 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Jan 30 
13:08:09.986181 kubelet[2416]: W0130 13:08:09.986133 2416 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 30 13:08:09.986181 kubelet[2416]: E0130 13:08:09.986184 2416 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 13:08:10.082398 kubelet[2416]: W0130 13:08:10.082354 2416 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Jan 30 13:08:10.082398 kubelet[2416]: E0130 13:08:10.082401 2416 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 13:08:10.291834 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4029781322.mount: Deactivated successfully. 
Jan 30 13:08:10.294712 containerd[1543]: time="2025-01-30T13:08:10.294647208Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 13:08:10.295858 containerd[1543]: time="2025-01-30T13:08:10.295768509Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 30 13:08:10.295858 containerd[1543]: time="2025-01-30T13:08:10.295835483Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 13:08:10.296668 containerd[1543]: time="2025-01-30T13:08:10.296631994Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Jan 30 13:08:10.297454 containerd[1543]: time="2025-01-30T13:08:10.297373001Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 13:08:10.301104 containerd[1543]: time="2025-01-30T13:08:10.301051222Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 30 13:08:10.301104 containerd[1543]: time="2025-01-30T13:08:10.301059372Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 13:08:10.304512 containerd[1543]: time="2025-01-30T13:08:10.304463404Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 13:08:10.305257 
containerd[1543]: time="2025-01-30T13:08:10.305018121Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 680.063888ms"
Jan 30 13:08:10.306569 containerd[1543]: time="2025-01-30T13:08:10.306465675Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 689.756717ms"
Jan 30 13:08:10.309162 containerd[1543]: time="2025-01-30T13:08:10.309120555Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 686.800257ms"
Jan 30 13:08:10.462319 containerd[1543]: time="2025-01-30T13:08:10.462183989Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 30 13:08:10.462319 containerd[1543]: time="2025-01-30T13:08:10.462256269Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 30 13:08:10.462319 containerd[1543]: time="2025-01-30T13:08:10.462277421Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:10.462441 containerd[1543]: time="2025-01-30T13:08:10.462361725Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:10.464452 containerd[1543]: time="2025-01-30T13:08:10.464094659Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 30 13:08:10.464452 containerd[1543]: time="2025-01-30T13:08:10.464129938Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 30 13:08:10.464452 containerd[1543]: time="2025-01-30T13:08:10.464149838Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:10.464452 containerd[1543]: time="2025-01-30T13:08:10.464199205Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:10.465031 containerd[1543]: time="2025-01-30T13:08:10.458868925Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 30 13:08:10.465031 containerd[1543]: time="2025-01-30T13:08:10.464760278Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 30 13:08:10.465031 containerd[1543]: time="2025-01-30T13:08:10.464789355Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:10.465031 containerd[1543]: time="2025-01-30T13:08:10.464850473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:10.473551 kubelet[2416]: W0130 13:08:10.473513 2416 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Jan 30 13:08:10.473551 kubelet[2416]: E0130 13:08:10.473554 2416 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError"
Jan 30 13:08:10.488787 systemd[1]: Started cri-containerd-ff85d38e5a905961c021604f0dc513e8053a7533f50fde8445de345cb71ce05c.scope - libcontainer container ff85d38e5a905961c021604f0dc513e8053a7533f50fde8445de345cb71ce05c.
Jan 30 13:08:10.493483 systemd[1]: Started cri-containerd-2e70e63713059f790ffd55355d5168a9a982c81db19e1e41103551359590e759.scope - libcontainer container 2e70e63713059f790ffd55355d5168a9a982c81db19e1e41103551359590e759.
Jan 30 13:08:10.496161 systemd[1]: Started cri-containerd-31b9ef2b89b922cded456404297100d138337a9ad1ede8c203fb0abfda1be147.scope - libcontainer container 31b9ef2b89b922cded456404297100d138337a9ad1ede8c203fb0abfda1be147.
Jan 30 13:08:10.539668 containerd[1543]: time="2025-01-30T13:08:10.538509500Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:c988230cd0d49eebfaffbefbe8c74a10,Namespace:kube-system,Attempt:0,} returns sandbox id \"2e70e63713059f790ffd55355d5168a9a982c81db19e1e41103551359590e759\""
Jan 30 13:08:10.546111 containerd[1543]: time="2025-01-30T13:08:10.546014524Z" level=info msg="CreateContainer within sandbox \"2e70e63713059f790ffd55355d5168a9a982c81db19e1e41103551359590e759\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Jan 30 13:08:10.549824 containerd[1543]: time="2025-01-30T13:08:10.549791197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:9f56f6f3e316b5edf7ae526d1eecf4c7,Namespace:kube-system,Attempt:0,} returns sandbox id \"31b9ef2b89b922cded456404297100d138337a9ad1ede8c203fb0abfda1be147\""
Jan 30 13:08:10.550879 containerd[1543]: time="2025-01-30T13:08:10.550732384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fa5289f3c0ba7f1736282e713231ffc5,Namespace:kube-system,Attempt:0,} returns sandbox id \"ff85d38e5a905961c021604f0dc513e8053a7533f50fde8445de345cb71ce05c\""
Jan 30 13:08:10.551584 containerd[1543]: time="2025-01-30T13:08:10.551569060Z" level=info msg="CreateContainer within sandbox \"31b9ef2b89b922cded456404297100d138337a9ad1ede8c203fb0abfda1be147\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Jan 30 13:08:10.552104 containerd[1543]: time="2025-01-30T13:08:10.552088717Z" level=info msg="CreateContainer within sandbox \"ff85d38e5a905961c021604f0dc513e8053a7533f50fde8445de345cb71ce05c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Jan 30 13:08:10.557892 kubelet[2416]: E0130 13:08:10.557855 2416 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="1.6s"
Jan 30 13:08:10.760243 kubelet[2416]: I0130 13:08:10.760221 2416 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Jan 30 13:08:10.760496 kubelet[2416]: E0130 13:08:10.760474 2416 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost"
Jan 30 13:08:10.813753 containerd[1543]: time="2025-01-30T13:08:10.813631632Z" level=info msg="CreateContainer within sandbox \"31b9ef2b89b922cded456404297100d138337a9ad1ede8c203fb0abfda1be147\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1cea8c8f7598dd19e4bbc2bf5618d3e88b42ae3aa749fde80a82015ad86284c8\""
Jan 30 13:08:10.814479 containerd[1543]: time="2025-01-30T13:08:10.814448763Z" level=info msg="StartContainer for \"1cea8c8f7598dd19e4bbc2bf5618d3e88b42ae3aa749fde80a82015ad86284c8\""
Jan 30 13:08:10.826485 containerd[1543]: time="2025-01-30T13:08:10.826230664Z" level=info msg="CreateContainer within sandbox \"2e70e63713059f790ffd55355d5168a9a982c81db19e1e41103551359590e759\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f876bd892c69119880166fb9174d343d093ed3f48a103b8a809d193b881cb40f\""
Jan 30 13:08:10.826485 containerd[1543]: time="2025-01-30T13:08:10.826381713Z" level=info msg="CreateContainer within sandbox \"ff85d38e5a905961c021604f0dc513e8053a7533f50fde8445de345cb71ce05c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ee9459e386d801baf184cef978823ceb8f322acb6fa3dd3e1ae7ac208df70cf3\""
Jan 30 13:08:10.827004 containerd[1543]: time="2025-01-30T13:08:10.826989494Z" level=info msg="StartContainer for \"ee9459e386d801baf184cef978823ceb8f322acb6fa3dd3e1ae7ac208df70cf3\""
Jan 30 13:08:10.828512 containerd[1543]: time="2025-01-30T13:08:10.827838885Z" level=info msg="StartContainer for \"f876bd892c69119880166fb9174d343d093ed3f48a103b8a809d193b881cb40f\""
Jan 30 13:08:10.847592 systemd[1]: Started cri-containerd-1cea8c8f7598dd19e4bbc2bf5618d3e88b42ae3aa749fde80a82015ad86284c8.scope - libcontainer container 1cea8c8f7598dd19e4bbc2bf5618d3e88b42ae3aa749fde80a82015ad86284c8.
Jan 30 13:08:10.850462 systemd[1]: Started cri-containerd-ee9459e386d801baf184cef978823ceb8f322acb6fa3dd3e1ae7ac208df70cf3.scope - libcontainer container ee9459e386d801baf184cef978823ceb8f322acb6fa3dd3e1ae7ac208df70cf3.
Jan 30 13:08:10.854301 systemd[1]: Started cri-containerd-f876bd892c69119880166fb9174d343d093ed3f48a103b8a809d193b881cb40f.scope - libcontainer container f876bd892c69119880166fb9174d343d093ed3f48a103b8a809d193b881cb40f.
Jan 30 13:08:10.912704 containerd[1543]: time="2025-01-30T13:08:10.912669522Z" level=info msg="StartContainer for \"1cea8c8f7598dd19e4bbc2bf5618d3e88b42ae3aa749fde80a82015ad86284c8\" returns successfully"
Jan 30 13:08:10.912857 containerd[1543]: time="2025-01-30T13:08:10.912847598Z" level=info msg="StartContainer for \"f876bd892c69119880166fb9174d343d093ed3f48a103b8a809d193b881cb40f\" returns successfully"
Jan 30 13:08:10.913131 containerd[1543]: time="2025-01-30T13:08:10.912935128Z" level=info msg="StartContainer for \"ee9459e386d801baf184cef978823ceb8f322acb6fa3dd3e1ae7ac208df70cf3\" returns successfully"
Jan 30 13:08:11.072743 kubelet[2416]: E0130 13:08:11.072658 2416 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.106:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError"
Jan 30 13:08:12.362481 kubelet[2416]: I0130 13:08:12.362153 2416 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Jan 30 13:08:12.574795 kubelet[2416]: E0130 13:08:12.574728 2416 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Jan 30 13:08:12.734649 kubelet[2416]: I0130 13:08:12.734513 2416 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Jan 30 13:08:12.734649 kubelet[2416]: E0130 13:08:12.734551 2416 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
Jan 30 13:08:12.742572 kubelet[2416]: E0130 13:08:12.742537 2416 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 30 13:08:12.842634 kubelet[2416]: E0130 13:08:12.842606 2416 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 30 13:08:12.943161 kubelet[2416]: E0130 13:08:12.943070 2416 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Jan 30 13:08:13.092823 kubelet[2416]: I0130 13:08:13.092730 2416 apiserver.go:52] "Watching apiserver"
Jan 30 13:08:13.151054 kubelet[2416]: I0130 13:08:13.151010 2416 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 30 13:08:14.394132 systemd[1]: Reloading requested from client PID 2690 ('systemctl') (unit session-9.scope)...
Jan 30 13:08:14.394148 systemd[1]: Reloading...
Jan 30 13:08:14.450716 zram_generator::config[2727]: No configuration found.
Jan 30 13:08:14.530025 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 30 13:08:14.546827 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 30 13:08:14.610485 systemd[1]: Reloading finished in 216 ms.
Jan 30 13:08:14.635241 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 30 13:08:14.642575 systemd[1]: kubelet.service: Deactivated successfully.
Jan 30 13:08:14.642790 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 30 13:08:14.647921 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 30 13:08:15.022399 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 30 13:08:15.027466 (kubelet)[2795]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 30 13:08:15.079576 kubelet[2795]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 13:08:15.079576 kubelet[2795]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 30 13:08:15.079576 kubelet[2795]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 13:08:15.081030 kubelet[2795]: I0130 13:08:15.080728 2795 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 30 13:08:15.091507 kubelet[2795]: I0130 13:08:15.090238 2795 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Jan 30 13:08:15.091507 kubelet[2795]: I0130 13:08:15.090261 2795 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 30 13:08:15.091507 kubelet[2795]: I0130 13:08:15.090548 2795 server.go:929] "Client rotation is on, will bootstrap in background"
Jan 30 13:08:15.092464 kubelet[2795]: I0130 13:08:15.092446 2795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 30 13:08:15.097902 kubelet[2795]: I0130 13:08:15.097879 2795 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 30 13:08:15.103283 kubelet[2795]: E0130 13:08:15.103251 2795 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Jan 30 13:08:15.103653 kubelet[2795]: I0130 13:08:15.103626 2795 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Jan 30 13:08:15.106012 kubelet[2795]: I0130 13:08:15.105995 2795 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jan 30 13:08:15.106707 kubelet[2795]: I0130 13:08:15.106188 2795 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jan 30 13:08:15.106707 kubelet[2795]: I0130 13:08:15.106299 2795 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 30 13:08:15.106707 kubelet[2795]: I0130 13:08:15.106323 2795 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 30 13:08:15.106707 kubelet[2795]: I0130 13:08:15.106477 2795 topology_manager.go:138] "Creating topology manager with none policy"
Jan 30 13:08:15.106916 kubelet[2795]: I0130 13:08:15.106486 2795 container_manager_linux.go:300] "Creating device plugin manager"
Jan 30 13:08:15.106916 kubelet[2795]: I0130 13:08:15.106510 2795 state_mem.go:36] "Initialized new in-memory state store"
Jan 30 13:08:15.107260 kubelet[2795]: I0130 13:08:15.107250 2795 kubelet.go:408] "Attempting to sync node with API server"
Jan 30 13:08:15.107323 kubelet[2795]: I0130 13:08:15.107313 2795 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 30 13:08:15.107397 kubelet[2795]: I0130 13:08:15.107381 2795 kubelet.go:314] "Adding apiserver pod source"
Jan 30 13:08:15.107457 kubelet[2795]: I0130 13:08:15.107447 2795 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 30 13:08:15.108349 kubelet[2795]: I0130 13:08:15.108338 2795 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Jan 30 13:08:15.108843 kubelet[2795]: I0130 13:08:15.108834 2795 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 30 13:08:15.109352 kubelet[2795]: I0130 13:08:15.109343 2795 server.go:1269] "Started kubelet"
Jan 30 13:08:15.124909 kubelet[2795]: I0130 13:08:15.124380 2795 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 30 13:08:15.135107 kubelet[2795]: I0130 13:08:15.135066 2795 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jan 30 13:08:15.136504 kubelet[2795]: I0130 13:08:15.136471 2795 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jan 30 13:08:15.137582 kubelet[2795]: I0130 13:08:15.137570 2795 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Jan 30 13:08:15.138019 kubelet[2795]: I0130 13:08:15.138009 2795 reconciler.go:26] "Reconciler: start to sync state"
Jan 30 13:08:15.138564 kubelet[2795]: I0130 13:08:15.138549 2795 server.go:460] "Adding debug handlers to kubelet server"
Jan 30 13:08:15.140132 kubelet[2795]: I0130 13:08:15.140094 2795 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 30 13:08:15.140258 kubelet[2795]: I0130 13:08:15.140243 2795 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 30 13:08:15.140416 kubelet[2795]: I0130 13:08:15.140400 2795 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jan 30 13:08:15.145737 kubelet[2795]: I0130 13:08:15.145558 2795 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 30 13:08:15.146419 kubelet[2795]: I0130 13:08:15.146404 2795 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 30 13:08:15.153034 kubelet[2795]: I0130 13:08:15.149324 2795 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 30 13:08:15.153129 kubelet[2795]: I0130 13:08:15.153047 2795 kubelet.go:2321] "Starting kubelet main sync loop"
Jan 30 13:08:15.153129 kubelet[2795]: E0130 13:08:15.153077 2795 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 30 13:08:15.154788 kubelet[2795]: E0130 13:08:15.153633 2795 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jan 30 13:08:15.155267 kubelet[2795]: I0130 13:08:15.155118 2795 factory.go:221] Registration of the containerd container factory successfully
Jan 30 13:08:15.155267 kubelet[2795]: I0130 13:08:15.155133 2795 factory.go:221] Registration of the systemd container factory successfully
Jan 30 13:08:15.155267 kubelet[2795]: I0130 13:08:15.155221 2795 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 30 13:08:15.191414 kubelet[2795]: I0130 13:08:15.191394 2795 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jan 30 13:08:15.191414 kubelet[2795]: I0130 13:08:15.191409 2795 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jan 30 13:08:15.191531 kubelet[2795]: I0130 13:08:15.191423 2795 state_mem.go:36] "Initialized new in-memory state store"
Jan 30 13:08:15.191573 kubelet[2795]: I0130 13:08:15.191552 2795 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jan 30 13:08:15.191594 kubelet[2795]: I0130 13:08:15.191572 2795 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jan 30 13:08:15.191594 kubelet[2795]: I0130 13:08:15.191586 2795 policy_none.go:49] "None policy: Start"
Jan 30 13:08:15.191935 kubelet[2795]: I0130 13:08:15.191924 2795 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 30 13:08:15.191935 kubelet[2795]: I0130 13:08:15.191936 2795 state_mem.go:35] "Initializing new in-memory state store"
Jan 30 13:08:15.192022 kubelet[2795]: I0130 13:08:15.192011 2795 state_mem.go:75] "Updated machine memory state"
Jan 30 13:08:15.196317 kubelet[2795]: I0130 13:08:15.196294 2795 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 30 13:08:15.196421 kubelet[2795]: I0130 13:08:15.196410 2795 eviction_manager.go:189] "Eviction manager: starting control loop"
Jan 30 13:08:15.196492 kubelet[2795]: I0130 13:08:15.196420 2795 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 30 13:08:15.198594 kubelet[2795]: I0130 13:08:15.198579 2795 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 30 13:08:15.301731 kubelet[2795]: I0130 13:08:15.300947 2795 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Jan 30 13:08:15.310335 kubelet[2795]: I0130 13:08:15.310313 2795 kubelet_node_status.go:111] "Node was previously registered" node="localhost"
Jan 30 13:08:15.310422 kubelet[2795]: I0130 13:08:15.310370 2795 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Jan 30 13:08:15.439252 kubelet[2795]: I0130 13:08:15.439206 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost"
Jan 30 13:08:15.439602 kubelet[2795]: I0130 13:08:15.439411 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost"
Jan 30 13:08:15.439602 kubelet[2795]: I0130 13:08:15.439440 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c988230cd0d49eebfaffbefbe8c74a10-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"c988230cd0d49eebfaffbefbe8c74a10\") " pod="kube-system/kube-scheduler-localhost"
Jan 30 13:08:15.439602 kubelet[2795]: I0130 13:08:15.439481 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9f56f6f3e316b5edf7ae526d1eecf4c7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"9f56f6f3e316b5edf7ae526d1eecf4c7\") " pod="kube-system/kube-apiserver-localhost"
Jan 30 13:08:15.439602 kubelet[2795]: I0130 13:08:15.439495 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9f56f6f3e316b5edf7ae526d1eecf4c7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"9f56f6f3e316b5edf7ae526d1eecf4c7\") " pod="kube-system/kube-apiserver-localhost"
Jan 30 13:08:15.439602 kubelet[2795]: I0130 13:08:15.439509 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9f56f6f3e316b5edf7ae526d1eecf4c7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"9f56f6f3e316b5edf7ae526d1eecf4c7\") " pod="kube-system/kube-apiserver-localhost"
Jan 30 13:08:15.439757 kubelet[2795]: I0130 13:08:15.439539 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost"
Jan 30 13:08:15.439757 kubelet[2795]: I0130 13:08:15.439560 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost"
Jan 30 13:08:15.439757 kubelet[2795]: I0130 13:08:15.439574 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fa5289f3c0ba7f1736282e713231ffc5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fa5289f3c0ba7f1736282e713231ffc5\") " pod="kube-system/kube-controller-manager-localhost"
Jan 30 13:08:16.116775 kubelet[2795]: I0130 13:08:16.116729 2795 apiserver.go:52] "Watching apiserver"
Jan 30 13:08:16.138809 kubelet[2795]: I0130 13:08:16.138747 2795 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 30 13:08:16.192079 kubelet[2795]: E0130 13:08:16.192009 2795 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Jan 30 13:08:16.209524 kubelet[2795]: I0130 13:08:16.209414 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.209400903 podStartE2EDuration="1.209400903s" podCreationTimestamp="2025-01-30 13:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 13:08:16.200063691 +0000 UTC m=+1.156732827" watchObservedRunningTime="2025-01-30 13:08:16.209400903 +0000 UTC m=+1.166070043"
Jan 30 13:08:16.214510 kubelet[2795]: I0130 13:08:16.214467 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.214452884 podStartE2EDuration="1.214452884s" podCreationTimestamp="2025-01-30 13:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 13:08:16.209368109 +0000 UTC m=+1.166037254" watchObservedRunningTime="2025-01-30 13:08:16.214452884 +0000 UTC m=+1.171122022"
Jan 30 13:08:16.226584 kubelet[2795]: I0130 13:08:16.226487 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.226474867 podStartE2EDuration="1.226474867s" podCreationTimestamp="2025-01-30 13:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 13:08:16.214659134 +0000 UTC m=+1.171328279" watchObservedRunningTime="2025-01-30 13:08:16.226474867 +0000 UTC m=+1.183144007"
Jan 30 13:08:18.895973 kubelet[2795]: I0130 13:08:18.895747 2795 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jan 30 13:08:18.896263 containerd[1543]: time="2025-01-30T13:08:18.895923666Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jan 30 13:08:18.897099 kubelet[2795]: I0130 13:08:18.896496 2795 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Jan 30 13:08:19.674911 systemd[1]: Created slice kubepods-besteffort-poda65a3ed4_2064_4511_b5b5_10f9324e8117.slice - libcontainer container kubepods-besteffort-poda65a3ed4_2064_4511_b5b5_10f9324e8117.slice.
Jan 30 13:08:19.768823 kubelet[2795]: I0130 13:08:19.768792 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a65a3ed4-2064-4511-b5b5-10f9324e8117-kube-proxy\") pod \"kube-proxy-kdmbc\" (UID: \"a65a3ed4-2064-4511-b5b5-10f9324e8117\") " pod="kube-system/kube-proxy-kdmbc"
Jan 30 13:08:19.768823 kubelet[2795]: I0130 13:08:19.768817 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a65a3ed4-2064-4511-b5b5-10f9324e8117-xtables-lock\") pod \"kube-proxy-kdmbc\" (UID: \"a65a3ed4-2064-4511-b5b5-10f9324e8117\") " pod="kube-system/kube-proxy-kdmbc"
Jan 30 13:08:19.769725 kubelet[2795]: I0130 13:08:19.768837 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a65a3ed4-2064-4511-b5b5-10f9324e8117-lib-modules\") pod \"kube-proxy-kdmbc\" (UID: \"a65a3ed4-2064-4511-b5b5-10f9324e8117\") " pod="kube-system/kube-proxy-kdmbc"
Jan 30 13:08:19.769725 kubelet[2795]: I0130 13:08:19.768851 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs2nj\" (UniqueName: \"kubernetes.io/projected/a65a3ed4-2064-4511-b5b5-10f9324e8117-kube-api-access-qs2nj\") pod \"kube-proxy-kdmbc\" (UID: \"a65a3ed4-2064-4511-b5b5-10f9324e8117\") " pod="kube-system/kube-proxy-kdmbc"
Jan 30 13:08:19.778806 sudo[1853]: pam_unix(sudo:session): session closed for user root
Jan 30 13:08:19.779921 sshd[1852]: Connection closed by 139.178.89.65 port 41872
Jan 30 13:08:19.780786 sshd-session[1850]: pam_unix(sshd:session): session closed for user core
Jan 30 13:08:19.782650 systemd-logind[1521]: Session 9 logged out. Waiting for processes to exit.
Jan 30 13:08:19.783290 systemd[1]: sshd@6-139.178.70.106:22-139.178.89.65:41872.service: Deactivated successfully.
Jan 30 13:08:19.784846 systemd[1]: session-9.scope: Deactivated successfully.
Jan 30 13:08:19.784987 systemd[1]: session-9.scope: Consumed 2.948s CPU time, 138.0M memory peak, 0B memory swap peak.
Jan 30 13:08:19.786180 systemd-logind[1521]: Removed session 9.
Jan 30 13:08:19.858081 systemd[1]: Created slice kubepods-besteffort-pod3269e517_8d9f_423c_837a_21a631ef6341.slice - libcontainer container kubepods-besteffort-pod3269e517_8d9f_423c_837a_21a631ef6341.slice.
Jan 30 13:08:19.869221 kubelet[2795]: I0130 13:08:19.869182 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3269e517-8d9f-423c-837a-21a631ef6341-var-lib-calico\") pod \"tigera-operator-76c4976dd7-nv956\" (UID: \"3269e517-8d9f-423c-837a-21a631ef6341\") " pod="tigera-operator/tigera-operator-76c4976dd7-nv956"
Jan 30 13:08:19.869221 kubelet[2795]: I0130 13:08:19.869219 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44d5\" (UniqueName: \"kubernetes.io/projected/3269e517-8d9f-423c-837a-21a631ef6341-kube-api-access-p44d5\") pod \"tigera-operator-76c4976dd7-nv956\" (UID: \"3269e517-8d9f-423c-837a-21a631ef6341\") " pod="tigera-operator/tigera-operator-76c4976dd7-nv956"
Jan 30 13:08:19.982300 containerd[1543]: time="2025-01-30T13:08:19.982206138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kdmbc,Uid:a65a3ed4-2064-4511-b5b5-10f9324e8117,Namespace:kube-system,Attempt:0,}"
Jan 30 13:08:20.012570 containerd[1543]: time="2025-01-30T13:08:20.012461934Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 30 13:08:20.012570 containerd[1543]: time="2025-01-30T13:08:20.012513436Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 30 13:08:20.012570 containerd[1543]: time="2025-01-30T13:08:20.012532423Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:20.012789 containerd[1543]: time="2025-01-30T13:08:20.012610585Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:20.028840 systemd[1]: Started cri-containerd-e238296b620ddc8e591755ba0efac8894acce7e3b42428722c869caaa7ddac9a.scope - libcontainer container e238296b620ddc8e591755ba0efac8894acce7e3b42428722c869caaa7ddac9a.
Jan 30 13:08:20.044922 containerd[1543]: time="2025-01-30T13:08:20.044892121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kdmbc,Uid:a65a3ed4-2064-4511-b5b5-10f9324e8117,Namespace:kube-system,Attempt:0,} returns sandbox id \"e238296b620ddc8e591755ba0efac8894acce7e3b42428722c869caaa7ddac9a\""
Jan 30 13:08:20.046727 containerd[1543]: time="2025-01-30T13:08:20.046704415Z" level=info msg="CreateContainer within sandbox \"e238296b620ddc8e591755ba0efac8894acce7e3b42428722c869caaa7ddac9a\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jan 30 13:08:20.053820 containerd[1543]: time="2025-01-30T13:08:20.053794150Z" level=info msg="CreateContainer within sandbox \"e238296b620ddc8e591755ba0efac8894acce7e3b42428722c869caaa7ddac9a\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"864e3c8a982beb0936b1fb11f5a1c7fe8ac1f1a39caf5b99cfb45374b3831e50\""
Jan 30 13:08:20.054570 containerd[1543]: time="2025-01-30T13:08:20.054516392Z" level=info msg="StartContainer for \"864e3c8a982beb0936b1fb11f5a1c7fe8ac1f1a39caf5b99cfb45374b3831e50\""
Jan 30 13:08:20.078838 systemd[1]: Started cri-containerd-864e3c8a982beb0936b1fb11f5a1c7fe8ac1f1a39caf5b99cfb45374b3831e50.scope - libcontainer container 864e3c8a982beb0936b1fb11f5a1c7fe8ac1f1a39caf5b99cfb45374b3831e50.
Jan 30 13:08:20.101498 containerd[1543]: time="2025-01-30T13:08:20.101466947Z" level=info msg="StartContainer for \"864e3c8a982beb0936b1fb11f5a1c7fe8ac1f1a39caf5b99cfb45374b3831e50\" returns successfully"
Jan 30 13:08:20.160399 containerd[1543]: time="2025-01-30T13:08:20.160254850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4976dd7-nv956,Uid:3269e517-8d9f-423c-837a-21a631ef6341,Namespace:tigera-operator,Attempt:0,}"
Jan 30 13:08:20.197074 kubelet[2795]: I0130 13:08:20.197038 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-kdmbc" podStartSLOduration=1.197025064 podStartE2EDuration="1.197025064s" podCreationTimestamp="2025-01-30 13:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 13:08:20.196574749 +0000 UTC m=+5.153243894" watchObservedRunningTime="2025-01-30 13:08:20.197025064 +0000 UTC m=+5.153694202"
Jan 30 13:08:20.239951 containerd[1543]: time="2025-01-30T13:08:20.239797196Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 30 13:08:20.239951 containerd[1543]: time="2025-01-30T13:08:20.239839286Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 30 13:08:20.239951 containerd[1543]: time="2025-01-30T13:08:20.239855281Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:20.240491 containerd[1543]: time="2025-01-30T13:08:20.240104904Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 13:08:20.255841 systemd[1]: Started cri-containerd-3a5ac4458d1b23398e7b03d04f163c1821809fc7ab87cc2e7644d825eaba4eeb.scope - libcontainer container 3a5ac4458d1b23398e7b03d04f163c1821809fc7ab87cc2e7644d825eaba4eeb.
Jan 30 13:08:20.288154 containerd[1543]: time="2025-01-30T13:08:20.288124454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4976dd7-nv956,Uid:3269e517-8d9f-423c-837a-21a631ef6341,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3a5ac4458d1b23398e7b03d04f163c1821809fc7ab87cc2e7644d825eaba4eeb\""
Jan 30 13:08:20.289779 containerd[1543]: time="2025-01-30T13:08:20.289759997Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\""
Jan 30 13:08:21.902901 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3777136715.mount: Deactivated successfully.
Jan 30 13:08:22.319351 containerd[1543]: time="2025-01-30T13:08:22.319279435Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:22.320236 containerd[1543]: time="2025-01-30T13:08:22.320198513Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21762497" Jan 30 13:08:22.320452 containerd[1543]: time="2025-01-30T13:08:22.320431438Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:22.322386 containerd[1543]: time="2025-01-30T13:08:22.322350286Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:22.323323 containerd[1543]: time="2025-01-30T13:08:22.323293593Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 2.033511975s" Jan 30 13:08:22.323378 containerd[1543]: time="2025-01-30T13:08:22.323323160Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Jan 30 13:08:22.325239 containerd[1543]: time="2025-01-30T13:08:22.325208227Z" level=info msg="CreateContainer within sandbox \"3a5ac4458d1b23398e7b03d04f163c1821809fc7ab87cc2e7644d825eaba4eeb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 30 13:08:22.336972 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3465861439.mount: Deactivated successfully. 
Jan 30 13:08:22.349534 containerd[1543]: time="2025-01-30T13:08:22.349500648Z" level=info msg="CreateContainer within sandbox \"3a5ac4458d1b23398e7b03d04f163c1821809fc7ab87cc2e7644d825eaba4eeb\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"535fa113a46a93324c13bc811173e0bff0c09ad928fc44a5f9a8c5c6b7eb720a\"" Jan 30 13:08:22.350376 containerd[1543]: time="2025-01-30T13:08:22.350219673Z" level=info msg="StartContainer for \"535fa113a46a93324c13bc811173e0bff0c09ad928fc44a5f9a8c5c6b7eb720a\"" Jan 30 13:08:22.381848 systemd[1]: Started cri-containerd-535fa113a46a93324c13bc811173e0bff0c09ad928fc44a5f9a8c5c6b7eb720a.scope - libcontainer container 535fa113a46a93324c13bc811173e0bff0c09ad928fc44a5f9a8c5c6b7eb720a. Jan 30 13:08:22.412487 containerd[1543]: time="2025-01-30T13:08:22.412451418Z" level=info msg="StartContainer for \"535fa113a46a93324c13bc811173e0bff0c09ad928fc44a5f9a8c5c6b7eb720a\" returns successfully" Jan 30 13:08:25.380171 kubelet[2795]: I0130 13:08:25.380129 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-76c4976dd7-nv956" podStartSLOduration=4.344786781 podStartE2EDuration="6.380115876s" podCreationTimestamp="2025-01-30 13:08:19 +0000 UTC" firstStartedPulling="2025-01-30 13:08:20.28903781 +0000 UTC m=+5.245706947" lastFinishedPulling="2025-01-30 13:08:22.324366904 +0000 UTC m=+7.281036042" observedRunningTime="2025-01-30 13:08:23.198864804 +0000 UTC m=+8.155533950" watchObservedRunningTime="2025-01-30 13:08:25.380115876 +0000 UTC m=+10.336785017" Jan 30 13:08:25.387525 systemd[1]: Created slice kubepods-besteffort-poda913293d_65b6_4236_9dfa_0a30a5da0655.slice - libcontainer container kubepods-besteffort-poda913293d_65b6_4236_9dfa_0a30a5da0655.slice. 
Jan 30 13:08:25.403784 kubelet[2795]: I0130 13:08:25.403752 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a913293d-65b6-4236-9dfa-0a30a5da0655-tigera-ca-bundle\") pod \"calico-typha-6f7f9fdd8b-k6jkj\" (UID: \"a913293d-65b6-4236-9dfa-0a30a5da0655\") " pod="calico-system/calico-typha-6f7f9fdd8b-k6jkj" Jan 30 13:08:25.403784 kubelet[2795]: I0130 13:08:25.403788 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbml\" (UniqueName: \"kubernetes.io/projected/a913293d-65b6-4236-9dfa-0a30a5da0655-kube-api-access-lhbml\") pod \"calico-typha-6f7f9fdd8b-k6jkj\" (UID: \"a913293d-65b6-4236-9dfa-0a30a5da0655\") " pod="calico-system/calico-typha-6f7f9fdd8b-k6jkj" Jan 30 13:08:25.403903 kubelet[2795]: I0130 13:08:25.403802 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a913293d-65b6-4236-9dfa-0a30a5da0655-typha-certs\") pod \"calico-typha-6f7f9fdd8b-k6jkj\" (UID: \"a913293d-65b6-4236-9dfa-0a30a5da0655\") " pod="calico-system/calico-typha-6f7f9fdd8b-k6jkj" Jan 30 13:08:25.443058 systemd[1]: Created slice kubepods-besteffort-poda461b2ec_2e5a_4259_99f9_f22c75acb964.slice - libcontainer container kubepods-besteffort-poda461b2ec_2e5a_4259_99f9_f22c75acb964.slice. 
Jan 30 13:08:25.505124 kubelet[2795]: I0130 13:08:25.504783 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-var-run-calico\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.505124 kubelet[2795]: I0130 13:08:25.504815 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-cni-net-dir\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.505124 kubelet[2795]: I0130 13:08:25.504829 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-cni-log-dir\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.505124 kubelet[2795]: I0130 13:08:25.504844 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75k6m\" (UniqueName: \"kubernetes.io/projected/a461b2ec-2e5a-4259-99f9-f22c75acb964-kube-api-access-75k6m\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.505124 kubelet[2795]: I0130 13:08:25.504858 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a461b2ec-2e5a-4259-99f9-f22c75acb964-tigera-ca-bundle\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.509238 kubelet[2795]: I0130 
13:08:25.504870 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-xtables-lock\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.509238 kubelet[2795]: I0130 13:08:25.504878 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-cni-bin-dir\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.509238 kubelet[2795]: I0130 13:08:25.504899 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-lib-modules\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.509238 kubelet[2795]: I0130 13:08:25.504911 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-policysync\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.509238 kubelet[2795]: I0130 13:08:25.504924 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a461b2ec-2e5a-4259-99f9-f22c75acb964-node-certs\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.509373 kubelet[2795]: I0130 13:08:25.504935 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-var-lib-calico\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.509373 kubelet[2795]: I0130 13:08:25.504943 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a461b2ec-2e5a-4259-99f9-f22c75acb964-flexvol-driver-host\") pod \"calico-node-vznh7\" (UID: \"a461b2ec-2e5a-4259-99f9-f22c75acb964\") " pod="calico-system/calico-node-vznh7" Jan 30 13:08:25.586720 kubelet[2795]: E0130 13:08:25.585922 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:25.607047 kubelet[2795]: I0130 13:08:25.605582 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjv8\" (UniqueName: \"kubernetes.io/projected/0d14c0df-65f9-4785-8227-ecaaf26cf401-kube-api-access-zdjv8\") pod \"csi-node-driver-gj66t\" (UID: \"0d14c0df-65f9-4785-8227-ecaaf26cf401\") " pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:25.607047 kubelet[2795]: I0130 13:08:25.605664 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d14c0df-65f9-4785-8227-ecaaf26cf401-kubelet-dir\") pod \"csi-node-driver-gj66t\" (UID: \"0d14c0df-65f9-4785-8227-ecaaf26cf401\") " pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:25.607704 kubelet[2795]: I0130 13:08:25.607280 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/0d14c0df-65f9-4785-8227-ecaaf26cf401-socket-dir\") pod \"csi-node-driver-gj66t\" (UID: \"0d14c0df-65f9-4785-8227-ecaaf26cf401\") " pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:25.607704 kubelet[2795]: I0130 13:08:25.607310 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0d14c0df-65f9-4785-8227-ecaaf26cf401-varrun\") pod \"csi-node-driver-gj66t\" (UID: \"0d14c0df-65f9-4785-8227-ecaaf26cf401\") " pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:25.607704 kubelet[2795]: I0130 13:08:25.607340 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d14c0df-65f9-4785-8227-ecaaf26cf401-registration-dir\") pod \"csi-node-driver-gj66t\" (UID: \"0d14c0df-65f9-4785-8227-ecaaf26cf401\") " pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:25.609123 kubelet[2795]: E0130 13:08:25.609094 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.609187 kubelet[2795]: W0130 13:08:25.609129 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.609907 kubelet[2795]: E0130 13:08:25.609897 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.609964 kubelet[2795]: W0130 13:08:25.609956 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.610194 kubelet[2795]: E0130 13:08:25.610183 2795 plugins.go:691] "Error dynamically probing 
plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.610249 kubelet[2795]: E0130 13:08:25.610203 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.612494 kubelet[2795]: E0130 13:08:25.612477 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.612494 kubelet[2795]: W0130 13:08:25.612492 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.612636 kubelet[2795]: E0130 13:08:25.612526 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.612777 kubelet[2795]: E0130 13:08:25.612702 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.612777 kubelet[2795]: W0130 13:08:25.612711 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.612777 kubelet[2795]: E0130 13:08:25.612721 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.613377 kubelet[2795]: E0130 13:08:25.613276 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.613377 kubelet[2795]: W0130 13:08:25.613287 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.613377 kubelet[2795]: E0130 13:08:25.613301 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.613560 kubelet[2795]: E0130 13:08:25.613519 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.613560 kubelet[2795]: W0130 13:08:25.613525 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.613560 kubelet[2795]: E0130 13:08:25.613531 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.614855 kubelet[2795]: E0130 13:08:25.613846 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.614855 kubelet[2795]: W0130 13:08:25.613853 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.614855 kubelet[2795]: E0130 13:08:25.613859 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.615070 kubelet[2795]: E0130 13:08:25.614966 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.615070 kubelet[2795]: W0130 13:08:25.614977 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.615070 kubelet[2795]: E0130 13:08:25.614990 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.615228 kubelet[2795]: E0130 13:08:25.615221 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.615709 kubelet[2795]: W0130 13:08:25.615635 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.615809 kubelet[2795]: E0130 13:08:25.615800 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.615895 kubelet[2795]: W0130 13:08:25.615850 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.616039 kubelet[2795]: E0130 13:08:25.615977 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.616039 kubelet[2795]: W0130 13:08:25.615984 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.616300 kubelet[2795]: E0130 13:08:25.616176 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.616300 kubelet[2795]: E0130 13:08:25.616194 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.616300 kubelet[2795]: E0130 13:08:25.616204 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.616300 kubelet[2795]: E0130 13:08:25.616221 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.616300 kubelet[2795]: W0130 13:08:25.616226 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.616300 kubelet[2795]: E0130 13:08:25.616232 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.617144 kubelet[2795]: E0130 13:08:25.616964 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.617144 kubelet[2795]: W0130 13:08:25.617007 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.617144 kubelet[2795]: E0130 13:08:25.617037 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.617455 kubelet[2795]: E0130 13:08:25.617448 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.617523 kubelet[2795]: W0130 13:08:25.617492 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.617523 kubelet[2795]: E0130 13:08:25.617502 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.624819 kubelet[2795]: E0130 13:08:25.624746 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.624819 kubelet[2795]: W0130 13:08:25.624766 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.624819 kubelet[2795]: E0130 13:08:25.624783 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.690672 containerd[1543]: time="2025-01-30T13:08:25.690646694Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f7f9fdd8b-k6jkj,Uid:a913293d-65b6-4236-9dfa-0a30a5da0655,Namespace:calico-system,Attempt:0,}" Jan 30 13:08:25.708721 kubelet[2795]: E0130 13:08:25.708605 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.708721 kubelet[2795]: W0130 13:08:25.708621 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.708721 kubelet[2795]: E0130 13:08:25.708637 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.709160 kubelet[2795]: E0130 13:08:25.708790 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.709160 kubelet[2795]: W0130 13:08:25.708796 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.709160 kubelet[2795]: E0130 13:08:25.708805 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.709160 kubelet[2795]: E0130 13:08:25.708942 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.709160 kubelet[2795]: W0130 13:08:25.708949 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.709160 kubelet[2795]: E0130 13:08:25.708959 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.709160 kubelet[2795]: E0130 13:08:25.709057 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.709160 kubelet[2795]: W0130 13:08:25.709064 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.709160 kubelet[2795]: E0130 13:08:25.709076 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.717651 kubelet[2795]: E0130 13:08:25.709382 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.717651 kubelet[2795]: W0130 13:08:25.709389 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.717651 kubelet[2795]: E0130 13:08:25.709402 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.717651 kubelet[2795]: E0130 13:08:25.709550 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.717651 kubelet[2795]: W0130 13:08:25.709555 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.717651 kubelet[2795]: E0130 13:08:25.709563 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.717651 kubelet[2795]: E0130 13:08:25.709700 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.717651 kubelet[2795]: W0130 13:08:25.709707 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.717651 kubelet[2795]: E0130 13:08:25.709718 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.717651 kubelet[2795]: E0130 13:08:25.709881 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.717871 kubelet[2795]: W0130 13:08:25.709899 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.717871 kubelet[2795]: E0130 13:08:25.709911 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.717871 kubelet[2795]: E0130 13:08:25.710033 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.717871 kubelet[2795]: W0130 13:08:25.710040 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.717871 kubelet[2795]: E0130 13:08:25.710048 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.717871 kubelet[2795]: E0130 13:08:25.710156 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.717871 kubelet[2795]: W0130 13:08:25.710174 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.717871 kubelet[2795]: E0130 13:08:25.710182 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.717871 kubelet[2795]: E0130 13:08:25.710310 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.717871 kubelet[2795]: W0130 13:08:25.710315 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.719895 kubelet[2795]: E0130 13:08:25.710335 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.719895 kubelet[2795]: E0130 13:08:25.710541 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.719895 kubelet[2795]: W0130 13:08:25.710546 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.719895 kubelet[2795]: E0130 13:08:25.710653 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.719895 kubelet[2795]: E0130 13:08:25.710759 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.719895 kubelet[2795]: W0130 13:08:25.710766 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.719895 kubelet[2795]: E0130 13:08:25.710781 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.719895 kubelet[2795]: E0130 13:08:25.710926 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.719895 kubelet[2795]: W0130 13:08:25.710931 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.719895 kubelet[2795]: E0130 13:08:25.711003 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.720951 kubelet[2795]: E0130 13:08:25.711066 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.720951 kubelet[2795]: W0130 13:08:25.711080 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.720951 kubelet[2795]: E0130 13:08:25.711107 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.720951 kubelet[2795]: E0130 13:08:25.711275 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.720951 kubelet[2795]: W0130 13:08:25.711298 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.720951 kubelet[2795]: E0130 13:08:25.711431 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.720951 kubelet[2795]: E0130 13:08:25.711468 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.720951 kubelet[2795]: W0130 13:08:25.711473 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.720951 kubelet[2795]: E0130 13:08:25.711483 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.720951 kubelet[2795]: E0130 13:08:25.711599 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721163 kubelet[2795]: W0130 13:08:25.711604 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721163 kubelet[2795]: E0130 13:08:25.711618 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.721163 kubelet[2795]: E0130 13:08:25.711756 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721163 kubelet[2795]: W0130 13:08:25.711761 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721163 kubelet[2795]: E0130 13:08:25.711767 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.721163 kubelet[2795]: E0130 13:08:25.711903 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721163 kubelet[2795]: W0130 13:08:25.711908 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721163 kubelet[2795]: E0130 13:08:25.711918 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.721163 kubelet[2795]: E0130 13:08:25.712090 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721163 kubelet[2795]: W0130 13:08:25.712096 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721735 kubelet[2795]: E0130 13:08:25.712117 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.721735 kubelet[2795]: E0130 13:08:25.712212 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721735 kubelet[2795]: W0130 13:08:25.712218 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721735 kubelet[2795]: E0130 13:08:25.712224 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.721735 kubelet[2795]: E0130 13:08:25.712373 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721735 kubelet[2795]: W0130 13:08:25.712380 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721735 kubelet[2795]: E0130 13:08:25.712387 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.721735 kubelet[2795]: E0130 13:08:25.712522 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721735 kubelet[2795]: W0130 13:08:25.712528 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721735 kubelet[2795]: E0130 13:08:25.712534 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:25.721905 kubelet[2795]: E0130 13:08:25.716470 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721905 kubelet[2795]: W0130 13:08:25.716479 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721905 kubelet[2795]: E0130 13:08:25.716488 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.721905 kubelet[2795]: E0130 13:08:25.721133 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:25.721905 kubelet[2795]: W0130 13:08:25.721142 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:25.721905 kubelet[2795]: E0130 13:08:25.721153 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:25.745712 containerd[1543]: time="2025-01-30T13:08:25.745691312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vznh7,Uid:a461b2ec-2e5a-4259-99f9-f22c75acb964,Namespace:calico-system,Attempt:0,}" Jan 30 13:08:25.906424 containerd[1543]: time="2025-01-30T13:08:25.906351108Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 13:08:25.906424 containerd[1543]: time="2025-01-30T13:08:25.906390483Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 13:08:25.906424 containerd[1543]: time="2025-01-30T13:08:25.906398351Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:25.906586 containerd[1543]: time="2025-01-30T13:08:25.906451210Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:25.925798 systemd[1]: Started cri-containerd-998773d75739b20cddaeae20e20016bc772c3f4cc04c3703876c9f8d6127b306.scope - libcontainer container 998773d75739b20cddaeae20e20016bc772c3f4cc04c3703876c9f8d6127b306. Jan 30 13:08:25.957862 containerd[1543]: time="2025-01-30T13:08:25.957792195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f7f9fdd8b-k6jkj,Uid:a913293d-65b6-4236-9dfa-0a30a5da0655,Namespace:calico-system,Attempt:0,} returns sandbox id \"998773d75739b20cddaeae20e20016bc772c3f4cc04c3703876c9f8d6127b306\"" Jan 30 13:08:25.960500 containerd[1543]: time="2025-01-30T13:08:25.960472515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Jan 30 13:08:26.140538 containerd[1543]: time="2025-01-30T13:08:26.140489882Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 13:08:26.140778 containerd[1543]: time="2025-01-30T13:08:26.140666604Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 13:08:26.140778 containerd[1543]: time="2025-01-30T13:08:26.140707163Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:26.140872 containerd[1543]: time="2025-01-30T13:08:26.140770343Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:26.157766 systemd[1]: Started cri-containerd-b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac.scope - libcontainer container b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac. Jan 30 13:08:26.170766 containerd[1543]: time="2025-01-30T13:08:26.170731682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vznh7,Uid:a461b2ec-2e5a-4259-99f9-f22c75acb964,Namespace:calico-system,Attempt:0,} returns sandbox id \"b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac\"" Jan 30 13:08:27.154516 kubelet[2795]: E0130 13:08:27.154056 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:27.549605 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount734258745.mount: Deactivated successfully. Jan 30 13:08:27.605919 kubelet[2795]: E0130 13:08:27.605879 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.605919 kubelet[2795]: W0130 13:08:27.605907 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.606040 kubelet[2795]: E0130 13:08:27.605925 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.606118 kubelet[2795]: E0130 13:08:27.606096 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.606118 kubelet[2795]: W0130 13:08:27.606106 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.606118 kubelet[2795]: E0130 13:08:27.606115 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.606287 kubelet[2795]: E0130 13:08:27.606237 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.606287 kubelet[2795]: W0130 13:08:27.606244 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.606287 kubelet[2795]: E0130 13:08:27.606251 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.606420 kubelet[2795]: E0130 13:08:27.606358 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.606420 kubelet[2795]: W0130 13:08:27.606363 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.606420 kubelet[2795]: E0130 13:08:27.606369 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.606527 kubelet[2795]: E0130 13:08:27.606494 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.606527 kubelet[2795]: W0130 13:08:27.606499 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.606527 kubelet[2795]: E0130 13:08:27.606505 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.908429 kubelet[2795]: E0130 13:08:27.908401 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.908535 kubelet[2795]: W0130 13:08:27.908418 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.908535 kubelet[2795]: E0130 13:08:27.908458 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.909690 kubelet[2795]: E0130 13:08:27.908706 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.909690 kubelet[2795]: W0130 13:08:27.908720 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.909690 kubelet[2795]: E0130 13:08:27.908736 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.909690 kubelet[2795]: E0130 13:08:27.908974 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.909690 kubelet[2795]: W0130 13:08:27.908984 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.909690 kubelet[2795]: E0130 13:08:27.908995 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.909690 kubelet[2795]: E0130 13:08:27.909228 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.909690 kubelet[2795]: W0130 13:08:27.909238 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.909690 kubelet[2795]: E0130 13:08:27.909270 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.909690 kubelet[2795]: E0130 13:08:27.909445 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.909977 kubelet[2795]: W0130 13:08:27.909452 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.909977 kubelet[2795]: E0130 13:08:27.909459 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.909977 kubelet[2795]: E0130 13:08:27.909607 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.909977 kubelet[2795]: W0130 13:08:27.909613 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.909977 kubelet[2795]: E0130 13:08:27.909621 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.909977 kubelet[2795]: E0130 13:08:27.909807 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.909977 kubelet[2795]: W0130 13:08:27.909815 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.909977 kubelet[2795]: E0130 13:08:27.909841 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.909977 kubelet[2795]: E0130 13:08:27.909974 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.910211 kubelet[2795]: W0130 13:08:27.909996 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.910211 kubelet[2795]: E0130 13:08:27.910004 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.910211 kubelet[2795]: E0130 13:08:27.910157 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.910211 kubelet[2795]: W0130 13:08:27.910165 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.910211 kubelet[2795]: E0130 13:08:27.910171 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.910396 kubelet[2795]: E0130 13:08:27.910346 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.910396 kubelet[2795]: W0130 13:08:27.910353 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.910396 kubelet[2795]: E0130 13:08:27.910359 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.910521 kubelet[2795]: E0130 13:08:27.910507 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.910521 kubelet[2795]: W0130 13:08:27.910516 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.910521 kubelet[2795]: E0130 13:08:27.910522 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.910689 kubelet[2795]: E0130 13:08:27.910665 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.910729 kubelet[2795]: W0130 13:08:27.910698 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.910729 kubelet[2795]: E0130 13:08:27.910707 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.910865 kubelet[2795]: E0130 13:08:27.910851 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.910865 kubelet[2795]: W0130 13:08:27.910860 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.910927 kubelet[2795]: E0130 13:08:27.910867 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:27.911467 kubelet[2795]: E0130 13:08:27.911006 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.911467 kubelet[2795]: W0130 13:08:27.911014 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.911467 kubelet[2795]: E0130 13:08:27.911021 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:27.911467 kubelet[2795]: E0130 13:08:27.911156 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:27.911467 kubelet[2795]: W0130 13:08:27.911162 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:27.911467 kubelet[2795]: E0130 13:08:27.911167 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:28.285052 containerd[1543]: time="2025-01-30T13:08:28.284861790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:28.294053 containerd[1543]: time="2025-01-30T13:08:28.294001385Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Jan 30 13:08:28.298852 containerd[1543]: time="2025-01-30T13:08:28.298811007Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:28.301543 containerd[1543]: time="2025-01-30T13:08:28.301502296Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:28.302235 containerd[1543]: time="2025-01-30T13:08:28.301765085Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest 
\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.341271493s" Jan 30 13:08:28.302235 containerd[1543]: time="2025-01-30T13:08:28.301783671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Jan 30 13:08:28.303300 containerd[1543]: time="2025-01-30T13:08:28.302756206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 30 13:08:28.313020 containerd[1543]: time="2025-01-30T13:08:28.312993774Z" level=info msg="CreateContainer within sandbox \"998773d75739b20cddaeae20e20016bc772c3f4cc04c3703876c9f8d6127b306\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 30 13:08:28.320119 containerd[1543]: time="2025-01-30T13:08:28.320087613Z" level=info msg="CreateContainer within sandbox \"998773d75739b20cddaeae20e20016bc772c3f4cc04c3703876c9f8d6127b306\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"41c4720663cbb35307539049bb835dc0c17249ce4396be3c48a7093c5743ab6c\"" Jan 30 13:08:28.321557 containerd[1543]: time="2025-01-30T13:08:28.321538417Z" level=info msg="StartContainer for \"41c4720663cbb35307539049bb835dc0c17249ce4396be3c48a7093c5743ab6c\"" Jan 30 13:08:28.365822 systemd[1]: Started cri-containerd-41c4720663cbb35307539049bb835dc0c17249ce4396be3c48a7093c5743ab6c.scope - libcontainer container 41c4720663cbb35307539049bb835dc0c17249ce4396be3c48a7093c5743ab6c. 
Jan 30 13:08:28.421286 containerd[1543]: time="2025-01-30T13:08:28.421060173Z" level=info msg="StartContainer for \"41c4720663cbb35307539049bb835dc0c17249ce4396be3c48a7093c5743ab6c\" returns successfully" Jan 30 13:08:29.155111 kubelet[2795]: E0130 13:08:29.154303 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:29.220549 kubelet[2795]: E0130 13:08:29.220525 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.220549 kubelet[2795]: W0130 13:08:29.220543 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.220709 kubelet[2795]: E0130 13:08:29.220556 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.220745 kubelet[2795]: E0130 13:08:29.220733 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.220745 kubelet[2795]: W0130 13:08:29.220740 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.220794 kubelet[2795]: E0130 13:08:29.220748 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.220906 kubelet[2795]: E0130 13:08:29.220893 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.220906 kubelet[2795]: W0130 13:08:29.220904 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.220966 kubelet[2795]: E0130 13:08:29.220913 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.221089 kubelet[2795]: E0130 13:08:29.221076 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.221089 kubelet[2795]: W0130 13:08:29.221087 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.221147 kubelet[2795]: E0130 13:08:29.221095 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.221255 kubelet[2795]: E0130 13:08:29.221240 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.221255 kubelet[2795]: W0130 13:08:29.221250 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.230650 kubelet[2795]: E0130 13:08:29.221257 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.230650 kubelet[2795]: E0130 13:08:29.221377 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.230650 kubelet[2795]: W0130 13:08:29.221383 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.230650 kubelet[2795]: E0130 13:08:29.221389 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.230650 kubelet[2795]: E0130 13:08:29.221571 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.230650 kubelet[2795]: W0130 13:08:29.221578 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.230650 kubelet[2795]: E0130 13:08:29.221584 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.230650 kubelet[2795]: E0130 13:08:29.221754 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.230650 kubelet[2795]: W0130 13:08:29.221766 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.230650 kubelet[2795]: E0130 13:08:29.221774 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.230908 kubelet[2795]: E0130 13:08:29.221922 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.230908 kubelet[2795]: W0130 13:08:29.221929 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.230908 kubelet[2795]: E0130 13:08:29.221935 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.230908 kubelet[2795]: E0130 13:08:29.222135 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.230908 kubelet[2795]: W0130 13:08:29.222142 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.230908 kubelet[2795]: E0130 13:08:29.222149 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.230908 kubelet[2795]: E0130 13:08:29.222670 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.230908 kubelet[2795]: W0130 13:08:29.222698 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.230908 kubelet[2795]: E0130 13:08:29.222707 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.230908 kubelet[2795]: E0130 13:08:29.222830 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.231121 kubelet[2795]: W0130 13:08:29.222836 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.231121 kubelet[2795]: E0130 13:08:29.222861 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.231121 kubelet[2795]: E0130 13:08:29.223085 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.231121 kubelet[2795]: W0130 13:08:29.223103 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.231121 kubelet[2795]: E0130 13:08:29.223111 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.231121 kubelet[2795]: E0130 13:08:29.223237 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.231121 kubelet[2795]: W0130 13:08:29.223243 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.231121 kubelet[2795]: E0130 13:08:29.223264 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.231121 kubelet[2795]: E0130 13:08:29.223415 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.231121 kubelet[2795]: W0130 13:08:29.223424 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.231331 kubelet[2795]: E0130 13:08:29.223431 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.233248 kubelet[2795]: E0130 13:08:29.233233 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.233248 kubelet[2795]: W0130 13:08:29.233245 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.233315 kubelet[2795]: E0130 13:08:29.233254 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.233479 kubelet[2795]: E0130 13:08:29.233407 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.233479 kubelet[2795]: W0130 13:08:29.233416 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.233479 kubelet[2795]: E0130 13:08:29.233422 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.233755 kubelet[2795]: E0130 13:08:29.233568 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.233755 kubelet[2795]: W0130 13:08:29.233582 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.233755 kubelet[2795]: E0130 13:08:29.233600 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.233909 kubelet[2795]: E0130 13:08:29.233900 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.233957 kubelet[2795]: W0130 13:08:29.233949 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.234012 kubelet[2795]: E0130 13:08:29.234004 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.234163 kubelet[2795]: E0130 13:08:29.234146 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.234163 kubelet[2795]: W0130 13:08:29.234158 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.234225 kubelet[2795]: E0130 13:08:29.234169 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.234335 kubelet[2795]: E0130 13:08:29.234319 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.234335 kubelet[2795]: W0130 13:08:29.234330 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.234410 kubelet[2795]: E0130 13:08:29.234344 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.234505 kubelet[2795]: E0130 13:08:29.234493 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.234505 kubelet[2795]: W0130 13:08:29.234503 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.234561 kubelet[2795]: E0130 13:08:29.234516 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.234664 kubelet[2795]: E0130 13:08:29.234653 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.234664 kubelet[2795]: W0130 13:08:29.234663 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.234738 kubelet[2795]: E0130 13:08:29.234728 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.234847 kubelet[2795]: E0130 13:08:29.234828 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.234847 kubelet[2795]: W0130 13:08:29.234844 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.234965 kubelet[2795]: E0130 13:08:29.234893 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.235019 kubelet[2795]: E0130 13:08:29.235005 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.235019 kubelet[2795]: W0130 13:08:29.235015 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.235080 kubelet[2795]: E0130 13:08:29.235024 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.235179 kubelet[2795]: E0130 13:08:29.235157 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.235210 kubelet[2795]: W0130 13:08:29.235182 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.235210 kubelet[2795]: E0130 13:08:29.235191 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.235628 kubelet[2795]: E0130 13:08:29.235406 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.235628 kubelet[2795]: W0130 13:08:29.235415 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.235628 kubelet[2795]: E0130 13:08:29.235427 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.235628 kubelet[2795]: E0130 13:08:29.235553 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.235628 kubelet[2795]: W0130 13:08:29.235574 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.235628 kubelet[2795]: E0130 13:08:29.235582 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.235807 kubelet[2795]: E0130 13:08:29.235706 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.235807 kubelet[2795]: W0130 13:08:29.235712 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.235807 kubelet[2795]: E0130 13:08:29.235718 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.235879 kubelet[2795]: E0130 13:08:29.235865 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.235879 kubelet[2795]: W0130 13:08:29.235871 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.235879 kubelet[2795]: E0130 13:08:29.235877 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.236303 kubelet[2795]: E0130 13:08:29.236165 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.236303 kubelet[2795]: W0130 13:08:29.236177 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.236303 kubelet[2795]: E0130 13:08:29.236194 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.236449 kubelet[2795]: E0130 13:08:29.236414 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.236449 kubelet[2795]: W0130 13:08:29.236423 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.236449 kubelet[2795]: E0130 13:08:29.236435 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 13:08:29.236590 kubelet[2795]: E0130 13:08:29.236574 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 13:08:29.236590 kubelet[2795]: W0130 13:08:29.236586 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 13:08:29.236769 kubelet[2795]: E0130 13:08:29.236594 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 13:08:29.246245 kubelet[2795]: I0130 13:08:29.246208 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6f7f9fdd8b-k6jkj" podStartSLOduration=1.904026936 podStartE2EDuration="4.246197342s" podCreationTimestamp="2025-01-30 13:08:25 +0000 UTC" firstStartedPulling="2025-01-30 13:08:25.960143685 +0000 UTC m=+10.916812822" lastFinishedPulling="2025-01-30 13:08:28.302314086 +0000 UTC m=+13.258983228" observedRunningTime="2025-01-30 13:08:29.244841334 +0000 UTC m=+14.201510488" watchObservedRunningTime="2025-01-30 13:08:29.246197342 +0000 UTC m=+14.202866543" Jan 30 13:08:29.973858 containerd[1543]: time="2025-01-30T13:08:29.973832788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:29.974973 containerd[1543]: time="2025-01-30T13:08:29.974863753Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Jan 30 13:08:29.975311 containerd[1543]: time="2025-01-30T13:08:29.975299557Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:29.977718 containerd[1543]: time="2025-01-30T13:08:29.977092966Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:29.977718 containerd[1543]: time="2025-01-30T13:08:29.977619244Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.674315792s" Jan 30 13:08:29.977718 containerd[1543]: time="2025-01-30T13:08:29.977635964Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Jan 30 13:08:29.980110 containerd[1543]: time="2025-01-30T13:08:29.980078171Z" level=info msg="CreateContainer within sandbox \"b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 30 13:08:30.012595 containerd[1543]: time="2025-01-30T13:08:30.012560815Z" level=info msg="CreateContainer within sandbox \"b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532\"" Jan 30 13:08:30.013809 containerd[1543]: time="2025-01-30T13:08:30.013644545Z" level=info msg="StartContainer for \"dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532\"" Jan 30 13:08:30.043919 systemd[1]: Started 
cri-containerd-dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532.scope - libcontainer container dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532. Jan 30 13:08:30.071581 systemd[1]: cri-containerd-dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532.scope: Deactivated successfully. Jan 30 13:08:30.170596 containerd[1543]: time="2025-01-30T13:08:30.170562785Z" level=info msg="StartContainer for \"dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532\" returns successfully" Jan 30 13:08:30.188215 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532-rootfs.mount: Deactivated successfully. Jan 30 13:08:30.219521 kubelet[2795]: I0130 13:08:30.219431 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 13:08:31.154263 kubelet[2795]: E0130 13:08:31.153707 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:31.397219 containerd[1543]: time="2025-01-30T13:08:31.389484493Z" level=info msg="shim disconnected" id=dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532 namespace=k8s.io Jan 30 13:08:31.397664 containerd[1543]: time="2025-01-30T13:08:31.397551111Z" level=warning msg="cleaning up after shim disconnected" id=dbace1c9a8e27de7a149696a727bf35ee300eb86880aaba25179c01327b01532 namespace=k8s.io Jan 30 13:08:31.397664 containerd[1543]: time="2025-01-30T13:08:31.397566649Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 30 13:08:32.225931 containerd[1543]: time="2025-01-30T13:08:32.225061421Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 30 13:08:33.155266 kubelet[2795]: E0130 
13:08:33.154956 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:35.179968 kubelet[2795]: E0130 13:08:35.179790 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:36.342033 containerd[1543]: time="2025-01-30T13:08:36.341997465Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:36.349030 containerd[1543]: time="2025-01-30T13:08:36.348981616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Jan 30 13:08:36.358493 containerd[1543]: time="2025-01-30T13:08:36.358460081Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:36.369027 containerd[1543]: time="2025-01-30T13:08:36.368999784Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:36.376113 containerd[1543]: time="2025-01-30T13:08:36.369723145Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 4.144621302s" Jan 30 13:08:36.376113 containerd[1543]: time="2025-01-30T13:08:36.369743159Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Jan 30 13:08:36.376113 containerd[1543]: time="2025-01-30T13:08:36.370968187Z" level=info msg="CreateContainer within sandbox \"b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 30 13:08:36.485686 containerd[1543]: time="2025-01-30T13:08:36.485653077Z" level=info msg="CreateContainer within sandbox \"b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c\"" Jan 30 13:08:36.486036 containerd[1543]: time="2025-01-30T13:08:36.486022311Z" level=info msg="StartContainer for \"a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c\"" Jan 30 13:08:36.587783 systemd[1]: Started cri-containerd-a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c.scope - libcontainer container a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c. 
Jan 30 13:08:36.663423 containerd[1543]: time="2025-01-30T13:08:36.663395057Z" level=info msg="StartContainer for \"a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c\" returns successfully" Jan 30 13:08:37.154280 kubelet[2795]: E0130 13:08:37.153621 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:38.034228 systemd[1]: cri-containerd-a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c.scope: Deactivated successfully. Jan 30 13:08:38.056718 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c-rootfs.mount: Deactivated successfully. Jan 30 13:08:38.075745 containerd[1543]: time="2025-01-30T13:08:38.075161609Z" level=info msg="shim disconnected" id=a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c namespace=k8s.io Jan 30 13:08:38.075745 containerd[1543]: time="2025-01-30T13:08:38.075195259Z" level=warning msg="cleaning up after shim disconnected" id=a32ebaa227f9fc87b376c8918f211c73d3f30f900ef7946cf0b21600376dfd2c namespace=k8s.io Jan 30 13:08:38.075745 containerd[1543]: time="2025-01-30T13:08:38.075200511Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 30 13:08:38.113419 kubelet[2795]: I0130 13:08:38.113405 2795 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jan 30 13:08:38.670446 kubelet[2795]: I0130 13:08:38.670238 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmnws\" (UniqueName: \"kubernetes.io/projected/11bf24ac-ae1c-4a6f-b202-4add9f89afb0-kube-api-access-tmnws\") pod \"calico-apiserver-64db96d5fb-lnt9h\" (UID: 
\"11bf24ac-ae1c-4a6f-b202-4add9f89afb0\") " pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:38.670446 kubelet[2795]: I0130 13:08:38.670269 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5zbh\" (UniqueName: \"kubernetes.io/projected/10e36bb1-d0be-4ccd-ba00-61a2715458b9-kube-api-access-f5zbh\") pod \"coredns-6f6b679f8f-rrtms\" (UID: \"10e36bb1-d0be-4ccd-ba00-61a2715458b9\") " pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:38.670446 kubelet[2795]: I0130 13:08:38.670280 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da358280-7ea3-4fe4-afd4-56d955439401-tigera-ca-bundle\") pod \"calico-kube-controllers-697fd69f5c-2x4nc\" (UID: \"da358280-7ea3-4fe4-afd4-56d955439401\") " pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:38.670446 kubelet[2795]: I0130 13:08:38.670291 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10e36bb1-d0be-4ccd-ba00-61a2715458b9-config-volume\") pod \"coredns-6f6b679f8f-rrtms\" (UID: \"10e36bb1-d0be-4ccd-ba00-61a2715458b9\") " pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:38.670446 kubelet[2795]: I0130 13:08:38.670304 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tclnh\" (UniqueName: \"kubernetes.io/projected/93ca16b2-990d-42cd-8ac7-c7b8297af1b4-kube-api-access-tclnh\") pod \"coredns-6f6b679f8f-k44fd\" (UID: \"93ca16b2-990d-42cd-8ac7-c7b8297af1b4\") " pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:38.670842 kubelet[2795]: I0130 13:08:38.670313 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/12536d15-5456-4602-a1a1-2e8242e08904-calico-apiserver-certs\") pod \"calico-apiserver-64db96d5fb-g4bk8\" (UID: \"12536d15-5456-4602-a1a1-2e8242e08904\") " pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:38.670842 kubelet[2795]: I0130 13:08:38.670327 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf2pb\" (UniqueName: \"kubernetes.io/projected/da358280-7ea3-4fe4-afd4-56d955439401-kube-api-access-sf2pb\") pod \"calico-kube-controllers-697fd69f5c-2x4nc\" (UID: \"da358280-7ea3-4fe4-afd4-56d955439401\") " pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:38.670842 kubelet[2795]: I0130 13:08:38.670339 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdm2x\" (UniqueName: \"kubernetes.io/projected/12536d15-5456-4602-a1a1-2e8242e08904-kube-api-access-rdm2x\") pod \"calico-apiserver-64db96d5fb-g4bk8\" (UID: \"12536d15-5456-4602-a1a1-2e8242e08904\") " pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:38.670842 kubelet[2795]: I0130 13:08:38.670348 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93ca16b2-990d-42cd-8ac7-c7b8297af1b4-config-volume\") pod \"coredns-6f6b679f8f-k44fd\" (UID: \"93ca16b2-990d-42cd-8ac7-c7b8297af1b4\") " pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:38.670842 kubelet[2795]: I0130 13:08:38.670359 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/11bf24ac-ae1c-4a6f-b202-4add9f89afb0-calico-apiserver-certs\") pod \"calico-apiserver-64db96d5fb-lnt9h\" (UID: \"11bf24ac-ae1c-4a6f-b202-4add9f89afb0\") " pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:38.696781 containerd[1543]: 
time="2025-01-30T13:08:38.696723378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 30 13:08:38.747347 systemd[1]: Created slice kubepods-besteffort-pod11bf24ac_ae1c_4a6f_b202_4add9f89afb0.slice - libcontainer container kubepods-besteffort-pod11bf24ac_ae1c_4a6f_b202_4add9f89afb0.slice. Jan 30 13:08:38.750422 systemd[1]: Created slice kubepods-besteffort-podda358280_7ea3_4fe4_afd4_56d955439401.slice - libcontainer container kubepods-besteffort-podda358280_7ea3_4fe4_afd4_56d955439401.slice. Jan 30 13:08:38.751374 systemd[1]: Created slice kubepods-burstable-pod10e36bb1_d0be_4ccd_ba00_61a2715458b9.slice - libcontainer container kubepods-burstable-pod10e36bb1_d0be_4ccd_ba00_61a2715458b9.slice. Jan 30 13:08:38.752999 systemd[1]: Created slice kubepods-burstable-pod93ca16b2_990d_42cd_8ac7_c7b8297af1b4.slice - libcontainer container kubepods-burstable-pod93ca16b2_990d_42cd_8ac7_c7b8297af1b4.slice. Jan 30 13:08:38.763349 systemd[1]: Created slice kubepods-besteffort-pod12536d15_5456_4602_a1a1_2e8242e08904.slice - libcontainer container kubepods-besteffort-pod12536d15_5456_4602_a1a1_2e8242e08904.slice. Jan 30 13:08:39.118468 containerd[1543]: time="2025-01-30T13:08:39.118347701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:0,}" Jan 30 13:08:39.160082 systemd[1]: Created slice kubepods-besteffort-pod0d14c0df_65f9_4785_8227_ecaaf26cf401.slice - libcontainer container kubepods-besteffort-pod0d14c0df_65f9_4785_8227_ecaaf26cf401.slice. 
Jan 30 13:08:39.185411 containerd[1543]: time="2025-01-30T13:08:39.185141658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:0,}" Jan 30 13:08:39.197288 containerd[1543]: time="2025-01-30T13:08:39.193625956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:0,}" Jan 30 13:08:39.197288 containerd[1543]: time="2025-01-30T13:08:39.194294719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:0,}" Jan 30 13:08:39.197724 containerd[1543]: time="2025-01-30T13:08:39.197702658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:0,}" Jan 30 13:08:39.198617 containerd[1543]: time="2025-01-30T13:08:39.198568976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:0,}" Jan 30 13:08:39.627665 containerd[1543]: time="2025-01-30T13:08:39.626815250Z" level=error msg="Failed to destroy network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.628805 containerd[1543]: time="2025-01-30T13:08:39.628774061Z" level=error msg="Failed to destroy network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Jan 30 13:08:39.629713 containerd[1543]: time="2025-01-30T13:08:39.629673210Z" level=error msg="encountered an error cleaning up failed sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.629776 containerd[1543]: time="2025-01-30T13:08:39.629751905Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.629988 containerd[1543]: time="2025-01-30T13:08:39.629969362Z" level=error msg="encountered an error cleaning up failed sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.630020 containerd[1543]: time="2025-01-30T13:08:39.630002244Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
Jan 30 13:08:39.642861 kubelet[2795]: E0130 13:08:39.633987 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.645109 kubelet[2795]: E0130 13:08:39.637958 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.648144 containerd[1543]: time="2025-01-30T13:08:39.648110629Z" level=error msg="Failed to destroy network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.648351 containerd[1543]: time="2025-01-30T13:08:39.648334326Z" level=error msg="encountered an error cleaning up failed sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.648394 containerd[1543]: time="2025-01-30T13:08:39.648379440Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed 
to setup network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.651876 kubelet[2795]: E0130 13:08:39.645127 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:39.651974 kubelet[2795]: E0130 13:08:39.651887 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:39.651974 kubelet[2795]: E0130 13:08:39.651942 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-6f6b679f8f-rrtms" podUID="10e36bb1-d0be-4ccd-ba00-61a2715458b9" Jan 30 13:08:39.652172 containerd[1543]: time="2025-01-30T13:08:39.652140537Z" level=error msg="Failed to destroy network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.652900 containerd[1543]: time="2025-01-30T13:08:39.652882517Z" level=error msg="encountered an error cleaning up failed sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.653669 containerd[1543]: time="2025-01-30T13:08:39.652919910Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.653740 kubelet[2795]: E0130 13:08:39.642905 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:39.653740 kubelet[2795]: 
E0130 13:08:39.653568 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:39.653740 kubelet[2795]: E0130 13:08:39.653605 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k44fd" podUID="93ca16b2-990d-42cd-8ac7-c7b8297af1b4" Jan 30 13:08:39.653830 kubelet[2795]: E0130 13:08:39.653695 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.653830 kubelet[2795]: E0130 13:08:39.653720 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:39.653830 kubelet[2795]: E0130 13:08:39.653735 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:39.653886 kubelet[2795]: E0130 13:08:39.653757 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:39.653886 kubelet[2795]: E0130 13:08:39.653784 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.653886 kubelet[2795]: E0130 13:08:39.653798 2795 kuberuntime_sandbox.go:72] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:39.653959 kubelet[2795]: E0130 13:08:39.653810 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:39.653959 kubelet[2795]: E0130 13:08:39.653830 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" podUID="12536d15-5456-4602-a1a1-2e8242e08904" Jan 30 13:08:39.656696 containerd[1543]: time="2025-01-30T13:08:39.656375859Z" level=error msg="Failed to destroy network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.656696 containerd[1543]: time="2025-01-30T13:08:39.656593797Z" level=error msg="encountered an error cleaning up failed sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.656696 containerd[1543]: time="2025-01-30T13:08:39.656630115Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.656830 kubelet[2795]: E0130 13:08:39.656773 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.656830 kubelet[2795]: E0130 13:08:39.656816 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:39.656830 kubelet[2795]: E0130 13:08:39.656828 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:39.657432 kubelet[2795]: E0130 13:08:39.657411 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" podUID="da358280-7ea3-4fe4-afd4-56d955439401" Jan 30 13:08:39.659379 containerd[1543]: time="2025-01-30T13:08:39.659344518Z" level=error msg="Failed to destroy network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.665978 containerd[1543]: time="2025-01-30T13:08:39.665923315Z" level=error msg="encountered an error cleaning up failed 
sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.665978 containerd[1543]: time="2025-01-30T13:08:39.665975067Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.666897 kubelet[2795]: E0130 13:08:39.666870 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:39.666970 kubelet[2795]: E0130 13:08:39.666923 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:39.666970 kubelet[2795]: E0130 13:08:39.666937 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:39.667286 kubelet[2795]: E0130 13:08:39.666966 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" podUID="11bf24ac-ae1c-4a6f-b202-4add9f89afb0" Jan 30 13:08:40.211396 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e-shm.mount: Deactivated successfully. Jan 30 13:08:40.211459 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60-shm.mount: Deactivated successfully. Jan 30 13:08:40.211496 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248-shm.mount: Deactivated successfully. 
Jan 30 13:08:40.604222 kubelet[2795]: I0130 13:08:40.603911 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248" Jan 30 13:08:40.635866 kubelet[2795]: I0130 13:08:40.604778 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e" Jan 30 13:08:40.683425 kubelet[2795]: I0130 13:08:40.683223 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2" Jan 30 13:08:40.687537 kubelet[2795]: I0130 13:08:40.683930 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60" Jan 30 13:08:40.687537 kubelet[2795]: I0130 13:08:40.684423 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842" Jan 30 13:08:40.687537 kubelet[2795]: I0130 13:08:40.684926 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd" Jan 30 13:08:40.906961 containerd[1543]: time="2025-01-30T13:08:40.906897475Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\"" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.916341529Z" level=info msg="Ensure that sandbox 924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2 in task-service has been cleanup successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.906899092Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\"" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.906912342Z" level=info msg="StopPodSandbox for 
\"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\"" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918250539Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\"" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918395827Z" level=info msg="Ensure that sandbox 0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248 in task-service has been cleanup successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918533110Z" level=info msg="TearDown network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918543151Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" returns successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918633993Z" level=info msg="Ensure that sandbox 85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e in task-service has been cleanup successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918774881Z" level=info msg="TearDown network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918785943Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" returns successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918809738Z" level=info msg="TearDown network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.918816467Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" returns successfully" Jan 30 13:08:40.922750 containerd[1543]: 
time="2025-01-30T13:08:40.918927841Z" level=info msg="Ensure that sandbox ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60 in task-service has been cleanup successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.919033542Z" level=info msg="TearDown network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.919042944Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" returns successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.919099073Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\"" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.919185609Z" level=info msg="Ensure that sandbox 538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd in task-service has been cleanup successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.919729617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:1,}" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.919874780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:1,}" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.919989181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:1,}" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.906929630Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\"" Jan 30 13:08:40.922750 containerd[1543]: 
time="2025-01-30T13:08:40.920510329Z" level=info msg="TearDown network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.920519315Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" returns successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.920580314Z" level=info msg="Ensure that sandbox c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842 in task-service has been cleanup successfully" Jan 30 13:08:40.922750 containerd[1543]: time="2025-01-30T13:08:40.920666203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:1,}" Jan 30 13:08:40.918604 systemd[1]: run-netns-cni\x2d93befb4f\x2d0a20\x2d917f\x2d2e0b\x2d41defef9453f.mount: Deactivated successfully. Jan 30 13:08:40.925405 containerd[1543]: time="2025-01-30T13:08:40.922957182Z" level=info msg="TearDown network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" successfully" Jan 30 13:08:40.925405 containerd[1543]: time="2025-01-30T13:08:40.922965287Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" returns successfully" Jan 30 13:08:40.925405 containerd[1543]: time="2025-01-30T13:08:40.923004618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:1,}" Jan 30 13:08:40.925405 containerd[1543]: time="2025-01-30T13:08:40.923362477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:1,}" Jan 30 13:08:40.922024 systemd[1]: run-netns-cni\x2db6b874b6\x2d3bc6\x2db7a5\x2d6d69\x2d5f175907b750.mount: 
Deactivated successfully. Jan 30 13:08:40.922071 systemd[1]: run-netns-cni\x2da554ac10\x2d2e1b\x2da47c\x2d2c33\x2dddc65562d085.mount: Deactivated successfully. Jan 30 13:08:40.922105 systemd[1]: run-netns-cni\x2dc85b7014\x2de8cd\x2dfbbc\x2d554b\x2d56c246da62ee.mount: Deactivated successfully. Jan 30 13:08:40.922138 systemd[1]: run-netns-cni\x2d58204836\x2dd98c\x2d03d6\x2d875c\x2dcb08f71ee75b.mount: Deactivated successfully. Jan 30 13:08:40.922169 systemd[1]: run-netns-cni\x2d6370c735\x2dc3c7\x2dff57\x2d79a0\x2d9489f0f292dd.mount: Deactivated successfully. Jan 30 13:08:41.142230 containerd[1543]: time="2025-01-30T13:08:41.141878957Z" level=error msg="Failed to destroy network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.142661 containerd[1543]: time="2025-01-30T13:08:41.142550359Z" level=error msg="Failed to destroy network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.142983 containerd[1543]: time="2025-01-30T13:08:41.142893653Z" level=error msg="encountered an error cleaning up failed sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.142983 containerd[1543]: time="2025-01-30T13:08:41.142948477Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.147752 containerd[1543]: time="2025-01-30T13:08:41.145592409Z" level=error msg="encountered an error cleaning up failed sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.147752 containerd[1543]: time="2025-01-30T13:08:41.145640843Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.147814 kubelet[2795]: E0130 13:08:41.143208 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.147814 kubelet[2795]: E0130 13:08:41.143258 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:41.147814 kubelet[2795]: E0130 13:08:41.143281 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:41.147924 kubelet[2795]: E0130 13:08:41.143309 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" podUID="11bf24ac-ae1c-4a6f-b202-4add9f89afb0" Jan 30 13:08:41.147924 kubelet[2795]: E0130 13:08:41.147237 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.147924 kubelet[2795]: E0130 13:08:41.147291 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:41.148065 kubelet[2795]: E0130 13:08:41.147307 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:41.148065 kubelet[2795]: E0130 13:08:41.147335 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-rrtms" podUID="10e36bb1-d0be-4ccd-ba00-61a2715458b9" Jan 30 13:08:41.150730 containerd[1543]: 
time="2025-01-30T13:08:41.150379527Z" level=error msg="Failed to destroy network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.150730 containerd[1543]: time="2025-01-30T13:08:41.150623866Z" level=error msg="encountered an error cleaning up failed sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.150730 containerd[1543]: time="2025-01-30T13:08:41.150663496Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.150856 kubelet[2795]: E0130 13:08:41.150817 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.150885 kubelet[2795]: E0130 13:08:41.150854 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:41.150885 kubelet[2795]: E0130 13:08:41.150867 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:41.150935 kubelet[2795]: E0130 13:08:41.150893 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:41.153534 containerd[1543]: time="2025-01-30T13:08:41.153428592Z" level=error msg="Failed to destroy network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.153829 
containerd[1543]: time="2025-01-30T13:08:41.153815709Z" level=error msg="encountered an error cleaning up failed sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.153909 containerd[1543]: time="2025-01-30T13:08:41.153897030Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.154740 kubelet[2795]: E0130 13:08:41.154717 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.154789 kubelet[2795]: E0130 13:08:41.154752 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:41.154789 kubelet[2795]: E0130 13:08:41.154764 2795 
kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:41.154833 kubelet[2795]: E0130 13:08:41.154786 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" podUID="da358280-7ea3-4fe4-afd4-56d955439401" Jan 30 13:08:41.171359 containerd[1543]: time="2025-01-30T13:08:41.171185668Z" level=error msg="Failed to destroy network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.172325 containerd[1543]: time="2025-01-30T13:08:41.171680764Z" level=error msg="Failed to destroy network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.172325 containerd[1543]: time="2025-01-30T13:08:41.172025281Z" level=error msg="encountered an error cleaning up failed sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.172325 containerd[1543]: time="2025-01-30T13:08:41.172057480Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.172325 containerd[1543]: time="2025-01-30T13:08:41.172166802Z" level=error msg="encountered an error cleaning up failed sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.172325 containerd[1543]: time="2025-01-30T13:08:41.172186762Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.172783 kubelet[2795]: E0130 13:08:41.172755 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.172823 kubelet[2795]: E0130 13:08:41.172791 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:41.172823 kubelet[2795]: E0130 13:08:41.172806 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:41.172823 kubelet[2795]: E0130 13:08:41.172832 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\\\": rpc error: code = Unknown desc = failed to setup network for 
sandbox \\\"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" podUID="12536d15-5456-4602-a1a1-2e8242e08904" Jan 30 13:08:41.173709 kubelet[2795]: E0130 13:08:41.173458 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:41.173709 kubelet[2795]: E0130 13:08:41.173481 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:41.173709 kubelet[2795]: E0130 13:08:41.173500 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:41.173786 kubelet[2795]: E0130 13:08:41.173517 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k44fd" podUID="93ca16b2-990d-42cd-8ac7-c7b8297af1b4" Jan 30 13:08:41.213354 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8-shm.mount: Deactivated successfully. Jan 30 13:08:41.831900 kubelet[2795]: I0130 13:08:41.831879 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7" Jan 30 13:08:41.832378 containerd[1543]: time="2025-01-30T13:08:41.832231526Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\"" Jan 30 13:08:41.832378 containerd[1543]: time="2025-01-30T13:08:41.832364110Z" level=info msg="Ensure that sandbox 7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7 in task-service has been cleanup successfully" Jan 30 13:08:41.833981 kubelet[2795]: I0130 13:08:41.833792 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b" Jan 30 13:08:41.834299 containerd[1543]: time="2025-01-30T13:08:41.833893176Z" level=info msg="TearDown network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" successfully" Jan 30 13:08:41.834299 containerd[1543]: time="2025-01-30T13:08:41.833904086Z" level=info msg="StopPodSandbox for 
\"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" returns successfully" Jan 30 13:08:41.834812 containerd[1543]: time="2025-01-30T13:08:41.834762808Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\"" Jan 30 13:08:41.835824 systemd[1]: run-netns-cni\x2d6de35b5f\x2dd13d\x2d20b9\x2d8c23\x2dbb6eade0dc40.mount: Deactivated successfully. Jan 30 13:08:41.837410 containerd[1543]: time="2025-01-30T13:08:41.835945620Z" level=info msg="TearDown network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" successfully" Jan 30 13:08:41.837410 containerd[1543]: time="2025-01-30T13:08:41.834953925Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\"" Jan 30 13:08:41.837410 containerd[1543]: time="2025-01-30T13:08:41.836988800Z" level=info msg="Ensure that sandbox d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b in task-service has been cleanup successfully" Jan 30 13:08:41.837527 containerd[1543]: time="2025-01-30T13:08:41.837516896Z" level=info msg="TearDown network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" successfully" Jan 30 13:08:41.837527 containerd[1543]: time="2025-01-30T13:08:41.837525481Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" returns successfully" Jan 30 13:08:41.838739 containerd[1543]: time="2025-01-30T13:08:41.837008433Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" returns successfully" Jan 30 13:08:41.839133 containerd[1543]: time="2025-01-30T13:08:41.839105326Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\"" Jan 30 13:08:41.839190 containerd[1543]: time="2025-01-30T13:08:41.839152602Z" level=info msg="TearDown network for sandbox 
\"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" successfully" Jan 30 13:08:41.839190 containerd[1543]: time="2025-01-30T13:08:41.839159099Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" returns successfully" Jan 30 13:08:41.839447 systemd[1]: run-netns-cni\x2decd3cb79\x2dd3d0\x2dcd1e\x2d103c\x2d690ae7f89513.mount: Deactivated successfully. Jan 30 13:08:41.839888 kubelet[2795]: I0130 13:08:41.839791 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f" Jan 30 13:08:41.840820 containerd[1543]: time="2025-01-30T13:08:41.840758389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:2,}" Jan 30 13:08:41.857085 kubelet[2795]: I0130 13:08:41.857066 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3" Jan 30 13:08:41.857343 containerd[1543]: time="2025-01-30T13:08:41.857161190Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\"" Jan 30 13:08:41.857756 containerd[1543]: time="2025-01-30T13:08:41.857672847Z" level=info msg="Ensure that sandbox f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f in task-service has been cleanup successfully" Jan 30 13:08:41.857906 containerd[1543]: time="2025-01-30T13:08:41.857853681Z" level=info msg="TearDown network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" successfully" Jan 30 13:08:41.857906 containerd[1543]: time="2025-01-30T13:08:41.857866296Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" returns successfully" Jan 30 13:08:41.860442 systemd[1]: 
run-netns-cni\x2d574a0aed\x2de997\x2d2a52\x2dac3d\x2d9f0081184226.mount: Deactivated successfully. Jan 30 13:08:41.868875 containerd[1543]: time="2025-01-30T13:08:41.868782567Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\"" Jan 30 13:08:41.869031 containerd[1543]: time="2025-01-30T13:08:41.868972874Z" level=info msg="TearDown network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" successfully" Jan 30 13:08:41.869031 containerd[1543]: time="2025-01-30T13:08:41.868982495Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" returns successfully" Jan 30 13:08:41.869241 containerd[1543]: time="2025-01-30T13:08:41.869225502Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\"" Jan 30 13:08:41.869553 containerd[1543]: time="2025-01-30T13:08:41.869407073Z" level=info msg="Ensure that sandbox 17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3 in task-service has been cleanup successfully" Jan 30 13:08:41.871201 systemd[1]: run-netns-cni\x2d2939ae4f\x2df3ca\x2d6178\x2d8fc8\x2d2421d0aea705.mount: Deactivated successfully. 
Jan 30 13:08:41.875550 containerd[1543]: time="2025-01-30T13:08:41.874256984Z" level=info msg="TearDown network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" successfully" Jan 30 13:08:41.875550 containerd[1543]: time="2025-01-30T13:08:41.874283145Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" returns successfully" Jan 30 13:08:41.911842 containerd[1543]: time="2025-01-30T13:08:41.911818791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:2,}" Jan 30 13:08:41.912608 containerd[1543]: time="2025-01-30T13:08:41.912594304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:2,}" Jan 30 13:08:41.940963 containerd[1543]: time="2025-01-30T13:08:41.940943511Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\"" Jan 30 13:08:41.942532 kubelet[2795]: I0130 13:08:41.941995 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd" Jan 30 13:08:41.944378 kubelet[2795]: I0130 13:08:41.944017 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8" Jan 30 13:08:41.948969 containerd[1543]: time="2025-01-30T13:08:41.941125151Z" level=info msg="TearDown network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" successfully" Jan 30 13:08:41.949047 containerd[1543]: time="2025-01-30T13:08:41.949033952Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" returns successfully" Jan 30 13:08:41.949115 containerd[1543]: 
time="2025-01-30T13:08:41.943582865Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\"" Jan 30 13:08:41.949284 containerd[1543]: time="2025-01-30T13:08:41.949274270Z" level=info msg="Ensure that sandbox 01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd in task-service has been cleanup successfully" Jan 30 13:08:41.949439 containerd[1543]: time="2025-01-30T13:08:41.949430725Z" level=info msg="TearDown network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" successfully" Jan 30 13:08:41.949517 containerd[1543]: time="2025-01-30T13:08:41.949508441Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" returns successfully" Jan 30 13:08:41.949568 containerd[1543]: time="2025-01-30T13:08:41.944299236Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\"" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.949761163Z" level=info msg="Ensure that sandbox ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8 in task-service has been cleanup successfully" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.949860296Z" level=info msg="TearDown network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" successfully" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.949867090Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" returns successfully" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.950159307Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\"" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.950193014Z" level=info msg="TearDown network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" successfully" Jan 
30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.950198180Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" returns successfully" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.950225201Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\"" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.950252819Z" level=info msg="TearDown network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" successfully" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.950257495Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" returns successfully" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.950300313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:2,}" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.950902812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:2,}" Jan 30 13:08:41.962500 containerd[1543]: time="2025-01-30T13:08:41.951000016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:2,}" Jan 30 13:08:42.211203 systemd[1]: run-netns-cni\x2dfcb56ff2\x2d2678\x2dd00d\x2d73e3\x2dc4a51778febe.mount: Deactivated successfully. Jan 30 13:08:42.211266 systemd[1]: run-netns-cni\x2d2e75c6ff\x2df2a4\x2d5b6f\x2d59b4\x2de44c128e34d0.mount: Deactivated successfully. 
Jan 30 13:08:42.611634 containerd[1543]: time="2025-01-30T13:08:42.610700176Z" level=error msg="Failed to destroy network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.611634 containerd[1543]: time="2025-01-30T13:08:42.610888208Z" level=error msg="encountered an error cleaning up failed sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.611634 containerd[1543]: time="2025-01-30T13:08:42.610920755Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.611951 kubelet[2795]: E0130 13:08:42.611062 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.611951 kubelet[2795]: E0130 13:08:42.611104 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:42.611951 kubelet[2795]: E0130 13:08:42.611119 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:42.612029 kubelet[2795]: E0130 13:08:42.611149 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:42.649318 containerd[1543]: time="2025-01-30T13:08:42.649184806Z" level=error msg="Failed to destroy network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 
13:08:42.651899 containerd[1543]: time="2025-01-30T13:08:42.649805368Z" level=error msg="encountered an error cleaning up failed sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.651899 containerd[1543]: time="2025-01-30T13:08:42.649840922Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.666018 kubelet[2795]: E0130 13:08:42.649963 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.677692 kubelet[2795]: E0130 13:08:42.650003 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:42.677692 kubelet[2795]: E0130 13:08:42.677485 2795 
kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:42.677692 kubelet[2795]: E0130 13:08:42.677521 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-rrtms" podUID="10e36bb1-d0be-4ccd-ba00-61a2715458b9" Jan 30 13:08:42.681692 containerd[1543]: time="2025-01-30T13:08:42.681610098Z" level=error msg="Failed to destroy network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.681842 containerd[1543]: time="2025-01-30T13:08:42.681821689Z" level=error msg="encountered an error cleaning up failed sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.681884 containerd[1543]: time="2025-01-30T13:08:42.681862431Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.682124 kubelet[2795]: E0130 13:08:42.681984 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.682124 kubelet[2795]: E0130 13:08:42.682018 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:42.682124 kubelet[2795]: E0130 13:08:42.682030 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:42.682260 kubelet[2795]: E0130 13:08:42.682057 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k44fd" podUID="93ca16b2-990d-42cd-8ac7-c7b8297af1b4" Jan 30 13:08:42.714323 containerd[1543]: time="2025-01-30T13:08:42.714185820Z" level=error msg="Failed to destroy network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.714820 containerd[1543]: time="2025-01-30T13:08:42.714806628Z" level=error msg="encountered an error cleaning up failed sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.715233 containerd[1543]: time="2025-01-30T13:08:42.715205755Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox 
\"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.716168 kubelet[2795]: E0130 13:08:42.715499 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.716168 kubelet[2795]: E0130 13:08:42.715546 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:42.716168 kubelet[2795]: E0130 13:08:42.715559 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:42.716260 kubelet[2795]: E0130 13:08:42.715584 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" podUID="da358280-7ea3-4fe4-afd4-56d955439401" Jan 30 13:08:42.740771 containerd[1543]: time="2025-01-30T13:08:42.740692310Z" level=error msg="Failed to destroy network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.742102 containerd[1543]: time="2025-01-30T13:08:42.741079833Z" level=error msg="encountered an error cleaning up failed sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.742102 containerd[1543]: time="2025-01-30T13:08:42.741118046Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.742299 kubelet[2795]: E0130 
13:08:42.741289 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.742299 kubelet[2795]: E0130 13:08:42.741327 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:42.742299 kubelet[2795]: E0130 13:08:42.741339 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:42.756888 kubelet[2795]: E0130 13:08:42.741366 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\\\": plugin type=\\\"calico\\\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" podUID="12536d15-5456-4602-a1a1-2e8242e08904" Jan 30 13:08:42.768782 kubelet[2795]: E0130 13:08:42.758174 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.768782 kubelet[2795]: E0130 13:08:42.758215 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:42.768782 kubelet[2795]: E0130 13:08:42.758229 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:42.768890 containerd[1543]: time="2025-01-30T13:08:42.757071520Z" level=error msg="Failed to destroy network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.768890 containerd[1543]: time="2025-01-30T13:08:42.757853161Z" level=error msg="encountered an error cleaning up failed sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.768890 containerd[1543]: time="2025-01-30T13:08:42.757886318Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:42.768998 kubelet[2795]: E0130 13:08:42.758254 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" podUID="11bf24ac-ae1c-4a6f-b202-4add9f89afb0" Jan 30 13:08:42.946897 kubelet[2795]: I0130 13:08:42.946844 
2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953" Jan 30 13:08:42.947682 containerd[1543]: time="2025-01-30T13:08:42.947615109Z" level=info msg="StopPodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\"" Jan 30 13:08:42.948077 containerd[1543]: time="2025-01-30T13:08:42.947743200Z" level=info msg="Ensure that sandbox 02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953 in task-service has been cleanup successfully" Jan 30 13:08:42.948523 containerd[1543]: time="2025-01-30T13:08:42.948494724Z" level=info msg="TearDown network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" successfully" Jan 30 13:08:42.948523 containerd[1543]: time="2025-01-30T13:08:42.948518368Z" level=info msg="StopPodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" returns successfully" Jan 30 13:08:42.949376 containerd[1543]: time="2025-01-30T13:08:42.948734551Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\"" Jan 30 13:08:42.949376 containerd[1543]: time="2025-01-30T13:08:42.948772903Z" level=info msg="TearDown network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" successfully" Jan 30 13:08:42.949376 containerd[1543]: time="2025-01-30T13:08:42.948778803Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" returns successfully" Jan 30 13:08:42.949376 containerd[1543]: time="2025-01-30T13:08:42.948942180Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\"" Jan 30 13:08:42.949376 containerd[1543]: time="2025-01-30T13:08:42.948987030Z" level=info msg="TearDown network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" successfully" Jan 30 13:08:42.949376 
containerd[1543]: time="2025-01-30T13:08:42.948993863Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" returns successfully" Jan 30 13:08:42.949892 containerd[1543]: time="2025-01-30T13:08:42.949737430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:3,}" Jan 30 13:08:42.951066 kubelet[2795]: I0130 13:08:42.950788 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef" Jan 30 13:08:42.951149 containerd[1543]: time="2025-01-30T13:08:42.951139133Z" level=info msg="StopPodSandbox for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\"" Jan 30 13:08:42.951277 containerd[1543]: time="2025-01-30T13:08:42.951267729Z" level=info msg="Ensure that sandbox de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef in task-service has been cleanup successfully" Jan 30 13:08:42.951422 containerd[1543]: time="2025-01-30T13:08:42.951413313Z" level=info msg="TearDown network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" successfully" Jan 30 13:08:42.951482 containerd[1543]: time="2025-01-30T13:08:42.951474431Z" level=info msg="StopPodSandbox for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" returns successfully" Jan 30 13:08:42.951721 kubelet[2795]: I0130 13:08:42.951599 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7" Jan 30 13:08:42.952138 containerd[1543]: time="2025-01-30T13:08:42.952020698Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\"" Jan 30 13:08:42.952138 containerd[1543]: time="2025-01-30T13:08:42.952069368Z" level=info msg="TearDown network for sandbox 
\"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" successfully" Jan 30 13:08:42.952138 containerd[1543]: time="2025-01-30T13:08:42.952076817Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" returns successfully" Jan 30 13:08:42.954752 containerd[1543]: time="2025-01-30T13:08:42.954693708Z" level=info msg="StopPodSandbox for \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\"" Jan 30 13:08:42.954803 containerd[1543]: time="2025-01-30T13:08:42.954779105Z" level=info msg="Ensure that sandbox e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7 in task-service has been cleanup successfully" Jan 30 13:08:42.954990 containerd[1543]: time="2025-01-30T13:08:42.954979993Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\"" Jan 30 13:08:42.955076 containerd[1543]: time="2025-01-30T13:08:42.955067010Z" level=info msg="TearDown network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" successfully" Jan 30 13:08:42.955267 kubelet[2795]: I0130 13:08:42.955174 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a" Jan 30 13:08:42.955298 containerd[1543]: time="2025-01-30T13:08:42.955208252Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" returns successfully" Jan 30 13:08:42.955298 containerd[1543]: time="2025-01-30T13:08:42.955093385Z" level=info msg="TearDown network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" successfully" Jan 30 13:08:42.955298 containerd[1543]: time="2025-01-30T13:08:42.955244463Z" level=info msg="StopPodSandbox for \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" returns successfully" Jan 30 13:08:42.955952 containerd[1543]: 
time="2025-01-30T13:08:42.955804068Z" level=info msg="StopPodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\"" Jan 30 13:08:42.955952 containerd[1543]: time="2025-01-30T13:08:42.955889967Z" level=info msg="Ensure that sandbox d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a in task-service has been cleanup successfully" Jan 30 13:08:42.956109 containerd[1543]: time="2025-01-30T13:08:42.956065736Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\"" Jan 30 13:08:42.956137 containerd[1543]: time="2025-01-30T13:08:42.956117508Z" level=info msg="TearDown network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" successfully" Jan 30 13:08:42.956156 containerd[1543]: time="2025-01-30T13:08:42.956135862Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" returns successfully" Jan 30 13:08:42.956266 containerd[1543]: time="2025-01-30T13:08:42.956216508Z" level=info msg="TearDown network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" successfully" Jan 30 13:08:42.956266 containerd[1543]: time="2025-01-30T13:08:42.956226535Z" level=info msg="StopPodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" returns successfully" Jan 30 13:08:42.957002 containerd[1543]: time="2025-01-30T13:08:42.956983766Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\"" Jan 30 13:08:42.957036 containerd[1543]: time="2025-01-30T13:08:42.957019958Z" level=info msg="TearDown network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" successfully" Jan 30 13:08:42.957036 containerd[1543]: time="2025-01-30T13:08:42.957025845Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" returns successfully" Jan 30 13:08:42.959984 
containerd[1543]: time="2025-01-30T13:08:42.959533376Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\"" Jan 30 13:08:42.959984 containerd[1543]: time="2025-01-30T13:08:42.959725824Z" level=info msg="TearDown network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" successfully" Jan 30 13:08:42.959984 containerd[1543]: time="2025-01-30T13:08:42.959827641Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" returns successfully" Jan 30 13:08:42.961853 containerd[1543]: time="2025-01-30T13:08:42.961442183Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\"" Jan 30 13:08:42.961853 containerd[1543]: time="2025-01-30T13:08:42.961556701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:3,}" Jan 30 13:08:42.962035 containerd[1543]: time="2025-01-30T13:08:42.961580879Z" level=info msg="TearDown network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" successfully" Jan 30 13:08:42.962035 containerd[1543]: time="2025-01-30T13:08:42.962009424Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" returns successfully" Jan 30 13:08:42.963727 containerd[1543]: time="2025-01-30T13:08:42.963439114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:3,}" Jan 30 13:08:42.991231 kubelet[2795]: I0130 13:08:42.991164 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.992587494Z" level=info 
msg="StopPodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\"" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.992852168Z" level=info msg="Ensure that sandbox 089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837 in task-service has been cleanup successfully" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.992981164Z" level=info msg="TearDown network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" successfully" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.992990458Z" level=info msg="StopPodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" returns successfully" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.993368791Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\"" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.993406287Z" level=info msg="TearDown network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" successfully" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.993411806Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" returns successfully" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.993643522Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\"" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.993684689Z" level=info msg="TearDown network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" successfully" Jan 30 13:08:43.004482 containerd[1543]: time="2025-01-30T13:08:42.993692338Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" returns successfully" Jan 30 13:08:43.027619 containerd[1543]: 
time="2025-01-30T13:08:43.027519892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:3,}" Jan 30 13:08:43.027896 containerd[1543]: time="2025-01-30T13:08:43.027877759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:3,}" Jan 30 13:08:43.034052 kubelet[2795]: I0130 13:08:43.032704 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c" Jan 30 13:08:43.034123 containerd[1543]: time="2025-01-30T13:08:43.033859994Z" level=info msg="StopPodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\"" Jan 30 13:08:43.034123 containerd[1543]: time="2025-01-30T13:08:43.033966772Z" level=info msg="Ensure that sandbox 698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c in task-service has been cleanup successfully" Jan 30 13:08:43.034218 containerd[1543]: time="2025-01-30T13:08:43.034208460Z" level=info msg="TearDown network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" successfully" Jan 30 13:08:43.034253 containerd[1543]: time="2025-01-30T13:08:43.034246634Z" level=info msg="StopPodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" returns successfully" Jan 30 13:08:43.034645 containerd[1543]: time="2025-01-30T13:08:43.034635810Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\"" Jan 30 13:08:43.035176 containerd[1543]: time="2025-01-30T13:08:43.035165862Z" level=info msg="TearDown network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" successfully" Jan 30 13:08:43.035234 containerd[1543]: time="2025-01-30T13:08:43.035224216Z" level=info msg="StopPodSandbox for 
\"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" returns successfully" Jan 30 13:08:43.035841 containerd[1543]: time="2025-01-30T13:08:43.035826366Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\"" Jan 30 13:08:43.036036 containerd[1543]: time="2025-01-30T13:08:43.035986084Z" level=info msg="TearDown network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" successfully" Jan 30 13:08:43.036255 containerd[1543]: time="2025-01-30T13:08:43.036239908Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" returns successfully" Jan 30 13:08:43.036946 containerd[1543]: time="2025-01-30T13:08:43.036836065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:3,}" Jan 30 13:08:43.211616 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953-shm.mount: Deactivated successfully. Jan 30 13:08:43.211693 systemd[1]: run-netns-cni\x2d742dec6e\x2de0f8\x2d722e\x2de49c\x2d36ea8eece1f3.mount: Deactivated successfully. Jan 30 13:08:43.211733 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837-shm.mount: Deactivated successfully. Jan 30 13:08:43.211775 systemd[1]: run-netns-cni\x2d6727e4ff\x2d1f57\x2d491b\x2dd779\x2dc3eda43fcd52.mount: Deactivated successfully. Jan 30 13:08:43.211814 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c-shm.mount: Deactivated successfully. 
Jan 30 13:08:43.500659 containerd[1543]: time="2025-01-30T13:08:43.499763163Z" level=error msg="Failed to destroy network for sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.500659 containerd[1543]: time="2025-01-30T13:08:43.500018822Z" level=error msg="encountered an error cleaning up failed sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.500659 containerd[1543]: time="2025-01-30T13:08:43.500062430Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.508614 containerd[1543]: time="2025-01-30T13:08:43.508583798Z" level=error msg="Failed to destroy network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.508934 kubelet[2795]: E0130 13:08:43.508907 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.508987 kubelet[2795]: E0130 13:08:43.508950 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:43.508987 kubelet[2795]: E0130 13:08:43.508966 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:43.509027 kubelet[2795]: E0130 13:08:43.508997 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k44fd" 
podUID="93ca16b2-990d-42cd-8ac7-c7b8297af1b4" Jan 30 13:08:43.510422 containerd[1543]: time="2025-01-30T13:08:43.510397469Z" level=error msg="encountered an error cleaning up failed sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.510567 containerd[1543]: time="2025-01-30T13:08:43.510533273Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.511019 kubelet[2795]: E0130 13:08:43.510715 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.511019 kubelet[2795]: E0130 13:08:43.510750 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:43.511019 kubelet[2795]: E0130 13:08:43.510762 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" Jan 30 13:08:43.511153 kubelet[2795]: E0130 13:08:43.510866 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" podUID="da358280-7ea3-4fe4-afd4-56d955439401" Jan 30 13:08:43.512294 containerd[1543]: time="2025-01-30T13:08:43.512270185Z" level=error msg="Failed to destroy network for sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.512527 containerd[1543]: time="2025-01-30T13:08:43.512468297Z" level=error msg="encountered an error cleaning up failed sandbox 
\"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.512527 containerd[1543]: time="2025-01-30T13:08:43.512504976Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.512976 kubelet[2795]: E0130 13:08:43.512640 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.512976 kubelet[2795]: E0130 13:08:43.512701 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:43.512976 kubelet[2795]: E0130 13:08:43.512718 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:43.513044 kubelet[2795]: E0130 13:08:43.512745 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" podUID="11bf24ac-ae1c-4a6f-b202-4add9f89afb0" Jan 30 13:08:43.614230 containerd[1543]: time="2025-01-30T13:08:43.614068090Z" level=error msg="Failed to destroy network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.614635 containerd[1543]: time="2025-01-30T13:08:43.614526151Z" level=error msg="encountered an error cleaning up failed sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 
13:08:43.614635 containerd[1543]: time="2025-01-30T13:08:43.614568318Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.615403 kubelet[2795]: E0130 13:08:43.614757 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.615403 kubelet[2795]: E0130 13:08:43.614796 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:43.615403 kubelet[2795]: E0130 13:08:43.614811 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:43.615487 kubelet[2795]: 
E0130 13:08:43.614845 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-rrtms" podUID="10e36bb1-d0be-4ccd-ba00-61a2715458b9" Jan 30 13:08:43.627749 containerd[1543]: time="2025-01-30T13:08:43.627721621Z" level=error msg="Failed to destroy network for sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.631328 containerd[1543]: time="2025-01-30T13:08:43.631092817Z" level=error msg="Failed to destroy network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.631422 containerd[1543]: time="2025-01-30T13:08:43.631393644Z" level=error msg="encountered an error cleaning up failed sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 
13:08:43.631471 containerd[1543]: time="2025-01-30T13:08:43.631443256Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.632801 kubelet[2795]: E0130 13:08:43.631781 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.632801 kubelet[2795]: E0130 13:08:43.631828 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:43.632801 kubelet[2795]: E0130 13:08:43.631842 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:43.632943 kubelet[2795]: 
E0130 13:08:43.631872 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:43.634261 containerd[1543]: time="2025-01-30T13:08:43.634214354Z" level=error msg="encountered an error cleaning up failed sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.634551 containerd[1543]: time="2025-01-30T13:08:43.634523072Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.635031 kubelet[2795]: E0130 13:08:43.634832 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:43.635031 kubelet[2795]: E0130 13:08:43.634871 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:43.635031 kubelet[2795]: E0130 13:08:43.634885 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" Jan 30 13:08:43.635191 kubelet[2795]: E0130 13:08:43.634914 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" podUID="12536d15-5456-4602-a1a1-2e8242e08904" Jan 30 13:08:43.715038 containerd[1543]: time="2025-01-30T13:08:43.714849025Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:43.720172 containerd[1543]: time="2025-01-30T13:08:43.720114555Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 30 13:08:43.741201 containerd[1543]: time="2025-01-30T13:08:43.741161486Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:43.741734 containerd[1543]: time="2025-01-30T13:08:43.741703678Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:43.742769 containerd[1543]: time="2025-01-30T13:08:43.742739546Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 5.045323373s" Jan 30 13:08:43.742769 containerd[1543]: time="2025-01-30T13:08:43.742770900Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 30 13:08:43.777588 containerd[1543]: time="2025-01-30T13:08:43.776027628Z" level=info msg="CreateContainer within sandbox \"b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 
30 13:08:43.841807 containerd[1543]: time="2025-01-30T13:08:43.841732928Z" level=info msg="CreateContainer within sandbox \"b1194eb381517e8615fa0630fff5fdfc63f37461e07214f0e46ac21261b045ac\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"244638f9b5ba98e57f1a0b47f398d3df68a807fed97601812bb545fb9d978959\"" Jan 30 13:08:43.856105 containerd[1543]: time="2025-01-30T13:08:43.856071579Z" level=info msg="StartContainer for \"244638f9b5ba98e57f1a0b47f398d3df68a807fed97601812bb545fb9d978959\"" Jan 30 13:08:44.040750 kubelet[2795]: I0130 13:08:44.040546 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34" Jan 30 13:08:44.042852 containerd[1543]: time="2025-01-30T13:08:44.041965261Z" level=info msg="StopPodSandbox for \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\"" Jan 30 13:08:44.042852 containerd[1543]: time="2025-01-30T13:08:44.042737092Z" level=info msg="Ensure that sandbox 5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34 in task-service has been cleanup successfully" Jan 30 13:08:44.043733 containerd[1543]: time="2025-01-30T13:08:44.043211507Z" level=info msg="TearDown network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\" successfully" Jan 30 13:08:44.043733 containerd[1543]: time="2025-01-30T13:08:44.043224464Z" level=info msg="StopPodSandbox for \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\" returns successfully" Jan 30 13:08:44.045275 containerd[1543]: time="2025-01-30T13:08:44.044669277Z" level=info msg="StopPodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\"" Jan 30 13:08:44.045275 containerd[1543]: time="2025-01-30T13:08:44.044985961Z" level=info msg="TearDown network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" successfully" Jan 30 13:08:44.047004 kubelet[2795]: I0130 
13:08:44.045118 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc" Jan 30 13:08:44.047067 containerd[1543]: time="2025-01-30T13:08:44.044994214Z" level=info msg="StopPodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" returns successfully" Jan 30 13:08:44.047822 containerd[1543]: time="2025-01-30T13:08:44.047234877Z" level=info msg="StopPodSandbox for \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\"" Jan 30 13:08:44.047822 containerd[1543]: time="2025-01-30T13:08:44.047234688Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\"" Jan 30 13:08:44.047822 containerd[1543]: time="2025-01-30T13:08:44.047400885Z" level=info msg="Ensure that sandbox 00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc in task-service has been cleanup successfully" Jan 30 13:08:44.048776 containerd[1543]: time="2025-01-30T13:08:44.047405460Z" level=info msg="TearDown network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" successfully" Jan 30 13:08:44.048776 containerd[1543]: time="2025-01-30T13:08:44.048181895Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" returns successfully" Jan 30 13:08:44.049515 containerd[1543]: time="2025-01-30T13:08:44.049325523Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\"" Jan 30 13:08:44.050121 containerd[1543]: time="2025-01-30T13:08:44.049494360Z" level=info msg="TearDown network for sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\" successfully" Jan 30 13:08:44.050121 containerd[1543]: time="2025-01-30T13:08:44.049934587Z" level=info msg="StopPodSandbox for \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\" returns successfully" Jan 30 
13:08:44.050121 containerd[1543]: time="2025-01-30T13:08:44.049611453Z" level=info msg="TearDown network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" successfully" Jan 30 13:08:44.050843 containerd[1543]: time="2025-01-30T13:08:44.049981763Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" returns successfully" Jan 30 13:08:44.051302 containerd[1543]: time="2025-01-30T13:08:44.051283140Z" level=info msg="StopPodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\"" Jan 30 13:08:44.051350 containerd[1543]: time="2025-01-30T13:08:44.051344288Z" level=info msg="TearDown network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" successfully" Jan 30 13:08:44.051373 containerd[1543]: time="2025-01-30T13:08:44.051350741Z" level=info msg="StopPodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" returns successfully" Jan 30 13:08:44.052203 containerd[1543]: time="2025-01-30T13:08:44.052132925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:4,}" Jan 30 13:08:44.052349 containerd[1543]: time="2025-01-30T13:08:44.052287996Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\"" Jan 30 13:08:44.052487 containerd[1543]: time="2025-01-30T13:08:44.052431062Z" level=info msg="TearDown network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" successfully" Jan 30 13:08:44.052487 containerd[1543]: time="2025-01-30T13:08:44.052443958Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" returns successfully" Jan 30 13:08:44.053898 kubelet[2795]: I0130 13:08:44.053315 2795 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114" Jan 30 13:08:44.055010 containerd[1543]: time="2025-01-30T13:08:44.054590793Z" level=info msg="StopPodSandbox for \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\"" Jan 30 13:08:44.055010 containerd[1543]: time="2025-01-30T13:08:44.054788439Z" level=info msg="Ensure that sandbox 87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114 in task-service has been cleanup successfully" Jan 30 13:08:44.055099 containerd[1543]: time="2025-01-30T13:08:44.055077662Z" level=info msg="TearDown network for sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\" successfully" Jan 30 13:08:44.055099 containerd[1543]: time="2025-01-30T13:08:44.055092251Z" level=info msg="StopPodSandbox for \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\" returns successfully" Jan 30 13:08:44.056774 containerd[1543]: time="2025-01-30T13:08:44.055244676Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\"" Jan 30 13:08:44.056774 containerd[1543]: time="2025-01-30T13:08:44.055290532Z" level=info msg="TearDown network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" successfully" Jan 30 13:08:44.056774 containerd[1543]: time="2025-01-30T13:08:44.055296239Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" returns successfully" Jan 30 13:08:44.058805 containerd[1543]: time="2025-01-30T13:08:44.057671907Z" level=info msg="StopPodSandbox for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\"" Jan 30 13:08:44.058805 containerd[1543]: time="2025-01-30T13:08:44.058615405Z" level=info msg="TearDown network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" successfully" Jan 30 13:08:44.058805 containerd[1543]: time="2025-01-30T13:08:44.058624506Z" level=info msg="StopPodSandbox 
for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" returns successfully" Jan 30 13:08:44.058805 containerd[1543]: time="2025-01-30T13:08:44.058631282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:4,}" Jan 30 13:08:44.059399 containerd[1543]: time="2025-01-30T13:08:44.059357552Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\"" Jan 30 13:08:44.060386 containerd[1543]: time="2025-01-30T13:08:44.059407791Z" level=info msg="TearDown network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" successfully" Jan 30 13:08:44.060386 containerd[1543]: time="2025-01-30T13:08:44.059415187Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" returns successfully" Jan 30 13:08:44.060517 containerd[1543]: time="2025-01-30T13:08:44.060244612Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\"" Jan 30 13:08:44.060633 containerd[1543]: time="2025-01-30T13:08:44.060508586Z" level=info msg="TearDown network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" successfully" Jan 30 13:08:44.060633 containerd[1543]: time="2025-01-30T13:08:44.060580070Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" returns successfully" Jan 30 13:08:44.061552 containerd[1543]: time="2025-01-30T13:08:44.061492850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:4,}" Jan 30 13:08:44.064256 kubelet[2795]: I0130 13:08:44.064049 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720" Jan 
30 13:08:44.066136 kubelet[2795]: I0130 13:08:44.065172 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a" Jan 30 13:08:44.066188 containerd[1543]: time="2025-01-30T13:08:44.064576646Z" level=info msg="StopPodSandbox for \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\"" Jan 30 13:08:44.066188 containerd[1543]: time="2025-01-30T13:08:44.065716610Z" level=info msg="StopPodSandbox for \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\"" Jan 30 13:08:44.067422 containerd[1543]: time="2025-01-30T13:08:44.067262163Z" level=info msg="Ensure that sandbox 48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720 in task-service has been cleanup successfully" Jan 30 13:08:44.067422 containerd[1543]: time="2025-01-30T13:08:44.067316542Z" level=info msg="Ensure that sandbox f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a in task-service has been cleanup successfully" Jan 30 13:08:44.067573 containerd[1543]: time="2025-01-30T13:08:44.067557416Z" level=info msg="TearDown network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\" successfully" Jan 30 13:08:44.067618 containerd[1543]: time="2025-01-30T13:08:44.067610789Z" level=info msg="StopPodSandbox for \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\" returns successfully" Jan 30 13:08:44.070622 containerd[1543]: time="2025-01-30T13:08:44.070213099Z" level=info msg="TearDown network for sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\" successfully" Jan 30 13:08:44.072293 containerd[1543]: time="2025-01-30T13:08:44.071233173Z" level=info msg="StopPodSandbox for \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\" returns successfully" Jan 30 13:08:44.082112 containerd[1543]: time="2025-01-30T13:08:44.082069472Z" level=info msg="StopPodSandbox for 
\"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\"" Jan 30 13:08:44.082633 containerd[1543]: time="2025-01-30T13:08:44.082607403Z" level=info msg="TearDown network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" successfully" Jan 30 13:08:44.082766 containerd[1543]: time="2025-01-30T13:08:44.082755772Z" level=info msg="StopPodSandbox for \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" returns successfully" Jan 30 13:08:44.082935 containerd[1543]: time="2025-01-30T13:08:44.082925057Z" level=info msg="StopPodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\"" Jan 30 13:08:44.083140 containerd[1543]: time="2025-01-30T13:08:44.083130463Z" level=info msg="TearDown network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" successfully" Jan 30 13:08:44.083183 containerd[1543]: time="2025-01-30T13:08:44.083176124Z" level=info msg="StopPodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" returns successfully" Jan 30 13:08:44.084582 containerd[1543]: time="2025-01-30T13:08:44.083866884Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\"" Jan 30 13:08:44.084582 containerd[1543]: time="2025-01-30T13:08:44.084072461Z" level=info msg="TearDown network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" successfully" Jan 30 13:08:44.084582 containerd[1543]: time="2025-01-30T13:08:44.084082031Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" returns successfully" Jan 30 13:08:44.084582 containerd[1543]: time="2025-01-30T13:08:44.084366822Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\"" Jan 30 13:08:44.084582 containerd[1543]: time="2025-01-30T13:08:44.084483776Z" level=info msg="TearDown network for sandbox 
\"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" successfully" Jan 30 13:08:44.084582 containerd[1543]: time="2025-01-30T13:08:44.084495260Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" returns successfully" Jan 30 13:08:44.086429 containerd[1543]: time="2025-01-30T13:08:44.086385390Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\"" Jan 30 13:08:44.087541 containerd[1543]: time="2025-01-30T13:08:44.086825587Z" level=info msg="TearDown network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" successfully" Jan 30 13:08:44.087541 containerd[1543]: time="2025-01-30T13:08:44.086835401Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" returns successfully" Jan 30 13:08:44.087541 containerd[1543]: time="2025-01-30T13:08:44.086614600Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\"" Jan 30 13:08:44.087541 containerd[1543]: time="2025-01-30T13:08:44.086950374Z" level=info msg="TearDown network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" successfully" Jan 30 13:08:44.087541 containerd[1543]: time="2025-01-30T13:08:44.086960616Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" returns successfully" Jan 30 13:08:44.091669 kubelet[2795]: I0130 13:08:44.091599 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb" Jan 30 13:08:44.092822 containerd[1543]: time="2025-01-30T13:08:44.092438236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:4,}" Jan 30 13:08:44.097782 containerd[1543]: 
time="2025-01-30T13:08:44.097756472Z" level=info msg="StopPodSandbox for \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\"" Jan 30 13:08:44.099087 containerd[1543]: time="2025-01-30T13:08:44.099065113Z" level=info msg="Ensure that sandbox 3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb in task-service has been cleanup successfully" Jan 30 13:08:44.099328 containerd[1543]: time="2025-01-30T13:08:44.098429518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:4,}" Jan 30 13:08:44.100821 containerd[1543]: time="2025-01-30T13:08:44.100803374Z" level=info msg="TearDown network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\" successfully" Jan 30 13:08:44.100923 containerd[1543]: time="2025-01-30T13:08:44.100907615Z" level=info msg="StopPodSandbox for \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\" returns successfully" Jan 30 13:08:44.101926 containerd[1543]: time="2025-01-30T13:08:44.101896457Z" level=info msg="StopPodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\"" Jan 30 13:08:44.102190 containerd[1543]: time="2025-01-30T13:08:44.102003203Z" level=info msg="TearDown network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" successfully" Jan 30 13:08:44.102190 containerd[1543]: time="2025-01-30T13:08:44.102185882Z" level=info msg="StopPodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" returns successfully" Jan 30 13:08:44.102594 containerd[1543]: time="2025-01-30T13:08:44.102576677Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\"" Jan 30 13:08:44.102904 containerd[1543]: time="2025-01-30T13:08:44.102891688Z" level=info msg="TearDown network for sandbox 
\"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" successfully" Jan 30 13:08:44.103102 containerd[1543]: time="2025-01-30T13:08:44.102995049Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" returns successfully" Jan 30 13:08:44.104602 containerd[1543]: time="2025-01-30T13:08:44.104579047Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\"" Jan 30 13:08:44.105489 containerd[1543]: time="2025-01-30T13:08:44.105437527Z" level=info msg="TearDown network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" successfully" Jan 30 13:08:44.105568 containerd[1543]: time="2025-01-30T13:08:44.105558306Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" returns successfully" Jan 30 13:08:44.107445 containerd[1543]: time="2025-01-30T13:08:44.107427480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:4,}" Jan 30 13:08:44.184419 systemd[1]: Started cri-containerd-244638f9b5ba98e57f1a0b47f398d3df68a807fed97601812bb545fb9d978959.scope - libcontainer container 244638f9b5ba98e57f1a0b47f398d3df68a807fed97601812bb545fb9d978959. Jan 30 13:08:44.218053 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a-shm.mount: Deactivated successfully. Jan 30 13:08:44.218138 systemd[1]: run-netns-cni\x2d5d52e84f\x2dc2a6\x2d2d9e\x2d2edb\x2d42c66067f94b.mount: Deactivated successfully. Jan 30 13:08:44.218178 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720-shm.mount: Deactivated successfully. Jan 30 13:08:44.218222 systemd[1]: run-netns-cni\x2d6db70697\x2d617e\x2d7f82\x2d7bc6\x2d281833751626.mount: Deactivated successfully. 
Jan 30 13:08:44.218256 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc-shm.mount: Deactivated successfully. Jan 30 13:08:44.218306 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1113294249.mount: Deactivated successfully. Jan 30 13:08:44.259384 containerd[1543]: time="2025-01-30T13:08:44.259027729Z" level=error msg="Failed to destroy network for sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.259384 containerd[1543]: time="2025-01-30T13:08:44.259275845Z" level=error msg="encountered an error cleaning up failed sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.259384 containerd[1543]: time="2025-01-30T13:08:44.259323239Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.259802 kubelet[2795]: E0130 13:08:44.259661 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.259802 kubelet[2795]: E0130 13:08:44.259717 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:44.259802 kubelet[2795]: E0130 13:08:44.259743 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-k44fd" Jan 30 13:08:44.261512 kubelet[2795]: E0130 13:08:44.259778 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-k44fd_kube-system(93ca16b2-990d-42cd-8ac7-c7b8297af1b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-k44fd" podUID="93ca16b2-990d-42cd-8ac7-c7b8297af1b4" Jan 30 13:08:44.262201 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc-shm.mount: Deactivated successfully. Jan 30 13:08:44.278937 containerd[1543]: time="2025-01-30T13:08:44.278764972Z" level=info msg="StartContainer for \"244638f9b5ba98e57f1a0b47f398d3df68a807fed97601812bb545fb9d978959\" returns successfully" Jan 30 13:08:44.283329 containerd[1543]: time="2025-01-30T13:08:44.283238561Z" level=error msg="Failed to destroy network for sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.283661 containerd[1543]: time="2025-01-30T13:08:44.283576833Z" level=error msg="encountered an error cleaning up failed sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.283661 containerd[1543]: time="2025-01-30T13:08:44.283618352Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.284242 kubelet[2795]: E0130 13:08:44.284033 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.284242 kubelet[2795]: E0130 13:08:44.284167 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:44.284242 kubelet[2795]: E0130 13:08:44.284186 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gj66t" Jan 30 13:08:44.284433 kubelet[2795]: E0130 13:08:44.284221 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gj66t_calico-system(0d14c0df-65f9-4785-8227-ecaaf26cf401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gj66t" podUID="0d14c0df-65f9-4785-8227-ecaaf26cf401" Jan 30 13:08:44.286344 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3-shm.mount: Deactivated successfully. Jan 30 13:08:44.287547 containerd[1543]: time="2025-01-30T13:08:44.287522166Z" level=error msg="Failed to destroy network for sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.290589 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6-shm.mount: Deactivated successfully. Jan 30 13:08:44.291645 containerd[1543]: time="2025-01-30T13:08:44.291584786Z" level=error msg="encountered an error cleaning up failed sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.291645 containerd[1543]: time="2025-01-30T13:08:44.291632248Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.292319 kubelet[2795]: E0130 13:08:44.292205 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.292590 kubelet[2795]: E0130 13:08:44.292568 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:44.292792 kubelet[2795]: E0130 13:08:44.292775 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" Jan 30 13:08:44.293167 kubelet[2795]: E0130 13:08:44.292955 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-lnt9h_calico-apiserver(11bf24ac-ae1c-4a6f-b202-4add9f89afb0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" 
podUID="11bf24ac-ae1c-4a6f-b202-4add9f89afb0" Jan 30 13:08:44.294941 containerd[1543]: time="2025-01-30T13:08:44.294911618Z" level=error msg="Failed to destroy network for sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.295440 containerd[1543]: time="2025-01-30T13:08:44.295200433Z" level=error msg="encountered an error cleaning up failed sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.295440 containerd[1543]: time="2025-01-30T13:08:44.295239723Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.296292 kubelet[2795]: E0130 13:08:44.295603 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 13:08:44.296292 kubelet[2795]: E0130 13:08:44.295641 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:44.296292 kubelet[2795]: E0130 13:08:44.295655 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rrtms" Jan 30 13:08:44.296391 kubelet[2795]: E0130 13:08:44.296262 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rrtms_kube-system(10e36bb1-d0be-4ccd-ba00-61a2715458b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-rrtms" podUID="10e36bb1-d0be-4ccd-ba00-61a2715458b9" Jan 30 13:08:44.298113 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9-shm.mount: Deactivated successfully. 
Jan 30 13:08:44.307753 containerd[1543]: time="2025-01-30T13:08:44.307387109Z" level=error msg="Failed to destroy network for sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 30 13:08:44.307753 containerd[1543]: time="2025-01-30T13:08:44.307618385Z" level=error msg="encountered an error cleaning up failed sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 30 13:08:44.307753 containerd[1543]: time="2025-01-30T13:08:44.307655960Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 30 13:08:44.308051 kubelet[2795]: E0130 13:08:44.308023 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 30 13:08:44.308200 kubelet[2795]: E0130 13:08:44.308126 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8"
Jan 30 13:08:44.308200 kubelet[2795]: E0130 13:08:44.308143 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8"
Jan 30 13:08:44.308317 kubelet[2795]: E0130 13:08:44.308260 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64db96d5fb-g4bk8_calico-apiserver(12536d15-5456-4602-a1a1-2e8242e08904)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" podUID="12536d15-5456-4602-a1a1-2e8242e08904"
Jan 30 13:08:44.313870 containerd[1543]: time="2025-01-30T13:08:44.313814876Z" level=error msg="Failed to destroy network for sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 30 13:08:44.314086 containerd[1543]: time="2025-01-30T13:08:44.314055846Z" level=error msg="encountered an error cleaning up failed sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 30 13:08:44.314644 containerd[1543]: time="2025-01-30T13:08:44.314624821Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 30 13:08:44.314829 kubelet[2795]: E0130 13:08:44.314804 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 30 13:08:44.314899 kubelet[2795]: E0130 13:08:44.314880 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc"
Jan 30 13:08:44.314930 kubelet[2795]: E0130 13:08:44.314902 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc"
Jan 30 13:08:44.314959 kubelet[2795]: E0130 13:08:44.314940 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697fd69f5c-2x4nc_calico-system(da358280-7ea3-4fe4-afd4-56d955439401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" podUID="da358280-7ea3-4fe4-afd4-56d955439401"
Jan 30 13:08:44.550316 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Jan 30 13:08:44.551655 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Jan 30 13:08:45.095171 kubelet[2795]: I0130 13:08:45.094786 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc"
Jan 30 13:08:45.095692 containerd[1543]: time="2025-01-30T13:08:45.095667981Z" level=info msg="StopPodSandbox for \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\""
Jan 30 13:08:45.097925 containerd[1543]: time="2025-01-30T13:08:45.096190415Z" level=info msg="Ensure that sandbox 08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc in task-service has been cleanup successfully"
Jan 30 13:08:45.098195 containerd[1543]: time="2025-01-30T13:08:45.098130466Z" level=info msg="TearDown network for sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\" successfully"
Jan 30 13:08:45.098195 containerd[1543]: time="2025-01-30T13:08:45.098147627Z" level=info msg="StopPodSandbox for \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\" returns successfully"
Jan 30 13:08:45.100766 containerd[1543]: time="2025-01-30T13:08:45.098358344Z" level=info msg="StopPodSandbox for \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\""
Jan 30 13:08:45.100766 containerd[1543]: time="2025-01-30T13:08:45.098405820Z" level=info msg="TearDown network for sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\" successfully"
Jan 30 13:08:45.100766 containerd[1543]: time="2025-01-30T13:08:45.098412048Z" level=info msg="StopPodSandbox for \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\" returns successfully"
Jan 30 13:08:45.100766 containerd[1543]: time="2025-01-30T13:08:45.098796133Z" level=info msg="StopPodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\""
Jan 30 13:08:45.100766 containerd[1543]: time="2025-01-30T13:08:45.099831007Z" level=info msg="TearDown network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" successfully"
Jan 30 13:08:45.100766 containerd[1543]: time="2025-01-30T13:08:45.099850612Z" level=info msg="StopPodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" returns successfully"
Jan 30 13:08:45.100894 kubelet[2795]: I0130 13:08:45.098634 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83"
Jan 30 13:08:45.100923 containerd[1543]: time="2025-01-30T13:08:45.100908556Z" level=info msg="StopPodSandbox for \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\""
Jan 30 13:08:45.102643 containerd[1543]: time="2025-01-30T13:08:45.102616276Z" level=info msg="Ensure that sandbox 0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83 in task-service has been cleanup successfully"
Jan 30 13:08:45.103091 containerd[1543]: time="2025-01-30T13:08:45.102801717Z" level=info msg="TearDown network for sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\" successfully"
Jan 30 13:08:45.103091 containerd[1543]: time="2025-01-30T13:08:45.102812919Z" level=info msg="StopPodSandbox for \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\" returns successfully"
Jan 30 13:08:45.103397 containerd[1543]: time="2025-01-30T13:08:45.103255178Z" level=info msg="StopPodSandbox for \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\""
Jan 30 13:08:45.103397 containerd[1543]: time="2025-01-30T13:08:45.103302673Z" level=info msg="TearDown network for sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\" successfully"
Jan 30 13:08:45.103397 containerd[1543]: time="2025-01-30T13:08:45.103309242Z" level=info msg="StopPodSandbox for \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\" returns successfully"
Jan 30 13:08:45.103397 containerd[1543]: time="2025-01-30T13:08:45.103339795Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\""
Jan 30 13:08:45.103397 containerd[1543]: time="2025-01-30T13:08:45.103373135Z" level=info msg="TearDown network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" successfully"
Jan 30 13:08:45.103397 containerd[1543]: time="2025-01-30T13:08:45.103378193Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" returns successfully"
Jan 30 13:08:45.105212 containerd[1543]: time="2025-01-30T13:08:45.105186122Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\""
Jan 30 13:08:45.105264 containerd[1543]: time="2025-01-30T13:08:45.105246512Z" level=info msg="TearDown network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" successfully"
Jan 30 13:08:45.105264 containerd[1543]: time="2025-01-30T13:08:45.105253422Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" returns successfully"
Jan 30 13:08:45.105344 containerd[1543]: time="2025-01-30T13:08:45.105332058Z" level=info msg="StopPodSandbox for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\""
Jan 30 13:08:45.105377 containerd[1543]: time="2025-01-30T13:08:45.105368371Z" level=info msg="TearDown network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" successfully"
Jan 30 13:08:45.105402 containerd[1543]: time="2025-01-30T13:08:45.105376274Z" level=info msg="StopPodSandbox for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" returns successfully"
Jan 30 13:08:45.106093 containerd[1543]: time="2025-01-30T13:08:45.106074897Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:5,}"
Jan 30 13:08:45.106320 containerd[1543]: time="2025-01-30T13:08:45.106307457Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\""
Jan 30 13:08:45.106846 containerd[1543]: time="2025-01-30T13:08:45.106761896Z" level=info msg="TearDown network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" successfully"
Jan 30 13:08:45.106846 containerd[1543]: time="2025-01-30T13:08:45.106773131Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" returns successfully"
Jan 30 13:08:45.106991 containerd[1543]: time="2025-01-30T13:08:45.106981786Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\""
Jan 30 13:08:45.107065 containerd[1543]: time="2025-01-30T13:08:45.107057024Z" level=info msg="TearDown network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" successfully"
Jan 30 13:08:45.107096 containerd[1543]: time="2025-01-30T13:08:45.107090051Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" returns successfully"
Jan 30 13:08:45.107520 containerd[1543]: time="2025-01-30T13:08:45.107357693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:5,}"
Jan 30 13:08:45.107742 kubelet[2795]: I0130 13:08:45.107726 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6"
Jan 30 13:08:45.108383 containerd[1543]: time="2025-01-30T13:08:45.108367905Z" level=info msg="StopPodSandbox for \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\""
Jan 30 13:08:45.108479 containerd[1543]: time="2025-01-30T13:08:45.108465034Z" level=info msg="Ensure that sandbox c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6 in task-service has been cleanup successfully"
Jan 30 13:08:45.111001 containerd[1543]: time="2025-01-30T13:08:45.109202230Z" level=info msg="TearDown network for sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\" successfully"
Jan 30 13:08:45.111001 containerd[1543]: time="2025-01-30T13:08:45.109214779Z" level=info msg="StopPodSandbox for \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\" returns successfully"
Jan 30 13:08:45.111259 containerd[1543]: time="2025-01-30T13:08:45.111239429Z" level=info msg="StopPodSandbox for \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\""
Jan 30 13:08:45.111303 containerd[1543]: time="2025-01-30T13:08:45.111290968Z" level=info msg="TearDown network for sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\" successfully"
Jan 30 13:08:45.111333 containerd[1543]: time="2025-01-30T13:08:45.111300670Z" level=info msg="StopPodSandbox for \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\" returns successfully"
Jan 30 13:08:45.111536 containerd[1543]: time="2025-01-30T13:08:45.111498013Z" level=info msg="StopPodSandbox for \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\""
Jan 30 13:08:45.111575 containerd[1543]: time="2025-01-30T13:08:45.111541090Z" level=info msg="TearDown network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" successfully"
Jan 30 13:08:45.111575 containerd[1543]: time="2025-01-30T13:08:45.111546866Z" level=info msg="StopPodSandbox for \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" returns successfully"
Jan 30 13:08:45.111905 containerd[1543]: time="2025-01-30T13:08:45.111759960Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\""
Jan 30 13:08:45.111905 containerd[1543]: time="2025-01-30T13:08:45.111794427Z" level=info msg="TearDown network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" successfully"
Jan 30 13:08:45.111905 containerd[1543]: time="2025-01-30T13:08:45.111799912Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" returns successfully"
Jan 30 13:08:45.112205 containerd[1543]: time="2025-01-30T13:08:45.112189544Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\""
Jan 30 13:08:45.112454 containerd[1543]: time="2025-01-30T13:08:45.112235684Z" level=info msg="TearDown network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" successfully"
Jan 30 13:08:45.112454 containerd[1543]: time="2025-01-30T13:08:45.112241337Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" returns successfully"
Jan 30 13:08:45.113112 kubelet[2795]: I0130 13:08:45.112831 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8"
Jan 30 13:08:45.113725 containerd[1543]: time="2025-01-30T13:08:45.113224026Z" level=info msg="StopPodSandbox for \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\""
Jan 30 13:08:45.114834 containerd[1543]: time="2025-01-30T13:08:45.113886898Z" level=info msg="Ensure that sandbox e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8 in task-service has been cleanup successfully"
Jan 30 13:08:45.114834 containerd[1543]: time="2025-01-30T13:08:45.114056307Z" level=info msg="TearDown network for sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\" successfully"
Jan 30 13:08:45.114834 containerd[1543]: time="2025-01-30T13:08:45.114064412Z" level=info msg="StopPodSandbox for \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\" returns successfully"
Jan 30 13:08:45.114834 containerd[1543]: time="2025-01-30T13:08:45.114089583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:5,}"
Jan 30 13:08:45.114834 containerd[1543]: time="2025-01-30T13:08:45.114374581Z" level=info msg="StopPodSandbox for \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\""
Jan 30 13:08:45.114834 containerd[1543]: time="2025-01-30T13:08:45.114734839Z" level=info msg="TearDown network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\" successfully"
Jan 30 13:08:45.114834 containerd[1543]: time="2025-01-30T13:08:45.114742948Z" level=info msg="StopPodSandbox for \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\" returns successfully"
Jan 30 13:08:45.117243 containerd[1543]: time="2025-01-30T13:08:45.116729290Z" level=info msg="StopPodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\""
Jan 30 13:08:45.117243 containerd[1543]: time="2025-01-30T13:08:45.116802024Z" level=info msg="TearDown network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" successfully"
Jan 30 13:08:45.117243 containerd[1543]: time="2025-01-30T13:08:45.116809225Z" level=info msg="StopPodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" returns successfully"
Jan 30 13:08:45.117960 containerd[1543]: time="2025-01-30T13:08:45.117875587Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\""
Jan 30 13:08:45.117960 containerd[1543]: time="2025-01-30T13:08:45.117939481Z" level=info msg="TearDown network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" successfully"
Jan 30 13:08:45.117960 containerd[1543]: time="2025-01-30T13:08:45.117946177Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" returns successfully"
Jan 30 13:08:45.118143 containerd[1543]: time="2025-01-30T13:08:45.118124345Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\""
Jan 30 13:08:45.118262 containerd[1543]: time="2025-01-30T13:08:45.118211148Z" level=info msg="TearDown network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" successfully"
Jan 30 13:08:45.118262 containerd[1543]: time="2025-01-30T13:08:45.118235062Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" returns successfully"
Jan 30 13:08:45.118881 kubelet[2795]: I0130 13:08:45.118527 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9"
Jan 30 13:08:45.118946 containerd[1543]: time="2025-01-30T13:08:45.118534728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:5,}"
Jan 30 13:08:45.119073 containerd[1543]: time="2025-01-30T13:08:45.119060685Z" level=info msg="StopPodSandbox for \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\""
Jan 30 13:08:45.119250 containerd[1543]: time="2025-01-30T13:08:45.119241187Z" level=info msg="Ensure that sandbox f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9 in task-service has been cleanup successfully"
Jan 30 13:08:45.119410 containerd[1543]: time="2025-01-30T13:08:45.119400750Z" level=info msg="TearDown network for sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\" successfully"
Jan 30 13:08:45.119445 containerd[1543]: time="2025-01-30T13:08:45.119438756Z" level=info msg="StopPodSandbox for \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\" returns successfully"
Jan 30 13:08:45.120684 containerd[1543]: time="2025-01-30T13:08:45.120658289Z" level=info msg="StopPodSandbox for \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\""
Jan 30 13:08:45.120846 containerd[1543]: time="2025-01-30T13:08:45.120817043Z" level=info msg="TearDown network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\" successfully"
Jan 30 13:08:45.121220 containerd[1543]: time="2025-01-30T13:08:45.121208736Z" level=info msg="StopPodSandbox for \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\" returns successfully"
Jan 30 13:08:45.121429 containerd[1543]: time="2025-01-30T13:08:45.121420355Z" level=info msg="StopPodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\""
Jan 30 13:08:45.121505 containerd[1543]: time="2025-01-30T13:08:45.121496795Z" level=info msg="TearDown network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" successfully"
Jan 30 13:08:45.121540 containerd[1543]: time="2025-01-30T13:08:45.121534162Z" level=info msg="StopPodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" returns successfully"
Jan 30 13:08:45.121887 containerd[1543]: time="2025-01-30T13:08:45.121867842Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\""
Jan 30 13:08:45.122105 containerd[1543]: time="2025-01-30T13:08:45.121938045Z" level=info msg="TearDown network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" successfully"
Jan 30 13:08:45.122105 containerd[1543]: time="2025-01-30T13:08:45.121944777Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" returns successfully"
Jan 30 13:08:45.122401 containerd[1543]: time="2025-01-30T13:08:45.122391444Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\""
Jan 30 13:08:45.122475 containerd[1543]: time="2025-01-30T13:08:45.122467473Z" level=info msg="TearDown network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" successfully"
Jan 30 13:08:45.122511 containerd[1543]: time="2025-01-30T13:08:45.122504621Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" returns successfully"
Jan 30 13:08:45.123035 containerd[1543]: time="2025-01-30T13:08:45.123018566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:5,}"
Jan 30 13:08:45.128901 kubelet[2795]: I0130 13:08:45.128876 2795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3"
Jan 30 13:08:45.132068 containerd[1543]: time="2025-01-30T13:08:45.132031793Z" level=info msg="StopPodSandbox for \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\""
Jan 30 13:08:45.132393 containerd[1543]: time="2025-01-30T13:08:45.132376926Z" level=info msg="Ensure that sandbox 3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3 in task-service has been cleanup successfully"
Jan 30 13:08:45.132686 containerd[1543]: time="2025-01-30T13:08:45.132649277Z" level=info msg="TearDown network for sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\" successfully"
Jan 30 13:08:45.132790 containerd[1543]: time="2025-01-30T13:08:45.132749915Z" level=info msg="StopPodSandbox for \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\" returns successfully"
Jan 30 13:08:45.133090 containerd[1543]: time="2025-01-30T13:08:45.132985197Z" level=info msg="StopPodSandbox for \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\""
Jan 30 13:08:45.133090 containerd[1543]: time="2025-01-30T13:08:45.133027710Z" level=info msg="TearDown network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\" successfully"
Jan 30 13:08:45.133090 containerd[1543]: time="2025-01-30T13:08:45.133033363Z" level=info msg="StopPodSandbox for \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\" returns successfully"
Jan 30 13:08:45.133323 containerd[1543]: time="2025-01-30T13:08:45.133250372Z" level=info msg="StopPodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\""
Jan 30 13:08:45.133323 containerd[1543]: time="2025-01-30T13:08:45.133305309Z" level=info msg="TearDown network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" successfully"
Jan 30 13:08:45.133323 containerd[1543]: time="2025-01-30T13:08:45.133312186Z" level=info msg="StopPodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" returns successfully"
Jan 30 13:08:45.133608 containerd[1543]: time="2025-01-30T13:08:45.133481564Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\""
Jan 30 13:08:45.133608 containerd[1543]: time="2025-01-30T13:08:45.133523081Z" level=info msg="TearDown network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" successfully"
Jan 30 13:08:45.133608 containerd[1543]: time="2025-01-30T13:08:45.133529426Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" returns successfully"
Jan 30 13:08:45.133693 containerd[1543]: time="2025-01-30T13:08:45.133647573Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\""
Jan 30 13:08:45.133748 containerd[1543]: time="2025-01-30T13:08:45.133720453Z" level=info msg="TearDown network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" successfully"
Jan 30 13:08:45.133777 containerd[1543]: time="2025-01-30T13:08:45.133748251Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" returns successfully"
Jan 30 13:08:45.134099 containerd[1543]: time="2025-01-30T13:08:45.134061499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:5,}"
Jan 30 13:08:45.224310 systemd[1]: run-netns-cni\x2d07b7c64e\x2d863c\x2d38f0\x2d90fd\x2d69ac89021597.mount: Deactivated successfully.
Jan 30 13:08:45.224373 systemd[1]: run-netns-cni\x2dad23365e\x2dd28a\x2dc6fb\x2df7ec\x2dad71f59c1719.mount: Deactivated successfully.
Jan 30 13:08:45.224411 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8-shm.mount: Deactivated successfully.
Jan 30 13:08:45.224451 systemd[1]: run-netns-cni\x2dc527f5b7\x2d1d03\x2dc2bc\x2d9055\x2d3eb8997eb31a.mount: Deactivated successfully.
Jan 30 13:08:45.224484 systemd[1]: run-netns-cni\x2dcd941691\x2d700d\x2d969a\x2d33ff\x2d405f677ddab3.mount: Deactivated successfully.
Jan 30 13:08:45.224516 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83-shm.mount: Deactivated successfully.
Jan 30 13:08:45.224554 systemd[1]: run-netns-cni\x2d9a36cc45\x2d517c\x2da70a\x2d9e9c\x2db3cb7d639c10.mount: Deactivated successfully.
Jan 30 13:08:45.224585 systemd[1]: run-netns-cni\x2dd9e4f0e1\x2d6a33\x2dcbeb\x2d8561\x2d019b0477aaae.mount: Deactivated successfully.
Jan 30 13:08:45.751209 systemd-networkd[1451]: cali40f99c32033: Link UP
Jan 30 13:08:45.751576 systemd-networkd[1451]: cali40f99c32033: Gained carrier
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.148 [INFO][4546] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.288 [INFO][4546] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--k44fd-eth0 coredns-6f6b679f8f- kube-system 93ca16b2-990d-42cd-8ac7-c7b8297af1b4 670 0 2025-01-30 13:08:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-k44fd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali40f99c32033 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Namespace="kube-system" Pod="coredns-6f6b679f8f-k44fd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k44fd-"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.288 [INFO][4546] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Namespace="kube-system" Pod="coredns-6f6b679f8f-k44fd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.614 [INFO][4635] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" HandleID="k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Workload="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4635] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" HandleID="k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Workload="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c3ed0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-k44fd", "timestamp":"2025-01-30 13:08:45.614298371 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4635] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.636 [INFO][4635] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.636 [INFO][4635] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.637 [INFO][4635] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.727 [INFO][4635] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.730 [INFO][4635] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.731 [INFO][4635] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.732 [INFO][4635] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.732 [INFO][4635] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.734 [INFO][4635] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.736 [INFO][4635] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.739 [INFO][4635] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.739 [INFO][4635] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" host="localhost"
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.739 [INFO][4635] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 30 13:08:45.759029 containerd[1543]: 2025-01-30 13:08:45.739 [INFO][4635] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" HandleID="k8s-pod-network.e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Workload="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0" Jan 30 13:08:45.764704 containerd[1543]: 2025-01-30 13:08:45.741 [INFO][4546] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Namespace="kube-system" Pod="coredns-6f6b679f8f-k44fd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--k44fd-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"93ca16b2-990d-42cd-8ac7-c7b8297af1b4", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-k44fd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali40f99c32033", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:45.764704 containerd[1543]: 2025-01-30 13:08:45.741 [INFO][4546] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Namespace="kube-system" Pod="coredns-6f6b679f8f-k44fd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0" Jan 30 13:08:45.764704 containerd[1543]: 2025-01-30 13:08:45.741 [INFO][4546] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali40f99c32033 ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Namespace="kube-system" Pod="coredns-6f6b679f8f-k44fd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0" Jan 30 13:08:45.764704 containerd[1543]: 2025-01-30 13:08:45.748 [INFO][4546] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Namespace="kube-system" Pod="coredns-6f6b679f8f-k44fd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0" Jan 30 13:08:45.764704 containerd[1543]: 2025-01-30 13:08:45.748 [INFO][4546] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Namespace="kube-system" Pod="coredns-6f6b679f8f-k44fd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--k44fd-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"93ca16b2-990d-42cd-8ac7-c7b8297af1b4", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313", Pod:"coredns-6f6b679f8f-k44fd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali40f99c32033", MAC:"da:45:5a:bb:0d:2b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:45.764704 containerd[1543]: 2025-01-30 13:08:45.755 [INFO][4546] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313" Namespace="kube-system" 
Pod="coredns-6f6b679f8f-k44fd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--k44fd-eth0" Jan 30 13:08:45.766439 kubelet[2795]: I0130 13:08:45.760095 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vznh7" podStartSLOduration=3.177581066 podStartE2EDuration="20.756580848s" podCreationTimestamp="2025-01-30 13:08:25 +0000 UTC" firstStartedPulling="2025-01-30 13:08:26.171441366 +0000 UTC m=+11.128110502" lastFinishedPulling="2025-01-30 13:08:43.750441147 +0000 UTC m=+28.707110284" observedRunningTime="2025-01-30 13:08:45.232891651 +0000 UTC m=+30.189560799" watchObservedRunningTime="2025-01-30 13:08:45.756580848 +0000 UTC m=+30.713249989" Jan 30 13:08:45.785260 containerd[1543]: time="2025-01-30T13:08:45.785184473Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 13:08:45.785260 containerd[1543]: time="2025-01-30T13:08:45.785234851Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 13:08:45.787767 containerd[1543]: time="2025-01-30T13:08:45.785244235Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:45.787767 containerd[1543]: time="2025-01-30T13:08:45.787738611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:45.805811 systemd[1]: Started cri-containerd-e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313.scope - libcontainer container e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313. 
Jan 30 13:08:45.813753 systemd-resolved[1412]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 30 13:08:45.841162 containerd[1543]: time="2025-01-30T13:08:45.840851345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-k44fd,Uid:93ca16b2-990d-42cd-8ac7-c7b8297af1b4,Namespace:kube-system,Attempt:5,} returns sandbox id \"e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313\"" Jan 30 13:08:45.846472 containerd[1543]: time="2025-01-30T13:08:45.846426702Z" level=info msg="CreateContainer within sandbox \"e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 30 13:08:45.857853 systemd-networkd[1451]: calidea22b1bb6e: Link UP Jan 30 13:08:45.858366 systemd-networkd[1451]: calidea22b1bb6e: Gained carrier Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.314 [INFO][4557] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.332 [INFO][4557] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0 calico-apiserver-64db96d5fb- calico-apiserver 12536d15-5456-4602-a1a1-2e8242e08904 667 0 2025-01-30 13:08:25 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64db96d5fb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64db96d5fb-g4bk8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidea22b1bb6e [] []}} ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-g4bk8" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.333 [INFO][4557] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-g4bk8" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.614 [INFO][4642] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" HandleID="k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Workload="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4642] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" HandleID="k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Workload="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103850), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64db96d5fb-g4bk8", "timestamp":"2025-01-30 13:08:45.614152322 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4642] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.739 [INFO][4642] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.739 [INFO][4642] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.741 [INFO][4642] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.830 [INFO][4642] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.834 [INFO][4642] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.838 [INFO][4642] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.840 [INFO][4642] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.840 [INFO][4642] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.842 [INFO][4642] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447 Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.847 [INFO][4642] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.850 [INFO][4642] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.851 [INFO][4642] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" host="localhost" Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.851 [INFO][4642] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 13:08:45.866789 containerd[1543]: 2025-01-30 13:08:45.851 [INFO][4642] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" HandleID="k8s-pod-network.6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Workload="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" Jan 30 13:08:45.867505 containerd[1543]: 2025-01-30 13:08:45.854 [INFO][4557] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-g4bk8" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0", GenerateName:"calico-apiserver-64db96d5fb-", Namespace:"calico-apiserver", SelfLink:"", UID:"12536d15-5456-4602-a1a1-2e8242e08904", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64db96d5fb", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64db96d5fb-g4bk8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidea22b1bb6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:45.867505 containerd[1543]: 2025-01-30 13:08:45.854 [INFO][4557] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-g4bk8" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" Jan 30 13:08:45.867505 containerd[1543]: 2025-01-30 13:08:45.854 [INFO][4557] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidea22b1bb6e ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-g4bk8" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" Jan 30 13:08:45.867505 containerd[1543]: 2025-01-30 13:08:45.857 [INFO][4557] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-g4bk8" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" Jan 30 13:08:45.867505 containerd[1543]: 2025-01-30 13:08:45.858 [INFO][4557] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-g4bk8" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0", GenerateName:"calico-apiserver-64db96d5fb-", Namespace:"calico-apiserver", SelfLink:"", UID:"12536d15-5456-4602-a1a1-2e8242e08904", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64db96d5fb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447", Pod:"calico-apiserver-64db96d5fb-g4bk8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidea22b1bb6e", MAC:"72:21:78:72:00:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:45.867505 containerd[1543]: 2025-01-30 13:08:45.864 [INFO][4557] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-g4bk8" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--g4bk8-eth0" Jan 30 13:08:45.868144 containerd[1543]: time="2025-01-30T13:08:45.868055751Z" level=info msg="CreateContainer within sandbox \"e665fee7086681178cd7281fec862d31eaf2939e46f0e4fef59129cf59465313\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e371421038088b7914f5da2e1c571113377a27ca8d0e0a539ead6875bcae61b7\"" Jan 30 13:08:45.868435 containerd[1543]: time="2025-01-30T13:08:45.868424536Z" level=info msg="StartContainer for \"e371421038088b7914f5da2e1c571113377a27ca8d0e0a539ead6875bcae61b7\"" Jan 30 13:08:45.886853 systemd[1]: Started cri-containerd-e371421038088b7914f5da2e1c571113377a27ca8d0e0a539ead6875bcae61b7.scope - libcontainer container e371421038088b7914f5da2e1c571113377a27ca8d0e0a539ead6875bcae61b7. Jan 30 13:08:45.889178 containerd[1543]: time="2025-01-30T13:08:45.888955922Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 13:08:45.889178 containerd[1543]: time="2025-01-30T13:08:45.888992493Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 13:08:45.889178 containerd[1543]: time="2025-01-30T13:08:45.888999945Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:45.889178 containerd[1543]: time="2025-01-30T13:08:45.889042403Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:45.902864 systemd[1]: Started cri-containerd-6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447.scope - libcontainer container 6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447. Jan 30 13:08:45.911507 systemd-resolved[1412]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 30 13:08:45.939081 containerd[1543]: time="2025-01-30T13:08:45.939056098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-g4bk8,Uid:12536d15-5456-4602-a1a1-2e8242e08904,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447\"" Jan 30 13:08:45.940622 containerd[1543]: time="2025-01-30T13:08:45.940484116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 30 13:08:45.975982 containerd[1543]: time="2025-01-30T13:08:45.975953269Z" level=info msg="StartContainer for \"e371421038088b7914f5da2e1c571113377a27ca8d0e0a539ead6875bcae61b7\" returns successfully" Jan 30 13:08:45.983561 systemd-networkd[1451]: cali5d4557d2921: Link UP Jan 30 13:08:45.984831 systemd-networkd[1451]: cali5d4557d2921: Gained carrier Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.329 [INFO][4618] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.346 [INFO][4618] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--rrtms-eth0 coredns-6f6b679f8f- kube-system 10e36bb1-d0be-4ccd-ba00-61a2715458b9 668 0 2025-01-30 13:08:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-rrtms eth0 coredns [] [] 
[kns.kube-system ksa.kube-system.coredns] cali5d4557d2921 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Namespace="kube-system" Pod="coredns-6f6b679f8f-rrtms" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rrtms-" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.346 [INFO][4618] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Namespace="kube-system" Pod="coredns-6f6b679f8f-rrtms" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.614 [INFO][4643] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" HandleID="k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Workload="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.633 [INFO][4643] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" HandleID="k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Workload="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000303620), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-rrtms", "timestamp":"2025-01-30 13:08:45.614225209 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4643] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.851 [INFO][4643] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.851 [INFO][4643] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.853 [INFO][4643] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.931 [INFO][4643] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.941 [INFO][4643] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.943 [INFO][4643] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.950 [INFO][4643] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.950 [INFO][4643] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.953 [INFO][4643] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.962 [INFO][4643] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.969 [INFO][4643] 
ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.972 [INFO][4643] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" host="localhost" Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.972 [INFO][4643] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 13:08:45.996046 containerd[1543]: 2025-01-30 13:08:45.972 [INFO][4643] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" HandleID="k8s-pod-network.82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Workload="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" Jan 30 13:08:45.996633 containerd[1543]: 2025-01-30 13:08:45.978 [INFO][4618] cni-plugin/k8s.go 386: Populated endpoint ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Namespace="kube-system" Pod="coredns-6f6b679f8f-rrtms" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--rrtms-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"10e36bb1-d0be-4ccd-ba00-61a2715458b9", ResourceVersion:"668", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-rrtms", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5d4557d2921", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:45.996633 containerd[1543]: 2025-01-30 13:08:45.978 [INFO][4618] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Namespace="kube-system" Pod="coredns-6f6b679f8f-rrtms" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" Jan 30 13:08:45.996633 containerd[1543]: 2025-01-30 13:08:45.978 [INFO][4618] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5d4557d2921 ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Namespace="kube-system" Pod="coredns-6f6b679f8f-rrtms" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" Jan 30 13:08:45.996633 containerd[1543]: 2025-01-30 13:08:45.985 [INFO][4618] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" 
Namespace="kube-system" Pod="coredns-6f6b679f8f-rrtms" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" Jan 30 13:08:45.996633 containerd[1543]: 2025-01-30 13:08:45.985 [INFO][4618] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Namespace="kube-system" Pod="coredns-6f6b679f8f-rrtms" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--rrtms-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"10e36bb1-d0be-4ccd-ba00-61a2715458b9", ResourceVersion:"668", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab", Pod:"coredns-6f6b679f8f-rrtms", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5d4557d2921", MAC:"e6:df:af:fb:a1:78", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:45.996633 containerd[1543]: 2025-01-30 13:08:45.994 [INFO][4618] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab" Namespace="kube-system" Pod="coredns-6f6b679f8f-rrtms" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rrtms-eth0" Jan 30 13:08:46.024764 containerd[1543]: time="2025-01-30T13:08:46.024550359Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 13:08:46.024764 containerd[1543]: time="2025-01-30T13:08:46.024589396Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 13:08:46.024764 containerd[1543]: time="2025-01-30T13:08:46.024599004Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:46.024764 containerd[1543]: time="2025-01-30T13:08:46.024658760Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:46.051811 systemd[1]: Started cri-containerd-82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab.scope - libcontainer container 82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab. 
Jan 30 13:08:46.074432 systemd-resolved[1412]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 30 13:08:46.075425 systemd-networkd[1451]: cali5587f02c303: Link UP Jan 30 13:08:46.075876 systemd-networkd[1451]: cali5587f02c303: Gained carrier Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.344 [INFO][4603] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.363 [INFO][4603] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--gj66t-eth0 csi-node-driver- calico-system 0d14c0df-65f9-4785-8227-ecaaf26cf401 576 0 2025-01-30 13:08:25 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:56747c9949 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-gj66t eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5587f02c303 [] []}} ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Namespace="calico-system" Pod="csi-node-driver-gj66t" WorkloadEndpoint="localhost-k8s-csi--node--driver--gj66t-" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.365 [INFO][4603] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Namespace="calico-system" Pod="csi-node-driver-gj66t" WorkloadEndpoint="localhost-k8s-csi--node--driver--gj66t-eth0" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.616 [INFO][4645] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" 
HandleID="k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Workload="localhost-k8s-csi--node--driver--gj66t-eth0" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4645] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" HandleID="k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Workload="localhost-k8s-csi--node--driver--gj66t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001025a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-gj66t", "timestamp":"2025-01-30 13:08:45.616529376 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4645] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.972 [INFO][4645] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.972 [INFO][4645] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:45.982 [INFO][4645] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.032 [INFO][4645] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.039 [INFO][4645] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.051 [INFO][4645] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.057 [INFO][4645] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.057 [INFO][4645] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.060 [INFO][4645] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698 Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.067 [INFO][4645] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.071 [INFO][4645] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.071 [INFO][4645] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" host="localhost" Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.071 [INFO][4645] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 13:08:46.091418 containerd[1543]: 2025-01-30 13:08:46.071 [INFO][4645] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" HandleID="k8s-pod-network.5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Workload="localhost-k8s-csi--node--driver--gj66t-eth0" Jan 30 13:08:46.092366 containerd[1543]: 2025-01-30 13:08:46.073 [INFO][4603] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Namespace="calico-system" Pod="csi-node-driver-gj66t" WorkloadEndpoint="localhost-k8s-csi--node--driver--gj66t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--gj66t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0d14c0df-65f9-4785-8227-ecaaf26cf401", ResourceVersion:"576", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-gj66t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5587f02c303", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:46.092366 containerd[1543]: 2025-01-30 13:08:46.073 [INFO][4603] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Namespace="calico-system" Pod="csi-node-driver-gj66t" WorkloadEndpoint="localhost-k8s-csi--node--driver--gj66t-eth0" Jan 30 13:08:46.092366 containerd[1543]: 2025-01-30 13:08:46.073 [INFO][4603] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5587f02c303 ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Namespace="calico-system" Pod="csi-node-driver-gj66t" WorkloadEndpoint="localhost-k8s-csi--node--driver--gj66t-eth0" Jan 30 13:08:46.092366 containerd[1543]: 2025-01-30 13:08:46.077 [INFO][4603] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Namespace="calico-system" Pod="csi-node-driver-gj66t" WorkloadEndpoint="localhost-k8s-csi--node--driver--gj66t-eth0" Jan 30 13:08:46.092366 containerd[1543]: 2025-01-30 13:08:46.078 [INFO][4603] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Namespace="calico-system" 
Pod="csi-node-driver-gj66t" WorkloadEndpoint="localhost-k8s-csi--node--driver--gj66t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--gj66t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0d14c0df-65f9-4785-8227-ecaaf26cf401", ResourceVersion:"576", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698", Pod:"csi-node-driver-gj66t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5587f02c303", MAC:"aa:b4:8e:7d:3d:d4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:46.092366 containerd[1543]: 2025-01-30 13:08:46.084 [INFO][4603] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698" Namespace="calico-system" Pod="csi-node-driver-gj66t" WorkloadEndpoint="localhost-k8s-csi--node--driver--gj66t-eth0" Jan 30 13:08:46.115112 containerd[1543]: 
time="2025-01-30T13:08:46.114631721Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 13:08:46.115112 containerd[1543]: time="2025-01-30T13:08:46.114668918Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 13:08:46.115112 containerd[1543]: time="2025-01-30T13:08:46.114738684Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:46.115112 containerd[1543]: time="2025-01-30T13:08:46.114798514Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:46.136105 systemd[1]: Started cri-containerd-5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698.scope - libcontainer container 5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698. 
Jan 30 13:08:46.169597 systemd-resolved[1412]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 30 13:08:46.177529 containerd[1543]: time="2025-01-30T13:08:46.177488449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rrtms,Uid:10e36bb1-d0be-4ccd-ba00-61a2715458b9,Namespace:kube-system,Attempt:5,} returns sandbox id \"82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab\"" Jan 30 13:08:46.178529 systemd-networkd[1451]: calie09ef955973: Link UP Jan 30 13:08:46.179231 systemd-networkd[1451]: calie09ef955973: Gained carrier Jan 30 13:08:46.191815 kubelet[2795]: I0130 13:08:46.191783 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-k44fd" podStartSLOduration=27.19176715 podStartE2EDuration="27.19176715s" podCreationTimestamp="2025-01-30 13:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 13:08:46.159051775 +0000 UTC m=+31.115720913" watchObservedRunningTime="2025-01-30 13:08:46.19176715 +0000 UTC m=+31.148436283" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:45.325 [INFO][4565] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:45.353 [INFO][4565] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0 calico-kube-controllers-697fd69f5c- calico-system da358280-7ea3-4fe4-afd4-56d955439401 666 0 2025-01-30 13:08:25 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:697fd69f5c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost 
calico-kube-controllers-697fd69f5c-2x4nc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie09ef955973 [] []}} ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Namespace="calico-system" Pod="calico-kube-controllers-697fd69f5c-2x4nc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:45.354 [INFO][4565] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Namespace="calico-system" Pod="calico-kube-controllers-697fd69f5c-2x4nc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:45.614 [INFO][4644] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" HandleID="k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Workload="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:45.634 [INFO][4644] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" HandleID="k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Workload="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003113e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-697fd69f5c-2x4nc", "timestamp":"2025-01-30 13:08:45.614114545 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4644] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.072 [INFO][4644] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.072 [INFO][4644] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.085 [INFO][4644] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.138 [INFO][4644] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.146 [INFO][4644] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.148 [INFO][4644] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.151 [INFO][4644] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.151 [INFO][4644] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.152 [INFO][4644] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798 Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.155 [INFO][4644] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 
handle="k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.160 [INFO][4644] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.161 [INFO][4644] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" host="localhost" Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.161 [INFO][4644] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 13:08:46.201236 containerd[1543]: 2025-01-30 13:08:46.161 [INFO][4644] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" HandleID="k8s-pod-network.ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Workload="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" Jan 30 13:08:46.203559 containerd[1543]: 2025-01-30 13:08:46.173 [INFO][4565] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Namespace="calico-system" Pod="calico-kube-controllers-697fd69f5c-2x4nc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0", GenerateName:"calico-kube-controllers-697fd69f5c-", Namespace:"calico-system", SelfLink:"", UID:"da358280-7ea3-4fe4-afd4-56d955439401", ResourceVersion:"666", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 25, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697fd69f5c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-697fd69f5c-2x4nc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie09ef955973", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:46.203559 containerd[1543]: 2025-01-30 13:08:46.175 [INFO][4565] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Namespace="calico-system" Pod="calico-kube-controllers-697fd69f5c-2x4nc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" Jan 30 13:08:46.203559 containerd[1543]: 2025-01-30 13:08:46.175 [INFO][4565] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie09ef955973 ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Namespace="calico-system" Pod="calico-kube-controllers-697fd69f5c-2x4nc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" Jan 30 13:08:46.203559 containerd[1543]: 2025-01-30 13:08:46.179 [INFO][4565] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Namespace="calico-system" Pod="calico-kube-controllers-697fd69f5c-2x4nc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" Jan 30 13:08:46.203559 containerd[1543]: 2025-01-30 13:08:46.181 [INFO][4565] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Namespace="calico-system" Pod="calico-kube-controllers-697fd69f5c-2x4nc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0", GenerateName:"calico-kube-controllers-697fd69f5c-", Namespace:"calico-system", SelfLink:"", UID:"da358280-7ea3-4fe4-afd4-56d955439401", ResourceVersion:"666", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697fd69f5c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798", Pod:"calico-kube-controllers-697fd69f5c-2x4nc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie09ef955973", MAC:"8a:b9:e0:28:ab:54", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:46.203559 containerd[1543]: 2025-01-30 13:08:46.193 [INFO][4565] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798" Namespace="calico-system" Pod="calico-kube-controllers-697fd69f5c-2x4nc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fd69f5c--2x4nc-eth0" Jan 30 13:08:46.213604 containerd[1543]: time="2025-01-30T13:08:46.213468834Z" level=info msg="CreateContainer within sandbox \"82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 30 13:08:46.238428 containerd[1543]: time="2025-01-30T13:08:46.238395234Z" level=info msg="CreateContainer within sandbox \"82b6b2523eee555321d97731a55ca1505f0114bf168f73e8aeb0bc94741417ab\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9751736659c52ed7919319ff3a8ef7b96cd3d7ddee954c9d31fbeb2346e75aac\"" Jan 30 13:08:46.239096 containerd[1543]: time="2025-01-30T13:08:46.239081653Z" level=info msg="StartContainer for \"9751736659c52ed7919319ff3a8ef7b96cd3d7ddee954c9d31fbeb2346e75aac\"" Jan 30 13:08:46.239226 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount937851246.mount: Deactivated successfully. Jan 30 13:08:46.245110 containerd[1543]: time="2025-01-30T13:08:46.245082408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gj66t,Uid:0d14c0df-65f9-4785-8227-ecaaf26cf401,Namespace:calico-system,Attempt:5,} returns sandbox id \"5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698\"" Jan 30 13:08:46.279317 containerd[1543]: time="2025-01-30T13:08:46.279052749Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 13:08:46.279317 containerd[1543]: time="2025-01-30T13:08:46.279083212Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 13:08:46.279317 containerd[1543]: time="2025-01-30T13:08:46.279090380Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:46.279317 containerd[1543]: time="2025-01-30T13:08:46.279133112Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:46.297993 systemd[1]: Started cri-containerd-ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798.scope - libcontainer container ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798. Jan 30 13:08:46.301855 systemd-networkd[1451]: cali58cde7d3731: Link UP Jan 30 13:08:46.304554 systemd-networkd[1451]: cali58cde7d3731: Gained carrier Jan 30 13:08:46.313447 systemd[1]: Started cri-containerd-9751736659c52ed7919319ff3a8ef7b96cd3d7ddee954c9d31fbeb2346e75aac.scope - libcontainer container 9751736659c52ed7919319ff3a8ef7b96cd3d7ddee954c9d31fbeb2346e75aac. 
Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:45.268 [INFO][4578] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:45.295 [INFO][4578] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0 calico-apiserver-64db96d5fb- calico-apiserver 11bf24ac-ae1c-4a6f-b202-4add9f89afb0 665 0 2025-01-30 13:08:25 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64db96d5fb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64db96d5fb-lnt9h eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali58cde7d3731 [] []}} ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-lnt9h" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:45.296 [INFO][4578] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-lnt9h" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:45.614 [INFO][4634] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" HandleID="k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Workload="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:45.634 [INFO][4634] ipam/ipam_plugin.go 265: Auto 
assigning IP ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" HandleID="k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Workload="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003f8630), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64db96d5fb-lnt9h", "timestamp":"2025-01-30 13:08:45.614015762 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:45.635 [INFO][4634] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.161 [INFO][4634] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.161 [INFO][4634] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.187 [INFO][4634] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.236 [INFO][4634] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.254 [INFO][4634] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.260 [INFO][4634] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.263 [INFO][4634] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.264 [INFO][4634] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.266 [INFO][4634] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67 Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.274 [INFO][4634] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.288 [INFO][4634] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.288 [INFO][4634] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" host="localhost" Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.288 [INFO][4634] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 13:08:46.326733 containerd[1543]: 2025-01-30 13:08:46.288 [INFO][4634] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" HandleID="k8s-pod-network.b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Workload="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" Jan 30 13:08:46.327210 containerd[1543]: 2025-01-30 13:08:46.295 [INFO][4578] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-lnt9h" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0", GenerateName:"calico-apiserver-64db96d5fb-", Namespace:"calico-apiserver", SelfLink:"", UID:"11bf24ac-ae1c-4a6f-b202-4add9f89afb0", ResourceVersion:"665", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64db96d5fb", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64db96d5fb-lnt9h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali58cde7d3731", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:46.327210 containerd[1543]: 2025-01-30 13:08:46.295 [INFO][4578] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-lnt9h" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" Jan 30 13:08:46.327210 containerd[1543]: 2025-01-30 13:08:46.296 [INFO][4578] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali58cde7d3731 ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-lnt9h" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" Jan 30 13:08:46.327210 containerd[1543]: 2025-01-30 13:08:46.304 [INFO][4578] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-lnt9h" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" Jan 30 13:08:46.327210 containerd[1543]: 2025-01-30 13:08:46.305 [INFO][4578] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-lnt9h" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0", GenerateName:"calico-apiserver-64db96d5fb-", Namespace:"calico-apiserver", SelfLink:"", UID:"11bf24ac-ae1c-4a6f-b202-4add9f89afb0", ResourceVersion:"665", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 13, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64db96d5fb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67", Pod:"calico-apiserver-64db96d5fb-lnt9h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali58cde7d3731", MAC:"de:69:e3:b1:87:81", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 13:08:46.327210 containerd[1543]: 2025-01-30 13:08:46.321 [INFO][4578] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67" Namespace="calico-apiserver" Pod="calico-apiserver-64db96d5fb-lnt9h" WorkloadEndpoint="localhost-k8s-calico--apiserver--64db96d5fb--lnt9h-eth0" Jan 30 13:08:46.344358 systemd-resolved[1412]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 30 13:08:46.359500 containerd[1543]: time="2025-01-30T13:08:46.359462015Z" level=info msg="StartContainer for \"9751736659c52ed7919319ff3a8ef7b96cd3d7ddee954c9d31fbeb2346e75aac\" returns successfully" Jan 30 13:08:46.365511 containerd[1543]: time="2025-01-30T13:08:46.365445393Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 13:08:46.365636 containerd[1543]: time="2025-01-30T13:08:46.365612408Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 13:08:46.365636 containerd[1543]: time="2025-01-30T13:08:46.365624616Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:46.365856 containerd[1543]: time="2025-01-30T13:08:46.365761551Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 13:08:46.382927 systemd[1]: Started cri-containerd-b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67.scope - libcontainer container b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67. 
Jan 30 13:08:46.399810 systemd-resolved[1412]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 30 13:08:46.417808 containerd[1543]: time="2025-01-30T13:08:46.417781048Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fd69f5c-2x4nc,Uid:da358280-7ea3-4fe4-afd4-56d955439401,Namespace:calico-system,Attempt:5,} returns sandbox id \"ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798\"" Jan 30 13:08:46.433848 containerd[1543]: time="2025-01-30T13:08:46.433820438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64db96d5fb-lnt9h,Uid:11bf24ac-ae1c-4a6f-b202-4add9f89afb0,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67\"" Jan 30 13:08:47.183611 kubelet[2795]: I0130 13:08:47.183558 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-rrtms" podStartSLOduration=28.183539683 podStartE2EDuration="28.183539683s" podCreationTimestamp="2025-01-30 13:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 13:08:47.182911494 +0000 UTC m=+32.139580634" watchObservedRunningTime="2025-01-30 13:08:47.183539683 +0000 UTC m=+32.140208821" Jan 30 13:08:47.192056 systemd-networkd[1451]: cali5d4557d2921: Gained IPv6LL Jan 30 13:08:47.250812 systemd-networkd[1451]: cali40f99c32033: Gained IPv6LL Jan 30 13:08:47.378796 systemd-networkd[1451]: calie09ef955973: Gained IPv6LL Jan 30 13:08:47.379123 systemd-networkd[1451]: cali5587f02c303: Gained IPv6LL Jan 30 13:08:47.442798 systemd-networkd[1451]: calidea22b1bb6e: Gained IPv6LL Jan 30 13:08:48.211797 systemd-networkd[1451]: cali58cde7d3731: Gained IPv6LL Jan 30 13:08:48.229593 containerd[1543]: time="2025-01-30T13:08:48.229331620Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:48.230066 containerd[1543]: time="2025-01-30T13:08:48.229847726Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 30 13:08:48.231810 containerd[1543]: time="2025-01-30T13:08:48.230617200Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:48.232236 containerd[1543]: time="2025-01-30T13:08:48.232069746Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:48.232974 containerd[1543]: time="2025-01-30T13:08:48.232944476Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.292436762s" Jan 30 13:08:48.232974 containerd[1543]: time="2025-01-30T13:08:48.232973360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 30 13:08:48.234883 containerd[1543]: time="2025-01-30T13:08:48.234664775Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 30 13:08:48.235775 containerd[1543]: time="2025-01-30T13:08:48.235735747Z" level=info msg="CreateContainer within sandbox \"6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 30 13:08:48.249387 
containerd[1543]: time="2025-01-30T13:08:48.249239390Z" level=info msg="CreateContainer within sandbox \"6ad1a0e4f0c2f010b6a4cdf4347606f0d0d4f0de0d81f77fd5a02f7c0b3ef447\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bf2bf199a071df496af0436d87e0f7c69c340d8a44fb8a078541625b9de79b2d\"" Jan 30 13:08:48.251197 containerd[1543]: time="2025-01-30T13:08:48.250604388Z" level=info msg="StartContainer for \"bf2bf199a071df496af0436d87e0f7c69c340d8a44fb8a078541625b9de79b2d\"" Jan 30 13:08:48.276991 systemd[1]: run-containerd-runc-k8s.io-bf2bf199a071df496af0436d87e0f7c69c340d8a44fb8a078541625b9de79b2d-runc.0aMFng.mount: Deactivated successfully. Jan 30 13:08:48.286808 systemd[1]: Started cri-containerd-bf2bf199a071df496af0436d87e0f7c69c340d8a44fb8a078541625b9de79b2d.scope - libcontainer container bf2bf199a071df496af0436d87e0f7c69c340d8a44fb8a078541625b9de79b2d. Jan 30 13:08:48.341218 containerd[1543]: time="2025-01-30T13:08:48.341182732Z" level=info msg="StartContainer for \"bf2bf199a071df496af0436d87e0f7c69c340d8a44fb8a078541625b9de79b2d\" returns successfully" Jan 30 13:08:49.214582 kubelet[2795]: I0130 13:08:49.214315 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64db96d5fb-g4bk8" podStartSLOduration=21.920593807 podStartE2EDuration="24.214301652s" podCreationTimestamp="2025-01-30 13:08:25 +0000 UTC" firstStartedPulling="2025-01-30 13:08:45.940285762 +0000 UTC m=+30.896954898" lastFinishedPulling="2025-01-30 13:08:48.233993606 +0000 UTC m=+33.190662743" observedRunningTime="2025-01-30 13:08:49.214188727 +0000 UTC m=+34.170857872" watchObservedRunningTime="2025-01-30 13:08:49.214301652 +0000 UTC m=+34.170970792" Jan 30 13:08:49.609945 containerd[1543]: time="2025-01-30T13:08:49.609727713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:49.610414 containerd[1543]: 
time="2025-01-30T13:08:49.610225124Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 30 13:08:49.610724 containerd[1543]: time="2025-01-30T13:08:49.610709562Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:49.611879 containerd[1543]: time="2025-01-30T13:08:49.611847431Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:49.612530 containerd[1543]: time="2025-01-30T13:08:49.612280342Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.377572219s" Jan 30 13:08:49.612530 containerd[1543]: time="2025-01-30T13:08:49.612298075Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 30 13:08:49.613568 containerd[1543]: time="2025-01-30T13:08:49.613161725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 30 13:08:49.614243 containerd[1543]: time="2025-01-30T13:08:49.614226977Z" level=info msg="CreateContainer within sandbox \"5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 30 13:08:49.627744 containerd[1543]: time="2025-01-30T13:08:49.627720351Z" level=info msg="CreateContainer within sandbox \"5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698\" for 
&ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d33e788068afd0c1f7b569e1b35f9d5d8ca4f3925c141bd8cd67f37fc3a8bcc3\"" Jan 30 13:08:49.628635 containerd[1543]: time="2025-01-30T13:08:49.628615291Z" level=info msg="StartContainer for \"d33e788068afd0c1f7b569e1b35f9d5d8ca4f3925c141bd8cd67f37fc3a8bcc3\"" Jan 30 13:08:49.629449 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1537911601.mount: Deactivated successfully. Jan 30 13:08:49.654817 systemd[1]: Started cri-containerd-d33e788068afd0c1f7b569e1b35f9d5d8ca4f3925c141bd8cd67f37fc3a8bcc3.scope - libcontainer container d33e788068afd0c1f7b569e1b35f9d5d8ca4f3925c141bd8cd67f37fc3a8bcc3. Jan 30 13:08:49.709739 containerd[1543]: time="2025-01-30T13:08:49.709687533Z" level=info msg="StartContainer for \"d33e788068afd0c1f7b569e1b35f9d5d8ca4f3925c141bd8cd67f37fc3a8bcc3\" returns successfully" Jan 30 13:08:50.218085 kubelet[2795]: I0130 13:08:50.218058 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 13:08:50.222227 kubelet[2795]: I0130 13:08:50.222208 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 13:08:50.601704 kernel: bpftool[5398]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 30 13:08:50.752009 systemd-networkd[1451]: vxlan.calico: Link UP Jan 30 13:08:50.752014 systemd-networkd[1451]: vxlan.calico: Gained carrier Jan 30 13:08:51.501528 containerd[1543]: time="2025-01-30T13:08:51.501116394Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:51.502064 containerd[1543]: time="2025-01-30T13:08:51.501534494Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 30 13:08:51.502064 containerd[1543]: time="2025-01-30T13:08:51.501944735Z" level=info msg="ImageCreate event 
name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:51.502962 containerd[1543]: time="2025-01-30T13:08:51.502946923Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:51.503422 containerd[1543]: time="2025-01-30T13:08:51.503406273Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 1.890228588s" Jan 30 13:08:51.503454 containerd[1543]: time="2025-01-30T13:08:51.503423839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 30 13:08:51.504096 containerd[1543]: time="2025-01-30T13:08:51.504082591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 30 13:08:51.518342 containerd[1543]: time="2025-01-30T13:08:51.518320153Z" level=info msg="CreateContainer within sandbox \"ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 30 13:08:51.525563 containerd[1543]: time="2025-01-30T13:08:51.525539349Z" level=info msg="CreateContainer within sandbox \"ba5afa9d6fee3e7583479b55d062db44a5b3249a3b950346a4bd63125b234798\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"31e33239170e83f6f5a3b4729bfbf30157549a9558b1a5d3b37fbe893ab2cc39\"" Jan 30 13:08:51.525898 containerd[1543]: 
time="2025-01-30T13:08:51.525885909Z" level=info msg="StartContainer for \"31e33239170e83f6f5a3b4729bfbf30157549a9558b1a5d3b37fbe893ab2cc39\"" Jan 30 13:08:51.558773 systemd[1]: Started cri-containerd-31e33239170e83f6f5a3b4729bfbf30157549a9558b1a5d3b37fbe893ab2cc39.scope - libcontainer container 31e33239170e83f6f5a3b4729bfbf30157549a9558b1a5d3b37fbe893ab2cc39. Jan 30 13:08:51.592298 containerd[1543]: time="2025-01-30T13:08:51.592155725Z" level=info msg="StartContainer for \"31e33239170e83f6f5a3b4729bfbf30157549a9558b1a5d3b37fbe893ab2cc39\" returns successfully" Jan 30 13:08:51.904132 containerd[1543]: time="2025-01-30T13:08:51.903988203Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:51.904132 containerd[1543]: time="2025-01-30T13:08:51.904107209Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 30 13:08:51.905519 containerd[1543]: time="2025-01-30T13:08:51.905507315Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 401.390993ms" Jan 30 13:08:51.905613 containerd[1543]: time="2025-01-30T13:08:51.905561863Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 30 13:08:51.906375 containerd[1543]: time="2025-01-30T13:08:51.906356112Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 30 13:08:51.907346 containerd[1543]: time="2025-01-30T13:08:51.907255120Z" level=info msg="CreateContainer within sandbox 
\"b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 30 13:08:51.925709 containerd[1543]: time="2025-01-30T13:08:51.925670072Z" level=info msg="CreateContainer within sandbox \"b9ef41698d3245b8714f53fbe810e908ee00b31fc956c54483c687c3f3490d67\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7c61f764d7afcd727892073f5b351ad17efc8a66aa90a11bac4dc12a257c4e9f\"" Jan 30 13:08:51.926132 containerd[1543]: time="2025-01-30T13:08:51.926090014Z" level=info msg="StartContainer for \"7c61f764d7afcd727892073f5b351ad17efc8a66aa90a11bac4dc12a257c4e9f\"" Jan 30 13:08:51.948770 systemd[1]: Started cri-containerd-7c61f764d7afcd727892073f5b351ad17efc8a66aa90a11bac4dc12a257c4e9f.scope - libcontainer container 7c61f764d7afcd727892073f5b351ad17efc8a66aa90a11bac4dc12a257c4e9f. Jan 30 13:08:51.979018 containerd[1543]: time="2025-01-30T13:08:51.978992979Z" level=info msg="StartContainer for \"7c61f764d7afcd727892073f5b351ad17efc8a66aa90a11bac4dc12a257c4e9f\" returns successfully" Jan 30 13:08:52.239871 kubelet[2795]: I0130 13:08:52.239786 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64db96d5fb-lnt9h" podStartSLOduration=21.76848954 podStartE2EDuration="27.239770172s" podCreationTimestamp="2025-01-30 13:08:25 +0000 UTC" firstStartedPulling="2025-01-30 13:08:46.434703141 +0000 UTC m=+31.391372277" lastFinishedPulling="2025-01-30 13:08:51.905983773 +0000 UTC m=+36.862652909" observedRunningTime="2025-01-30 13:08:52.238984587 +0000 UTC m=+37.195653732" watchObservedRunningTime="2025-01-30 13:08:52.239770172 +0000 UTC m=+37.196439311" Jan 30 13:08:52.288492 kubelet[2795]: I0130 13:08:52.288130 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-697fd69f5c-2x4nc" podStartSLOduration=22.204305294 podStartE2EDuration="27.288115403s" 
podCreationTimestamp="2025-01-30 13:08:25 +0000 UTC" firstStartedPulling="2025-01-30 13:08:46.42011921 +0000 UTC m=+31.376788346" lastFinishedPulling="2025-01-30 13:08:51.503929319 +0000 UTC m=+36.460598455" observedRunningTime="2025-01-30 13:08:52.287953791 +0000 UTC m=+37.244622945" watchObservedRunningTime="2025-01-30 13:08:52.288115403 +0000 UTC m=+37.244784560" Jan 30 13:08:52.754929 systemd-networkd[1451]: vxlan.calico: Gained IPv6LL Jan 30 13:08:53.218577 kubelet[2795]: I0130 13:08:53.218545 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 13:08:53.220532 kubelet[2795]: I0130 13:08:53.220441 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 13:08:53.819818 containerd[1543]: time="2025-01-30T13:08:53.819776644Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:53.821797 containerd[1543]: time="2025-01-30T13:08:53.821758947Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 30 13:08:53.830509 containerd[1543]: time="2025-01-30T13:08:53.830469331Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:53.841503 containerd[1543]: time="2025-01-30T13:08:53.841472159Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 13:08:53.842313 containerd[1543]: time="2025-01-30T13:08:53.841975058Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 1.935597373s" Jan 30 13:08:53.842313 containerd[1543]: time="2025-01-30T13:08:53.841997924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 30 13:08:53.843472 containerd[1543]: time="2025-01-30T13:08:53.843445675Z" level=info msg="CreateContainer within sandbox \"5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 30 13:08:53.886603 containerd[1543]: time="2025-01-30T13:08:53.886569126Z" level=info msg="CreateContainer within sandbox \"5791a21065c52a79290d029cd9309c60c87af2b913067d274b7ffbcda2fca698\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"0b1ee5ade175de2a296d25dbfd07b8da069196815e84e6ebc05e9c98b051cdac\"" Jan 30 13:08:53.888156 containerd[1543]: time="2025-01-30T13:08:53.887095157Z" level=info msg="StartContainer for \"0b1ee5ade175de2a296d25dbfd07b8da069196815e84e6ebc05e9c98b051cdac\"" Jan 30 13:08:53.914731 systemd[1]: Started cri-containerd-0b1ee5ade175de2a296d25dbfd07b8da069196815e84e6ebc05e9c98b051cdac.scope - libcontainer container 0b1ee5ade175de2a296d25dbfd07b8da069196815e84e6ebc05e9c98b051cdac. 
Jan 30 13:08:53.934358 containerd[1543]: time="2025-01-30T13:08:53.934304535Z" level=info msg="StartContainer for \"0b1ee5ade175de2a296d25dbfd07b8da069196815e84e6ebc05e9c98b051cdac\" returns successfully" Jan 30 13:08:54.848548 kubelet[2795]: I0130 13:08:54.848515 2795 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 30 13:08:55.043311 kubelet[2795]: I0130 13:08:55.043279 2795 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 30 13:08:55.845124 kubelet[2795]: I0130 13:08:55.844999 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 13:08:55.904919 systemd[1]: run-containerd-runc-k8s.io-31e33239170e83f6f5a3b4729bfbf30157549a9558b1a5d3b37fbe893ab2cc39-runc.kUn07a.mount: Deactivated successfully. Jan 30 13:08:55.971769 kubelet[2795]: I0130 13:08:55.971572 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-gj66t" podStartSLOduration=23.377980548 podStartE2EDuration="30.971557936s" podCreationTimestamp="2025-01-30 13:08:25 +0000 UTC" firstStartedPulling="2025-01-30 13:08:46.248879559 +0000 UTC m=+31.205548696" lastFinishedPulling="2025-01-30 13:08:53.842456942 +0000 UTC m=+38.799126084" observedRunningTime="2025-01-30 13:08:54.238609469 +0000 UTC m=+39.195278614" watchObservedRunningTime="2025-01-30 13:08:55.971557936 +0000 UTC m=+40.928227076" Jan 30 13:09:08.097121 kubelet[2795]: I0130 13:09:08.097091 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 13:09:15.148922 containerd[1543]: time="2025-01-30T13:09:15.148894125Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\"" Jan 30 13:09:15.150106 containerd[1543]: time="2025-01-30T13:09:15.148970827Z" level=info 
msg="TearDown network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" successfully" Jan 30 13:09:15.150106 containerd[1543]: time="2025-01-30T13:09:15.148979661Z" level=info msg="StopPodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" returns successfully" Jan 30 13:09:15.205110 containerd[1543]: time="2025-01-30T13:09:15.205077726Z" level=info msg="RemovePodSandbox for \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\"" Jan 30 13:09:15.209893 containerd[1543]: time="2025-01-30T13:09:15.209875767Z" level=info msg="Forcibly stopping sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\"" Jan 30 13:09:15.212772 containerd[1543]: time="2025-01-30T13:09:15.209945417Z" level=info msg="TearDown network for sandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" successfully" Jan 30 13:09:15.236295 containerd[1543]: time="2025-01-30T13:09:15.236255972Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.236441 containerd[1543]: time="2025-01-30T13:09:15.236320883Z" level=info msg="RemovePodSandbox \"ec777dbb980c6c170565cf34c2928657fffe6bc5a5784ea9d3b6f73cc9868e60\" returns successfully" Jan 30 13:09:15.236754 containerd[1543]: time="2025-01-30T13:09:15.236740347Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\"" Jan 30 13:09:15.236816 containerd[1543]: time="2025-01-30T13:09:15.236802690Z" level=info msg="TearDown network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" successfully" Jan 30 13:09:15.236816 containerd[1543]: time="2025-01-30T13:09:15.236810811Z" level=info msg="StopPodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" returns successfully" Jan 30 13:09:15.236973 containerd[1543]: time="2025-01-30T13:09:15.236960548Z" level=info msg="RemovePodSandbox for \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\"" Jan 30 13:09:15.237005 containerd[1543]: time="2025-01-30T13:09:15.236973108Z" level=info msg="Forcibly stopping sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\"" Jan 30 13:09:15.237039 containerd[1543]: time="2025-01-30T13:09:15.237005004Z" level=info msg="TearDown network for sandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" successfully" Jan 30 13:09:15.238645 containerd[1543]: time="2025-01-30T13:09:15.238620702Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.238732 containerd[1543]: time="2025-01-30T13:09:15.238702751Z" level=info msg="RemovePodSandbox \"17247340642b847f36d7e2a6a4139dfdb19c0b1f5997fcd680450e441c31ecd3\" returns successfully" Jan 30 13:09:15.239002 containerd[1543]: time="2025-01-30T13:09:15.238987781Z" level=info msg="StopPodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\"" Jan 30 13:09:15.239192 containerd[1543]: time="2025-01-30T13:09:15.239145295Z" level=info msg="TearDown network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" successfully" Jan 30 13:09:15.239192 containerd[1543]: time="2025-01-30T13:09:15.239154816Z" level=info msg="StopPodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" returns successfully" Jan 30 13:09:15.239343 containerd[1543]: time="2025-01-30T13:09:15.239327156Z" level=info msg="RemovePodSandbox for \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\"" Jan 30 13:09:15.239373 containerd[1543]: time="2025-01-30T13:09:15.239343345Z" level=info msg="Forcibly stopping sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\"" Jan 30 13:09:15.239790 containerd[1543]: time="2025-01-30T13:09:15.239379894Z" level=info msg="TearDown network for sandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" successfully" Jan 30 13:09:15.240660 containerd[1543]: time="2025-01-30T13:09:15.240641958Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.240727 containerd[1543]: time="2025-01-30T13:09:15.240685986Z" level=info msg="RemovePodSandbox \"02cefac875987c24beeddc5c984ceca766c614775335ae24e24ab5492809c953\" returns successfully" Jan 30 13:09:15.241199 containerd[1543]: time="2025-01-30T13:09:15.241093358Z" level=info msg="StopPodSandbox for \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\"" Jan 30 13:09:15.241199 containerd[1543]: time="2025-01-30T13:09:15.241148035Z" level=info msg="TearDown network for sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\" successfully" Jan 30 13:09:15.241199 containerd[1543]: time="2025-01-30T13:09:15.241154503Z" level=info msg="StopPodSandbox for \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\" returns successfully" Jan 30 13:09:15.241714 containerd[1543]: time="2025-01-30T13:09:15.241418145Z" level=info msg="RemovePodSandbox for \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\"" Jan 30 13:09:15.241714 containerd[1543]: time="2025-01-30T13:09:15.241458080Z" level=info msg="Forcibly stopping sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\"" Jan 30 13:09:15.241714 containerd[1543]: time="2025-01-30T13:09:15.241552733Z" level=info msg="TearDown network for sandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\" successfully" Jan 30 13:09:15.243227 containerd[1543]: time="2025-01-30T13:09:15.243196660Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.243288 containerd[1543]: time="2025-01-30T13:09:15.243246769Z" level=info msg="RemovePodSandbox \"00eb39263c164f849e626a8d7ef48a65d7a449dbea4572ad252b79087026accc\" returns successfully" Jan 30 13:09:15.243943 containerd[1543]: time="2025-01-30T13:09:15.243492510Z" level=info msg="StopPodSandbox for \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\"" Jan 30 13:09:15.243943 containerd[1543]: time="2025-01-30T13:09:15.243545136Z" level=info msg="TearDown network for sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\" successfully" Jan 30 13:09:15.243943 containerd[1543]: time="2025-01-30T13:09:15.243551057Z" level=info msg="StopPodSandbox for \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\" returns successfully" Jan 30 13:09:15.243943 containerd[1543]: time="2025-01-30T13:09:15.243673104Z" level=info msg="RemovePodSandbox for \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\"" Jan 30 13:09:15.243943 containerd[1543]: time="2025-01-30T13:09:15.243694572Z" level=info msg="Forcibly stopping sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\"" Jan 30 13:09:15.243943 containerd[1543]: time="2025-01-30T13:09:15.243722453Z" level=info msg="TearDown network for sandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\" successfully" Jan 30 13:09:15.244980 containerd[1543]: time="2025-01-30T13:09:15.244961378Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.245029 containerd[1543]: time="2025-01-30T13:09:15.244996109Z" level=info msg="RemovePodSandbox \"08ae3e1a6fe6a31656e1c7596d37fc67e1d0d7c2c2ebd11e651d088583a9b0cc\" returns successfully" Jan 30 13:09:15.246049 containerd[1543]: time="2025-01-30T13:09:15.245472030Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\"" Jan 30 13:09:15.246049 containerd[1543]: time="2025-01-30T13:09:15.245520392Z" level=info msg="TearDown network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" successfully" Jan 30 13:09:15.246049 containerd[1543]: time="2025-01-30T13:09:15.245526712Z" level=info msg="StopPodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" returns successfully" Jan 30 13:09:15.246634 containerd[1543]: time="2025-01-30T13:09:15.246180784Z" level=info msg="RemovePodSandbox for \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\"" Jan 30 13:09:15.246634 containerd[1543]: time="2025-01-30T13:09:15.246214534Z" level=info msg="Forcibly stopping sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\"" Jan 30 13:09:15.246634 containerd[1543]: time="2025-01-30T13:09:15.246251366Z" level=info msg="TearDown network for sandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" successfully" Jan 30 13:09:15.249005 containerd[1543]: time="2025-01-30T13:09:15.248915513Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.249005 containerd[1543]: time="2025-01-30T13:09:15.248950648Z" level=info msg="RemovePodSandbox \"85f65c7ed7f21369e58521c249efbbd9cda76109879f87118dcc00cacae1455e\" returns successfully" Jan 30 13:09:15.249497 containerd[1543]: time="2025-01-30T13:09:15.249483693Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\"" Jan 30 13:09:15.249830 containerd[1543]: time="2025-01-30T13:09:15.249723523Z" level=info msg="TearDown network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" successfully" Jan 30 13:09:15.249830 containerd[1543]: time="2025-01-30T13:09:15.249733472Z" level=info msg="StopPodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" returns successfully" Jan 30 13:09:15.249891 containerd[1543]: time="2025-01-30T13:09:15.249849975Z" level=info msg="RemovePodSandbox for \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\"" Jan 30 13:09:15.249891 containerd[1543]: time="2025-01-30T13:09:15.249862616Z" level=info msg="Forcibly stopping sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\"" Jan 30 13:09:15.249932 containerd[1543]: time="2025-01-30T13:09:15.249898623Z" level=info msg="TearDown network for sandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" successfully" Jan 30 13:09:15.262416 containerd[1543]: time="2025-01-30T13:09:15.262388002Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.262534 containerd[1543]: time="2025-01-30T13:09:15.262429331Z" level=info msg="RemovePodSandbox \"d73115c43681bdfabbeb3c74697bf56d6f05355e1445c135358e966552ff5b3b\" returns successfully" Jan 30 13:09:15.265248 containerd[1543]: time="2025-01-30T13:09:15.265143673Z" level=info msg="StopPodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\"" Jan 30 13:09:15.265248 containerd[1543]: time="2025-01-30T13:09:15.265208854Z" level=info msg="TearDown network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" successfully" Jan 30 13:09:15.265248 containerd[1543]: time="2025-01-30T13:09:15.265215411Z" level=info msg="StopPodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" returns successfully" Jan 30 13:09:15.265422 containerd[1543]: time="2025-01-30T13:09:15.265389109Z" level=info msg="RemovePodSandbox for \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\"" Jan 30 13:09:15.265450 containerd[1543]: time="2025-01-30T13:09:15.265426468Z" level=info msg="Forcibly stopping sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\"" Jan 30 13:09:15.265485 containerd[1543]: time="2025-01-30T13:09:15.265458885Z" level=info msg="TearDown network for sandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" successfully" Jan 30 13:09:15.266740 containerd[1543]: time="2025-01-30T13:09:15.266722530Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.266780 containerd[1543]: time="2025-01-30T13:09:15.266768173Z" level=info msg="RemovePodSandbox \"089fbddccad9a49711cb3353573051921328cacf4b202021a2f0e5d29f88a837\" returns successfully" Jan 30 13:09:15.267055 containerd[1543]: time="2025-01-30T13:09:15.266961615Z" level=info msg="StopPodSandbox for \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\"" Jan 30 13:09:15.267055 containerd[1543]: time="2025-01-30T13:09:15.267010041Z" level=info msg="TearDown network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\" successfully" Jan 30 13:09:15.267055 containerd[1543]: time="2025-01-30T13:09:15.267016169Z" level=info msg="StopPodSandbox for \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\" returns successfully" Jan 30 13:09:15.267191 containerd[1543]: time="2025-01-30T13:09:15.267176478Z" level=info msg="RemovePodSandbox for \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\"" Jan 30 13:09:15.267215 containerd[1543]: time="2025-01-30T13:09:15.267191502Z" level=info msg="Forcibly stopping sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\"" Jan 30 13:09:15.267254 containerd[1543]: time="2025-01-30T13:09:15.267229135Z" level=info msg="TearDown network for sandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\" successfully" Jan 30 13:09:15.268506 containerd[1543]: time="2025-01-30T13:09:15.268489881Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.268550 containerd[1543]: time="2025-01-30T13:09:15.268517332Z" level=info msg="RemovePodSandbox \"3de6a98acadbb9eb5116ed3b09ce1f103acab92e4f4e851e0f4e2fa45d375ebb\" returns successfully" Jan 30 13:09:15.268762 containerd[1543]: time="2025-01-30T13:09:15.268748159Z" level=info msg="StopPodSandbox for \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\"" Jan 30 13:09:15.268801 containerd[1543]: time="2025-01-30T13:09:15.268791377Z" level=info msg="TearDown network for sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\" successfully" Jan 30 13:09:15.268801 containerd[1543]: time="2025-01-30T13:09:15.268799287Z" level=info msg="StopPodSandbox for \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\" returns successfully" Jan 30 13:09:15.269706 containerd[1543]: time="2025-01-30T13:09:15.269105301Z" level=info msg="RemovePodSandbox for \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\"" Jan 30 13:09:15.269706 containerd[1543]: time="2025-01-30T13:09:15.269120398Z" level=info msg="Forcibly stopping sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\"" Jan 30 13:09:15.269706 containerd[1543]: time="2025-01-30T13:09:15.269151441Z" level=info msg="TearDown network for sandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\" successfully" Jan 30 13:09:15.271141 containerd[1543]: time="2025-01-30T13:09:15.270344277Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.271141 containerd[1543]: time="2025-01-30T13:09:15.270370130Z" level=info msg="RemovePodSandbox \"f67e71e2184831c39c74b68e58b3020ff9f93ba040a5fdeba3878632806201d9\" returns successfully" Jan 30 13:09:15.271141 containerd[1543]: time="2025-01-30T13:09:15.270551571Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\"" Jan 30 13:09:15.271141 containerd[1543]: time="2025-01-30T13:09:15.270597664Z" level=info msg="TearDown network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" successfully" Jan 30 13:09:15.271141 containerd[1543]: time="2025-01-30T13:09:15.270603639Z" level=info msg="StopPodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" returns successfully" Jan 30 13:09:15.271141 containerd[1543]: time="2025-01-30T13:09:15.270735939Z" level=info msg="RemovePodSandbox for \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\"" Jan 30 13:09:15.271141 containerd[1543]: time="2025-01-30T13:09:15.270747616Z" level=info msg="Forcibly stopping sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\"" Jan 30 13:09:15.271141 containerd[1543]: time="2025-01-30T13:09:15.270775983Z" level=info msg="TearDown network for sandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" successfully" Jan 30 13:09:15.273169 containerd[1543]: time="2025-01-30T13:09:15.272206301Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.273169 containerd[1543]: time="2025-01-30T13:09:15.272231550Z" level=info msg="RemovePodSandbox \"0de32eb5c74bd35d347d879d2fee8cabcf6cd6bb7f4acf88c67befbfa3cc1248\" returns successfully" Jan 30 13:09:15.273169 containerd[1543]: time="2025-01-30T13:09:15.272457780Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\"" Jan 30 13:09:15.273169 containerd[1543]: time="2025-01-30T13:09:15.272505732Z" level=info msg="TearDown network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" successfully" Jan 30 13:09:15.273169 containerd[1543]: time="2025-01-30T13:09:15.272511876Z" level=info msg="StopPodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" returns successfully" Jan 30 13:09:15.273169 containerd[1543]: time="2025-01-30T13:09:15.272651957Z" level=info msg="RemovePodSandbox for \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\"" Jan 30 13:09:15.273169 containerd[1543]: time="2025-01-30T13:09:15.272670113Z" level=info msg="Forcibly stopping sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\"" Jan 30 13:09:15.273169 containerd[1543]: time="2025-01-30T13:09:15.272712430Z" level=info msg="TearDown network for sandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" successfully" Jan 30 13:09:15.273882 containerd[1543]: time="2025-01-30T13:09:15.273864757Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.273909 containerd[1543]: time="2025-01-30T13:09:15.273892298Z" level=info msg="RemovePodSandbox \"7efe126f02598a968e9c68d2a819250ab8e1abc27f231db7c16fa90e71cafbd7\" returns successfully" Jan 30 13:09:15.274079 containerd[1543]: time="2025-01-30T13:09:15.274067470Z" level=info msg="StopPodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\"" Jan 30 13:09:15.274297 containerd[1543]: time="2025-01-30T13:09:15.274200678Z" level=info msg="TearDown network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" successfully" Jan 30 13:09:15.274297 containerd[1543]: time="2025-01-30T13:09:15.274209762Z" level=info msg="StopPodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" returns successfully" Jan 30 13:09:15.274352 containerd[1543]: time="2025-01-30T13:09:15.274331252Z" level=info msg="RemovePodSandbox for \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\"" Jan 30 13:09:15.274352 containerd[1543]: time="2025-01-30T13:09:15.274348971Z" level=info msg="Forcibly stopping sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\"" Jan 30 13:09:15.275272 containerd[1543]: time="2025-01-30T13:09:15.274380488Z" level=info msg="TearDown network for sandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" successfully" Jan 30 13:09:15.275640 containerd[1543]: time="2025-01-30T13:09:15.275620494Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.275671 containerd[1543]: time="2025-01-30T13:09:15.275665086Z" level=info msg="RemovePodSandbox \"d6ea206a4e727dc8f713b11fe5a5f8cb988d729627ccd2d3b7068663c536a61a\" returns successfully" Jan 30 13:09:15.275920 containerd[1543]: time="2025-01-30T13:09:15.275907407Z" level=info msg="StopPodSandbox for \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\"" Jan 30 13:09:15.276110 containerd[1543]: time="2025-01-30T13:09:15.276052280Z" level=info msg="TearDown network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\" successfully" Jan 30 13:09:15.276110 containerd[1543]: time="2025-01-30T13:09:15.276061538Z" level=info msg="StopPodSandbox for \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\" returns successfully" Jan 30 13:09:15.276282 containerd[1543]: time="2025-01-30T13:09:15.276269526Z" level=info msg="RemovePodSandbox for \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\"" Jan 30 13:09:15.276313 containerd[1543]: time="2025-01-30T13:09:15.276283015Z" level=info msg="Forcibly stopping sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\"" Jan 30 13:09:15.276688 containerd[1543]: time="2025-01-30T13:09:15.276336829Z" level=info msg="TearDown network for sandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\" successfully" Jan 30 13:09:15.277556 containerd[1543]: time="2025-01-30T13:09:15.277540343Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.277600 containerd[1543]: time="2025-01-30T13:09:15.277568691Z" level=info msg="RemovePodSandbox \"f78ffea051d1cda3b9fc91536fad96b45b3b4133686447e4bb2dc92ffe1afd3a\" returns successfully" Jan 30 13:09:15.277781 containerd[1543]: time="2025-01-30T13:09:15.277768010Z" level=info msg="StopPodSandbox for \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\"" Jan 30 13:09:15.277826 containerd[1543]: time="2025-01-30T13:09:15.277813662Z" level=info msg="TearDown network for sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\" successfully" Jan 30 13:09:15.277826 containerd[1543]: time="2025-01-30T13:09:15.277822792Z" level=info msg="StopPodSandbox for \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\" returns successfully" Jan 30 13:09:15.278416 containerd[1543]: time="2025-01-30T13:09:15.278000464Z" level=info msg="RemovePodSandbox for \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\"" Jan 30 13:09:15.278416 containerd[1543]: time="2025-01-30T13:09:15.278014220Z" level=info msg="Forcibly stopping sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\"" Jan 30 13:09:15.278416 containerd[1543]: time="2025-01-30T13:09:15.278047671Z" level=info msg="TearDown network for sandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\" successfully" Jan 30 13:09:15.279399 containerd[1543]: time="2025-01-30T13:09:15.279385501Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.279499 containerd[1543]: time="2025-01-30T13:09:15.279463077Z" level=info msg="RemovePodSandbox \"e41645aa4a8e74370affa76b0dc96e14001e40d1f5bfd5d956f325a0ac17dee8\" returns successfully" Jan 30 13:09:15.279761 containerd[1543]: time="2025-01-30T13:09:15.279748291Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\"" Jan 30 13:09:15.279846 containerd[1543]: time="2025-01-30T13:09:15.279830191Z" level=info msg="TearDown network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" successfully" Jan 30 13:09:15.279846 containerd[1543]: time="2025-01-30T13:09:15.279841738Z" level=info msg="StopPodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" returns successfully" Jan 30 13:09:15.280804 containerd[1543]: time="2025-01-30T13:09:15.279978024Z" level=info msg="RemovePodSandbox for \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\"" Jan 30 13:09:15.280804 containerd[1543]: time="2025-01-30T13:09:15.279991006Z" level=info msg="Forcibly stopping sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\"" Jan 30 13:09:15.280804 containerd[1543]: time="2025-01-30T13:09:15.280023478Z" level=info msg="TearDown network for sandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" successfully" Jan 30 13:09:15.281297 containerd[1543]: time="2025-01-30T13:09:15.281283898Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.281358 containerd[1543]: time="2025-01-30T13:09:15.281348429Z" level=info msg="RemovePodSandbox \"538a3e7cf9e497d8053d323fd3e631f69c63ee435a60d88a9fe37a199a128ddd\" returns successfully" Jan 30 13:09:15.281534 containerd[1543]: time="2025-01-30T13:09:15.281518556Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\"" Jan 30 13:09:15.281605 containerd[1543]: time="2025-01-30T13:09:15.281590246Z" level=info msg="TearDown network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" successfully" Jan 30 13:09:15.281632 containerd[1543]: time="2025-01-30T13:09:15.281602845Z" level=info msg="StopPodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" returns successfully" Jan 30 13:09:15.281840 containerd[1543]: time="2025-01-30T13:09:15.281824273Z" level=info msg="RemovePodSandbox for \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\"" Jan 30 13:09:15.281875 containerd[1543]: time="2025-01-30T13:09:15.281843643Z" level=info msg="Forcibly stopping sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\"" Jan 30 13:09:15.281901 containerd[1543]: time="2025-01-30T13:09:15.281882835Z" level=info msg="TearDown network for sandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" successfully" Jan 30 13:09:15.283066 containerd[1543]: time="2025-01-30T13:09:15.283049181Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.283147 containerd[1543]: time="2025-01-30T13:09:15.283074453Z" level=info msg="RemovePodSandbox \"ffe0dc5e7e1bd9e9f539192ccf3c296216a26eb9ab62c11c9f77d996c6586cd8\" returns successfully" Jan 30 13:09:15.283237 containerd[1543]: time="2025-01-30T13:09:15.283221916Z" level=info msg="StopPodSandbox for \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\"" Jan 30 13:09:15.283270 containerd[1543]: time="2025-01-30T13:09:15.283266008Z" level=info msg="TearDown network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" successfully" Jan 30 13:09:15.283374 containerd[1543]: time="2025-01-30T13:09:15.283271941Z" level=info msg="StopPodSandbox for \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" returns successfully" Jan 30 13:09:15.284225 containerd[1543]: time="2025-01-30T13:09:15.283400552Z" level=info msg="RemovePodSandbox for \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\"" Jan 30 13:09:15.284225 containerd[1543]: time="2025-01-30T13:09:15.283410776Z" level=info msg="Forcibly stopping sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\"" Jan 30 13:09:15.284225 containerd[1543]: time="2025-01-30T13:09:15.283439940Z" level=info msg="TearDown network for sandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" successfully" Jan 30 13:09:15.286070 containerd[1543]: time="2025-01-30T13:09:15.284602687Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.286070 containerd[1543]: time="2025-01-30T13:09:15.284630319Z" level=info msg="RemovePodSandbox \"e7b4fd68234b880175acc3e7f88e27142bbe1afeb1ce25737d8d38e746f2c0e7\" returns successfully" Jan 30 13:09:15.286070 containerd[1543]: time="2025-01-30T13:09:15.284924097Z" level=info msg="StopPodSandbox for \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\"" Jan 30 13:09:15.286070 containerd[1543]: time="2025-01-30T13:09:15.284968053Z" level=info msg="TearDown network for sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\" successfully" Jan 30 13:09:15.286070 containerd[1543]: time="2025-01-30T13:09:15.284975158Z" level=info msg="StopPodSandbox for \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\" returns successfully" Jan 30 13:09:15.286070 containerd[1543]: time="2025-01-30T13:09:15.285129379Z" level=info msg="RemovePodSandbox for \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\"" Jan 30 13:09:15.286070 containerd[1543]: time="2025-01-30T13:09:15.285140966Z" level=info msg="Forcibly stopping sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\"" Jan 30 13:09:15.286070 containerd[1543]: time="2025-01-30T13:09:15.285172078Z" level=info msg="TearDown network for sandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\" successfully" Jan 30 13:09:15.286624 containerd[1543]: time="2025-01-30T13:09:15.286609082Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 30 13:09:15.286706 containerd[1543]: time="2025-01-30T13:09:15.286686872Z" level=info msg="RemovePodSandbox \"48f86e6846c8c469ea51c67ef6e3c4c1364e3df599975c27b2dd6e562eedb720\" returns successfully"
Jan 30 13:09:15.286995 containerd[1543]: time="2025-01-30T13:09:15.286981166Z" level=info msg="StopPodSandbox for \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\""
Jan 30 13:09:15.287034 containerd[1543]: time="2025-01-30T13:09:15.287026867Z" level=info msg="TearDown network for sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\" successfully"
Jan 30 13:09:15.287034 containerd[1543]: time="2025-01-30T13:09:15.287033144Z" level=info msg="StopPodSandbox for \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\" returns successfully"
Jan 30 13:09:15.287194 containerd[1543]: time="2025-01-30T13:09:15.287170660Z" level=info msg="RemovePodSandbox for \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\""
Jan 30 13:09:15.287194 containerd[1543]: time="2025-01-30T13:09:15.287183552Z" level=info msg="Forcibly stopping sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\""
Jan 30 13:09:15.287237 containerd[1543]: time="2025-01-30T13:09:15.287213601Z" level=info msg="TearDown network for sandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\" successfully"
Jan 30 13:09:15.288391 containerd[1543]: time="2025-01-30T13:09:15.288373407Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.288431 containerd[1543]: time="2025-01-30T13:09:15.288403705Z" level=info msg="RemovePodSandbox \"c6313b09df8f3c7aa4c3dfc99101fedde9b28af60b9a3b0c8078ba61695190f6\" returns successfully"
Jan 30 13:09:15.288772 containerd[1543]: time="2025-01-30T13:09:15.288594355Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\""
Jan 30 13:09:15.288772 containerd[1543]: time="2025-01-30T13:09:15.288639022Z" level=info msg="TearDown network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" successfully"
Jan 30 13:09:15.288772 containerd[1543]: time="2025-01-30T13:09:15.288645609Z" level=info msg="StopPodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" returns successfully"
Jan 30 13:09:15.288846 containerd[1543]: time="2025-01-30T13:09:15.288818636Z" level=info msg="RemovePodSandbox for \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\""
Jan 30 13:09:15.288846 containerd[1543]: time="2025-01-30T13:09:15.288830303Z" level=info msg="Forcibly stopping sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\""
Jan 30 13:09:15.288887 containerd[1543]: time="2025-01-30T13:09:15.288866922Z" level=info msg="TearDown network for sandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" successfully"
Jan 30 13:09:15.290354 containerd[1543]: time="2025-01-30T13:09:15.290335105Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.290401 containerd[1543]: time="2025-01-30T13:09:15.290383377Z" level=info msg="RemovePodSandbox \"c2af00cb6dd6b32394c6a10f7ffbc78ee9dd9eab6183ff211907e23c6c596842\" returns successfully"
Jan 30 13:09:15.290647 containerd[1543]: time="2025-01-30T13:09:15.290632632Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\""
Jan 30 13:09:15.290705 containerd[1543]: time="2025-01-30T13:09:15.290692388Z" level=info msg="TearDown network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" successfully"
Jan 30 13:09:15.290705 containerd[1543]: time="2025-01-30T13:09:15.290702876Z" level=info msg="StopPodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" returns successfully"
Jan 30 13:09:15.290863 containerd[1543]: time="2025-01-30T13:09:15.290847816Z" level=info msg="RemovePodSandbox for \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\""
Jan 30 13:09:15.290863 containerd[1543]: time="2025-01-30T13:09:15.290861911Z" level=info msg="Forcibly stopping sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\""
Jan 30 13:09:15.290955 containerd[1543]: time="2025-01-30T13:09:15.290921628Z" level=info msg="TearDown network for sandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" successfully"
Jan 30 13:09:15.292189 containerd[1543]: time="2025-01-30T13:09:15.292170669Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.292222 containerd[1543]: time="2025-01-30T13:09:15.292216456Z" level=info msg="RemovePodSandbox \"01115885f95c84d7ddb76e23f67bfe6ad7e44d7083f346ac8d11d793062495cd\" returns successfully"
Jan 30 13:09:15.292431 containerd[1543]: time="2025-01-30T13:09:15.292415806Z" level=info msg="StopPodSandbox for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\""
Jan 30 13:09:15.292484 containerd[1543]: time="2025-01-30T13:09:15.292470407Z" level=info msg="TearDown network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" successfully"
Jan 30 13:09:15.292484 containerd[1543]: time="2025-01-30T13:09:15.292480556Z" level=info msg="StopPodSandbox for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" returns successfully"
Jan 30 13:09:15.293569 containerd[1543]: time="2025-01-30T13:09:15.292695432Z" level=info msg="RemovePodSandbox for \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\""
Jan 30 13:09:15.293569 containerd[1543]: time="2025-01-30T13:09:15.292709093Z" level=info msg="Forcibly stopping sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\""
Jan 30 13:09:15.293569 containerd[1543]: time="2025-01-30T13:09:15.292745701Z" level=info msg="TearDown network for sandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" successfully"
Jan 30 13:09:15.294023 containerd[1543]: time="2025-01-30T13:09:15.294010590Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.294093 containerd[1543]: time="2025-01-30T13:09:15.294084785Z" level=info msg="RemovePodSandbox \"de7b5b5bd7bb8ed012b5355dcc31073c3b03f3b6346e2bb2eddb9d8e303b7bef\" returns successfully"
Jan 30 13:09:15.294514 containerd[1543]: time="2025-01-30T13:09:15.294499146Z" level=info msg="StopPodSandbox for \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\""
Jan 30 13:09:15.294567 containerd[1543]: time="2025-01-30T13:09:15.294546477Z" level=info msg="TearDown network for sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\" successfully"
Jan 30 13:09:15.294567 containerd[1543]: time="2025-01-30T13:09:15.294556877Z" level=info msg="StopPodSandbox for \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\" returns successfully"
Jan 30 13:09:15.294730 containerd[1543]: time="2025-01-30T13:09:15.294672984Z" level=info msg="RemovePodSandbox for \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\""
Jan 30 13:09:15.294730 containerd[1543]: time="2025-01-30T13:09:15.294694614Z" level=info msg="Forcibly stopping sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\""
Jan 30 13:09:15.294777 containerd[1543]: time="2025-01-30T13:09:15.294746136Z" level=info msg="TearDown network for sandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\" successfully"
Jan 30 13:09:15.295960 containerd[1543]: time="2025-01-30T13:09:15.295942530Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.295997 containerd[1543]: time="2025-01-30T13:09:15.295971919Z" level=info msg="RemovePodSandbox \"87973172597a6f962805cca2b496a4c1d4aacff8c570cc4f751a8e7c63103114\" returns successfully"
Jan 30 13:09:15.296235 containerd[1543]: time="2025-01-30T13:09:15.296139044Z" level=info msg="StopPodSandbox for \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\""
Jan 30 13:09:15.296235 containerd[1543]: time="2025-01-30T13:09:15.296183022Z" level=info msg="TearDown network for sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\" successfully"
Jan 30 13:09:15.296235 containerd[1543]: time="2025-01-30T13:09:15.296189428Z" level=info msg="StopPodSandbox for \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\" returns successfully"
Jan 30 13:09:15.296411 containerd[1543]: time="2025-01-30T13:09:15.296396985Z" level=info msg="RemovePodSandbox for \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\""
Jan 30 13:09:15.296411 containerd[1543]: time="2025-01-30T13:09:15.296411012Z" level=info msg="Forcibly stopping sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\""
Jan 30 13:09:15.297295 containerd[1543]: time="2025-01-30T13:09:15.296453017Z" level=info msg="TearDown network for sandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\" successfully"
Jan 30 13:09:15.304060 containerd[1543]: time="2025-01-30T13:09:15.303945721Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.304060 containerd[1543]: time="2025-01-30T13:09:15.303997133Z" level=info msg="RemovePodSandbox \"0c2036e27ca522477aee063d5e72c5c34055220c53030dac972154f54e58aa83\" returns successfully"
Jan 30 13:09:15.304703 containerd[1543]: time="2025-01-30T13:09:15.304475244Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\""
Jan 30 13:09:15.304703 containerd[1543]: time="2025-01-30T13:09:15.304531846Z" level=info msg="TearDown network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" successfully"
Jan 30 13:09:15.304703 containerd[1543]: time="2025-01-30T13:09:15.304558872Z" level=info msg="StopPodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" returns successfully"
Jan 30 13:09:15.314887 containerd[1543]: time="2025-01-30T13:09:15.314864850Z" level=info msg="RemovePodSandbox for \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\""
Jan 30 13:09:15.315699 containerd[1543]: time="2025-01-30T13:09:15.315005557Z" level=info msg="Forcibly stopping sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\""
Jan 30 13:09:15.315699 containerd[1543]: time="2025-01-30T13:09:15.315066110Z" level=info msg="TearDown network for sandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" successfully"
Jan 30 13:09:15.316437 containerd[1543]: time="2025-01-30T13:09:15.316422267Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.316922 containerd[1543]: time="2025-01-30T13:09:15.316632508Z" level=info msg="RemovePodSandbox \"924f7a77da8614de990eee427771f2eb8ca93fc23daac16eeaa5221917ce27b2\" returns successfully"
Jan 30 13:09:15.316922 containerd[1543]: time="2025-01-30T13:09:15.316903880Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\""
Jan 30 13:09:15.316974 containerd[1543]: time="2025-01-30T13:09:15.316966018Z" level=info msg="TearDown network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" successfully"
Jan 30 13:09:15.316974 containerd[1543]: time="2025-01-30T13:09:15.316973300Z" level=info msg="StopPodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" returns successfully"
Jan 30 13:09:15.317196 containerd[1543]: time="2025-01-30T13:09:15.317152362Z" level=info msg="RemovePodSandbox for \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\""
Jan 30 13:09:15.317196 containerd[1543]: time="2025-01-30T13:09:15.317164735Z" level=info msg="Forcibly stopping sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\""
Jan 30 13:09:15.317232 containerd[1543]: time="2025-01-30T13:09:15.317206902Z" level=info msg="TearDown network for sandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" successfully"
Jan 30 13:09:15.318444 containerd[1543]: time="2025-01-30T13:09:15.318421014Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.318575 containerd[1543]: time="2025-01-30T13:09:15.318457268Z" level=info msg="RemovePodSandbox \"f4b6142bd159fa0ac13ee9130a99fde6c3d646285f12303bbf809e210be7387f\" returns successfully"
Jan 30 13:09:15.318761 containerd[1543]: time="2025-01-30T13:09:15.318657185Z" level=info msg="StopPodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\""
Jan 30 13:09:15.318761 containerd[1543]: time="2025-01-30T13:09:15.318717406Z" level=info msg="TearDown network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" successfully"
Jan 30 13:09:15.318761 containerd[1543]: time="2025-01-30T13:09:15.318724150Z" level=info msg="StopPodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" returns successfully"
Jan 30 13:09:15.318977 containerd[1543]: time="2025-01-30T13:09:15.318945255Z" level=info msg="RemovePodSandbox for \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\""
Jan 30 13:09:15.318977 containerd[1543]: time="2025-01-30T13:09:15.318957481Z" level=info msg="Forcibly stopping sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\""
Jan 30 13:09:15.319455 containerd[1543]: time="2025-01-30T13:09:15.319105521Z" level=info msg="TearDown network for sandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" successfully"
Jan 30 13:09:15.320507 containerd[1543]: time="2025-01-30T13:09:15.320444544Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.320507 containerd[1543]: time="2025-01-30T13:09:15.320478864Z" level=info msg="RemovePodSandbox \"698a0b4863ee64cee6a5076c93c8399519960923ad45c7281d60861f3d108b0c\" returns successfully"
Jan 30 13:09:15.320650 containerd[1543]: time="2025-01-30T13:09:15.320635971Z" level=info msg="StopPodSandbox for \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\""
Jan 30 13:09:15.320709 containerd[1543]: time="2025-01-30T13:09:15.320696384Z" level=info msg="TearDown network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\" successfully"
Jan 30 13:09:15.320709 containerd[1543]: time="2025-01-30T13:09:15.320706103Z" level=info msg="StopPodSandbox for \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\" returns successfully"
Jan 30 13:09:15.320850 containerd[1543]: time="2025-01-30T13:09:15.320834094Z" level=info msg="RemovePodSandbox for \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\""
Jan 30 13:09:15.321559 containerd[1543]: time="2025-01-30T13:09:15.320848945Z" level=info msg="Forcibly stopping sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\""
Jan 30 13:09:15.321559 containerd[1543]: time="2025-01-30T13:09:15.320883697Z" level=info msg="TearDown network for sandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\" successfully"
Jan 30 13:09:15.321993 containerd[1543]: time="2025-01-30T13:09:15.321976071Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.322034 containerd[1543]: time="2025-01-30T13:09:15.322007521Z" level=info msg="RemovePodSandbox \"5cda0eeaf91c3b98f7991450163d0cbb992564c6e1c30a39c5b297ab1588ea34\" returns successfully"
Jan 30 13:09:15.322610 containerd[1543]: time="2025-01-30T13:09:15.322427658Z" level=info msg="StopPodSandbox for \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\""
Jan 30 13:09:15.322610 containerd[1543]: time="2025-01-30T13:09:15.322478401Z" level=info msg="TearDown network for sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\" successfully"
Jan 30 13:09:15.322610 containerd[1543]: time="2025-01-30T13:09:15.322485462Z" level=info msg="StopPodSandbox for \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\" returns successfully"
Jan 30 13:09:15.322702 containerd[1543]: time="2025-01-30T13:09:15.322622241Z" level=info msg="RemovePodSandbox for \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\""
Jan 30 13:09:15.322702 containerd[1543]: time="2025-01-30T13:09:15.322633159Z" level=info msg="Forcibly stopping sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\""
Jan 30 13:09:15.322702 containerd[1543]: time="2025-01-30T13:09:15.322665019Z" level=info msg="TearDown network for sandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\" successfully"
Jan 30 13:09:15.323882 containerd[1543]: time="2025-01-30T13:09:15.323866347Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 30 13:09:15.323920 containerd[1543]: time="2025-01-30T13:09:15.323895113Z" level=info msg="RemovePodSandbox \"3d7e9939e6960f0e1225ab30ddbf03fdcf27e1d7bb8fe0e431b277b00cf207a3\" returns successfully"
Jan 30 13:09:37.364095 kubelet[2795]: I0130 13:09:37.356085 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 13:09:41.559053 systemd[1]: Started sshd@7-139.178.70.106:22-139.178.89.65:34036.service - OpenSSH per-connection server daemon (139.178.89.65:34036).
Jan 30 13:09:41.667598 sshd[5786]: Accepted publickey for core from 139.178.89.65 port 34036 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:09:41.669574 sshd-session[5786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:09:41.677151 systemd-logind[1521]: New session 10 of user core.
Jan 30 13:09:41.682810 systemd[1]: Started session-10.scope - Session 10 of User core.
Jan 30 13:09:42.231185 sshd[5797]: Connection closed by 139.178.89.65 port 34036
Jan 30 13:09:42.231714 sshd-session[5786]: pam_unix(sshd:session): session closed for user core
Jan 30 13:09:42.233605 systemd-logind[1521]: Session 10 logged out. Waiting for processes to exit.
Jan 30 13:09:42.234031 systemd[1]: sshd@7-139.178.70.106:22-139.178.89.65:34036.service: Deactivated successfully.
Jan 30 13:09:42.235599 systemd[1]: session-10.scope: Deactivated successfully.
Jan 30 13:09:42.236643 systemd-logind[1521]: Removed session 10.
Jan 30 13:09:47.239710 systemd[1]: Started sshd@8-139.178.70.106:22-139.178.89.65:34052.service - OpenSSH per-connection server daemon (139.178.89.65:34052).
Jan 30 13:09:47.291516 sshd[5811]: Accepted publickey for core from 139.178.89.65 port 34052 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:09:47.292435 sshd-session[5811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:09:47.295961 systemd-logind[1521]: New session 11 of user core.
Jan 30 13:09:47.300797 systemd[1]: Started session-11.scope - Session 11 of User core.
Jan 30 13:09:47.405800 sshd[5813]: Connection closed by 139.178.89.65 port 34052
Jan 30 13:09:47.405405 sshd-session[5811]: pam_unix(sshd:session): session closed for user core
Jan 30 13:09:47.407159 systemd-logind[1521]: Session 11 logged out. Waiting for processes to exit.
Jan 30 13:09:47.407313 systemd[1]: sshd@8-139.178.70.106:22-139.178.89.65:34052.service: Deactivated successfully.
Jan 30 13:09:47.408410 systemd[1]: session-11.scope: Deactivated successfully.
Jan 30 13:09:47.409340 systemd-logind[1521]: Removed session 11.
Jan 30 13:09:52.414232 systemd[1]: Started sshd@9-139.178.70.106:22-139.178.89.65:45258.service - OpenSSH per-connection server daemon (139.178.89.65:45258).
Jan 30 13:09:52.683927 sshd[5827]: Accepted publickey for core from 139.178.89.65 port 45258 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:09:52.684708 sshd-session[5827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:09:52.688076 systemd-logind[1521]: New session 12 of user core.
Jan 30 13:09:52.690762 systemd[1]: Started session-12.scope - Session 12 of User core.
Jan 30 13:09:53.028788 sshd[5829]: Connection closed by 139.178.89.65 port 45258
Jan 30 13:09:53.033101 sshd-session[5827]: pam_unix(sshd:session): session closed for user core
Jan 30 13:09:53.042539 systemd[1]: sshd@9-139.178.70.106:22-139.178.89.65:45258.service: Deactivated successfully.
Jan 30 13:09:53.044308 systemd[1]: session-12.scope: Deactivated successfully.
Jan 30 13:09:53.044935 systemd-logind[1521]: Session 12 logged out. Waiting for processes to exit.
Jan 30 13:09:53.045749 systemd-logind[1521]: Removed session 12.
Jan 30 13:09:58.042908 systemd[1]: Started sshd@10-139.178.70.106:22-139.178.89.65:45266.service - OpenSSH per-connection server daemon (139.178.89.65:45266).
Jan 30 13:09:58.264642 sshd[5883]: Accepted publickey for core from 139.178.89.65 port 45266 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:09:58.266446 sshd-session[5883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:09:58.270303 systemd-logind[1521]: New session 13 of user core.
Jan 30 13:09:58.274802 systemd[1]: Started session-13.scope - Session 13 of User core.
Jan 30 13:09:58.558332 sshd[5885]: Connection closed by 139.178.89.65 port 45266
Jan 30 13:09:58.565283 systemd[1]: sshd@10-139.178.70.106:22-139.178.89.65:45266.service: Deactivated successfully.
Jan 30 13:09:58.558589 sshd-session[5883]: pam_unix(sshd:session): session closed for user core
Jan 30 13:09:58.566453 systemd[1]: session-13.scope: Deactivated successfully.
Jan 30 13:09:58.567343 systemd-logind[1521]: Session 13 logged out. Waiting for processes to exit.
Jan 30 13:09:58.568622 systemd[1]: Started sshd@11-139.178.70.106:22-139.178.89.65:45268.service - OpenSSH per-connection server daemon (139.178.89.65:45268).
Jan 30 13:09:58.569494 systemd-logind[1521]: Removed session 13.
Jan 30 13:09:58.649020 sshd[5896]: Accepted publickey for core from 139.178.89.65 port 45268 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:09:58.649879 sshd-session[5896]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:09:58.653988 systemd-logind[1521]: New session 14 of user core.
Jan 30 13:09:58.657879 systemd[1]: Started session-14.scope - Session 14 of User core.
Jan 30 13:09:58.938013 sshd[5898]: Connection closed by 139.178.89.65 port 45268
Jan 30 13:09:58.949638 sshd-session[5896]: pam_unix(sshd:session): session closed for user core
Jan 30 13:09:58.950038 systemd[1]: Started sshd@12-139.178.70.106:22-139.178.89.65:45270.service - OpenSSH per-connection server daemon (139.178.89.65:45270).
Jan 30 13:09:58.967490 systemd[1]: sshd@11-139.178.70.106:22-139.178.89.65:45268.service: Deactivated successfully.
Jan 30 13:09:58.968583 systemd[1]: session-14.scope: Deactivated successfully.
Jan 30 13:09:58.969426 systemd-logind[1521]: Session 14 logged out. Waiting for processes to exit.
Jan 30 13:09:58.970116 systemd-logind[1521]: Removed session 14.
Jan 30 13:09:59.351409 sshd[5904]: Accepted publickey for core from 139.178.89.65 port 45270 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:09:59.352351 sshd-session[5904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:09:59.355241 systemd-logind[1521]: New session 15 of user core.
Jan 30 13:09:59.364797 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 30 13:09:59.648229 sshd[5908]: Connection closed by 139.178.89.65 port 45270
Jan 30 13:09:59.648710 sshd-session[5904]: pam_unix(sshd:session): session closed for user core
Jan 30 13:09:59.650670 systemd[1]: sshd@12-139.178.70.106:22-139.178.89.65:45270.service: Deactivated successfully.
Jan 30 13:09:59.651853 systemd[1]: session-15.scope: Deactivated successfully.
Jan 30 13:09:59.652340 systemd-logind[1521]: Session 15 logged out. Waiting for processes to exit.
Jan 30 13:09:59.652885 systemd-logind[1521]: Removed session 15.
Jan 30 13:10:04.662902 systemd[1]: Started sshd@13-139.178.70.106:22-139.178.89.65:58456.service - OpenSSH per-connection server daemon (139.178.89.65:58456).
Jan 30 13:10:04.692840 sshd[5922]: Accepted publickey for core from 139.178.89.65 port 58456 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:04.694583 sshd-session[5922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:04.697893 systemd-logind[1521]: New session 16 of user core.
Jan 30 13:10:04.703857 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 30 13:10:04.796755 sshd[5924]: Connection closed by 139.178.89.65 port 58456
Jan 30 13:10:04.797162 sshd-session[5922]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:04.798853 systemd-logind[1521]: Session 16 logged out. Waiting for processes to exit.
Jan 30 13:10:04.799049 systemd[1]: sshd@13-139.178.70.106:22-139.178.89.65:58456.service: Deactivated successfully.
Jan 30 13:10:04.800582 systemd[1]: session-16.scope: Deactivated successfully.
Jan 30 13:10:04.802852 systemd-logind[1521]: Removed session 16.
Jan 30 13:10:09.805878 systemd[1]: Started sshd@14-139.178.70.106:22-139.178.89.65:58460.service - OpenSSH per-connection server daemon (139.178.89.65:58460).
Jan 30 13:10:09.867485 sshd[5937]: Accepted publickey for core from 139.178.89.65 port 58460 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:09.868354 sshd-session[5937]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:09.870977 systemd-logind[1521]: New session 17 of user core.
Jan 30 13:10:09.879780 systemd[1]: Started session-17.scope - Session 17 of User core.
Jan 30 13:10:10.093364 sshd[5939]: Connection closed by 139.178.89.65 port 58460
Jan 30 13:10:10.102568 systemd[1]: Started sshd@15-139.178.70.106:22-139.178.89.65:58476.service - OpenSSH per-connection server daemon (139.178.89.65:58476).
Jan 30 13:10:10.114112 sshd-session[5937]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:10.116612 systemd[1]: sshd@14-139.178.70.106:22-139.178.89.65:58460.service: Deactivated successfully.
Jan 30 13:10:10.118143 systemd[1]: session-17.scope: Deactivated successfully.
Jan 30 13:10:10.118769 systemd-logind[1521]: Session 17 logged out. Waiting for processes to exit.
Jan 30 13:10:10.119456 systemd-logind[1521]: Removed session 17.
Jan 30 13:10:10.220306 sshd[5947]: Accepted publickey for core from 139.178.89.65 port 58476 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:10.221261 sshd-session[5947]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:10.225190 systemd-logind[1521]: New session 18 of user core.
Jan 30 13:10:10.230819 systemd[1]: Started session-18.scope - Session 18 of User core.
Jan 30 13:10:10.602358 sshd[5951]: Connection closed by 139.178.89.65 port 58476
Jan 30 13:10:10.605865 sshd-session[5947]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:10.615870 systemd[1]: Started sshd@16-139.178.70.106:22-139.178.89.65:58492.service - OpenSSH per-connection server daemon (139.178.89.65:58492).
Jan 30 13:10:10.616162 systemd[1]: sshd@15-139.178.70.106:22-139.178.89.65:58476.service: Deactivated successfully.
Jan 30 13:10:10.618002 systemd[1]: session-18.scope: Deactivated successfully.
Jan 30 13:10:10.619903 systemd-logind[1521]: Session 18 logged out. Waiting for processes to exit.
Jan 30 13:10:10.621133 systemd-logind[1521]: Removed session 18.
Jan 30 13:10:10.677251 sshd[5958]: Accepted publickey for core from 139.178.89.65 port 58492 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:10.678384 sshd-session[5958]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:10.681147 systemd-logind[1521]: New session 19 of user core.
Jan 30 13:10:10.692611 systemd[1]: Started session-19.scope - Session 19 of User core.
Jan 30 13:10:12.995521 sshd[5962]: Connection closed by 139.178.89.65 port 58492
Jan 30 13:10:13.005006 sshd-session[5958]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:13.012259 systemd[1]: Started sshd@17-139.178.70.106:22-139.178.89.65:46354.service - OpenSSH per-connection server daemon (139.178.89.65:46354).
Jan 30 13:10:13.025734 systemd-logind[1521]: Session 19 logged out. Waiting for processes to exit.
Jan 30 13:10:13.026746 systemd[1]: sshd@16-139.178.70.106:22-139.178.89.65:58492.service: Deactivated successfully.
Jan 30 13:10:13.027980 systemd[1]: session-19.scope: Deactivated successfully.
Jan 30 13:10:13.029961 systemd-logind[1521]: Removed session 19.
Jan 30 13:10:13.183151 sshd[5981]: Accepted publickey for core from 139.178.89.65 port 46354 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:13.184098 sshd-session[5981]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:13.186651 systemd-logind[1521]: New session 20 of user core.
Jan 30 13:10:13.193762 systemd[1]: Started session-20.scope - Session 20 of User core.
Jan 30 13:10:14.584070 sshd[5988]: Connection closed by 139.178.89.65 port 46354
Jan 30 13:10:14.593203 sshd-session[5981]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:14.601328 systemd[1]: Started sshd@18-139.178.70.106:22-139.178.89.65:46358.service - OpenSSH per-connection server daemon (139.178.89.65:46358).
Jan 30 13:10:14.604395 systemd[1]: sshd@17-139.178.70.106:22-139.178.89.65:46354.service: Deactivated successfully.
Jan 30 13:10:14.606862 systemd[1]: session-20.scope: Deactivated successfully.
Jan 30 13:10:14.609021 systemd-logind[1521]: Session 20 logged out. Waiting for processes to exit.
Jan 30 13:10:14.611194 systemd-logind[1521]: Removed session 20.
Jan 30 13:10:14.957106 sshd[5995]: Accepted publickey for core from 139.178.89.65 port 46358 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:14.964384 sshd-session[5995]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:14.974847 systemd-logind[1521]: New session 21 of user core.
Jan 30 13:10:14.980037 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 30 13:10:15.107083 sshd[5999]: Connection closed by 139.178.89.65 port 46358
Jan 30 13:10:15.107432 sshd-session[5995]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:15.110114 systemd-logind[1521]: Session 21 logged out. Waiting for processes to exit.
Jan 30 13:10:15.110220 systemd[1]: sshd@18-139.178.70.106:22-139.178.89.65:46358.service: Deactivated successfully.
Jan 30 13:10:15.111303 systemd[1]: session-21.scope: Deactivated successfully.
Jan 30 13:10:15.112197 systemd-logind[1521]: Removed session 21.
Jan 30 13:10:20.116096 systemd[1]: Started sshd@19-139.178.70.106:22-139.178.89.65:46374.service - OpenSSH per-connection server daemon (139.178.89.65:46374).
Jan 30 13:10:20.575250 sshd[6017]: Accepted publickey for core from 139.178.89.65 port 46374 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:20.576648 sshd-session[6017]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:20.580971 systemd-logind[1521]: New session 22 of user core.
Jan 30 13:10:20.585827 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 30 13:10:20.894996 sshd[6019]: Connection closed by 139.178.89.65 port 46374
Jan 30 13:10:20.895531 sshd-session[6017]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:20.897585 systemd-logind[1521]: Session 22 logged out. Waiting for processes to exit.
Jan 30 13:10:20.897793 systemd[1]: sshd@19-139.178.70.106:22-139.178.89.65:46374.service: Deactivated successfully.
Jan 30 13:10:20.898977 systemd[1]: session-22.scope: Deactivated successfully.
Jan 30 13:10:20.899586 systemd-logind[1521]: Removed session 22.
Jan 30 13:10:25.912950 systemd[1]: Started sshd@20-139.178.70.106:22-139.178.89.65:40576.service - OpenSSH per-connection server daemon (139.178.89.65:40576).
Jan 30 13:10:26.075884 sshd[6043]: Accepted publickey for core from 139.178.89.65 port 40576 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:26.077593 sshd-session[6043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:26.081757 systemd-logind[1521]: New session 23 of user core.
Jan 30 13:10:26.092896 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 30 13:10:26.330382 sshd[6059]: Connection closed by 139.178.89.65 port 40576
Jan 30 13:10:26.329737 sshd-session[6043]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:26.332033 systemd[1]: sshd@20-139.178.70.106:22-139.178.89.65:40576.service: Deactivated successfully.
Jan 30 13:10:26.333476 systemd[1]: session-23.scope: Deactivated successfully.
Jan 30 13:10:26.334582 systemd-logind[1521]: Session 23 logged out. Waiting for processes to exit.
Jan 30 13:10:26.335465 systemd-logind[1521]: Removed session 23.
Jan 30 13:10:31.337852 systemd[1]: Started sshd@21-139.178.70.106:22-139.178.89.65:43104.service - OpenSSH per-connection server daemon (139.178.89.65:43104).
Jan 30 13:10:31.516789 sshd[6102]: Accepted publickey for core from 139.178.89.65 port 43104 ssh2: RSA SHA256:e2NaSLyu5IZKRPvehezdk/ivsn9B5mcszGAdtBKCAIk
Jan 30 13:10:31.518291 sshd-session[6102]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 30 13:10:31.521337 systemd-logind[1521]: New session 24 of user core.
Jan 30 13:10:31.528842 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 30 13:10:32.036504 sshd[6104]: Connection closed by 139.178.89.65 port 43104
Jan 30 13:10:32.037353 sshd-session[6102]: pam_unix(sshd:session): session closed for user core
Jan 30 13:10:32.039612 systemd[1]: sshd@21-139.178.70.106:22-139.178.89.65:43104.service: Deactivated successfully.
Jan 30 13:10:32.040907 systemd[1]: session-24.scope: Deactivated successfully.
Jan 30 13:10:32.041502 systemd-logind[1521]: Session 24 logged out. Waiting for processes to exit.
Jan 30 13:10:32.042211 systemd-logind[1521]: Removed session 24.