Oct 13 06:24:35.610340 kernel: Linux version 6.12.51-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Mon Oct 13 03:31:29 -00 2025 Oct 13 06:24:35.610376 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=4919840803704517a91afcb9d57d99e9935244ff049349c54216d9a31bc1da5d Oct 13 06:24:35.610402 kernel: BIOS-provided physical RAM map: Oct 13 06:24:35.610407 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable Oct 13 06:24:35.610414 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved Oct 13 06:24:35.610420 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved Oct 13 06:24:35.610426 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable Oct 13 06:24:35.610431 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved Oct 13 06:24:35.610436 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000082519fff] usable Oct 13 06:24:35.610441 kernel: BIOS-e820: [mem 0x000000008251a000-0x000000008251afff] ACPI NVS Oct 13 06:24:35.610446 kernel: BIOS-e820: [mem 0x000000008251b000-0x000000008251bfff] reserved Oct 13 06:24:35.610450 kernel: BIOS-e820: [mem 0x000000008251c000-0x000000008afccfff] usable Oct 13 06:24:35.610455 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved Oct 13 06:24:35.610459 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable Oct 13 06:24:35.610466 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS Oct 13 06:24:35.610471 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved Oct 13 06:24:35.610476 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable Oct 13 06:24:35.610481 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved Oct 13 06:24:35.610486 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Oct 13 06:24:35.610491 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved Oct 13 06:24:35.610496 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved Oct 13 06:24:35.610501 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Oct 13 06:24:35.610506 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved Oct 13 06:24:35.610512 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable Oct 13 06:24:35.610517 kernel: NX (Execute Disable) protection: active Oct 13 06:24:35.610522 kernel: APIC: Static calls initialized Oct 13 06:24:35.610527 kernel: SMBIOS 3.2.1 present. 
Oct 13 06:24:35.610533 kernel: DMI: Supermicro Super Server/X11SCM-F, BIOS 1.9 09/16/2022 Oct 13 06:24:35.610538 kernel: DMI: Memory slots populated: 1/4 Oct 13 06:24:35.610543 kernel: tsc: Detected 3400.000 MHz processor Oct 13 06:24:35.610548 kernel: tsc: Detected 3399.906 MHz TSC Oct 13 06:24:35.610553 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Oct 13 06:24:35.610559 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Oct 13 06:24:35.610565 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000 Oct 13 06:24:35.610571 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs Oct 13 06:24:35.610576 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Oct 13 06:24:35.610581 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000 Oct 13 06:24:35.610587 kernel: Using GB pages for direct mapping Oct 13 06:24:35.610592 kernel: ACPI: Early table checksum verification disabled Oct 13 06:24:35.610598 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM) Oct 13 06:24:35.610606 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013) Oct 13 06:24:35.610611 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013) Oct 13 06:24:35.610617 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527) Oct 13 06:24:35.610623 kernel: ACPI: FACS 0x000000008C66CF80 000040 Oct 13 06:24:35.610628 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013) Oct 13 06:24:35.610635 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013) Oct 13 06:24:35.610641 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013) Oct 13 06:24:35.610646 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097) Oct 13 06:24:35.610652 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 
00000000) Oct 13 06:24:35.610657 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527) Oct 13 06:24:35.610663 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527) Oct 13 06:24:35.610669 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527) Oct 13 06:24:35.610675 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:24:35.610681 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527) Oct 13 06:24:35.610686 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527) Oct 13 06:24:35.610692 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:24:35.610698 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:24:35.610703 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527) Oct 13 06:24:35.610709 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527) Oct 13 06:24:35.610716 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:24:35.610721 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:24:35.610727 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527) Oct 13 06:24:35.610732 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013) Oct 13 06:24:35.610738 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527) Oct 13 06:24:35.610744 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000) Oct 13 06:24:35.610750 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527) Oct 13 06:24:35.610756 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013) Oct 13 06:24:35.610762 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000) Oct 13 06:24:35.610767 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000) Oct 13 06:24:35.610773 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000) Oct 13 06:24:35.610778 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 
00000000) Oct 13 06:24:35.610784 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221) Oct 13 06:24:35.610791 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783] Oct 13 06:24:35.610796 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b] Oct 13 06:24:35.610802 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf] Oct 13 06:24:35.610810 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3] Oct 13 06:24:35.610816 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb] Oct 13 06:24:35.610822 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b] Oct 13 06:24:35.610827 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db] Oct 13 06:24:35.610833 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20] Oct 13 06:24:35.610839 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543] Oct 13 06:24:35.610845 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d] Oct 13 06:24:35.610850 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a] Oct 13 06:24:35.610856 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77] Oct 13 06:24:35.610861 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25] Oct 13 06:24:35.610867 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b] Oct 13 06:24:35.610872 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361] Oct 13 06:24:35.610879 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb] Oct 13 06:24:35.610885 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd] Oct 13 06:24:35.610890 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1] Oct 13 06:24:35.610895 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb] Oct 13 06:24:35.610901 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153] Oct 13 06:24:35.610906 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe] Oct 13 06:24:35.610912 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f] Oct 13 06:24:35.610917 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73] Oct 13 06:24:35.610924 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab] Oct 13 06:24:35.610929 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e] Oct 13 06:24:35.610935 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67] Oct 13 06:24:35.610940 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97] Oct 13 06:24:35.610946 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7] Oct 13 06:24:35.610951 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7] Oct 13 06:24:35.610957 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273] Oct 13 06:24:35.610962 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9] Oct 13 06:24:35.610969 kernel: No NUMA configuration found Oct 13 06:24:35.610974 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff] Oct 13 06:24:35.610980 kernel: NODE_DATA(0) allocated [mem 0x86eff8dc0-0x86effffff] Oct 13 06:24:35.610986 kernel: Zone ranges: Oct 13 06:24:35.610991 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Oct 13 06:24:35.610997 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Oct 13 
06:24:35.611002 kernel: Normal [mem 0x0000000100000000-0x000000086effffff] Oct 13 06:24:35.611009 kernel: Device empty Oct 13 06:24:35.611014 kernel: Movable zone start for each node Oct 13 06:24:35.611020 kernel: Early memory node ranges Oct 13 06:24:35.611026 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff] Oct 13 06:24:35.611031 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff] Oct 13 06:24:35.611037 kernel: node 0: [mem 0x0000000040400000-0x0000000082519fff] Oct 13 06:24:35.611042 kernel: node 0: [mem 0x000000008251c000-0x000000008afccfff] Oct 13 06:24:35.611048 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff] Oct 13 06:24:35.611057 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff] Oct 13 06:24:35.611063 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff] Oct 13 06:24:35.611070 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff] Oct 13 06:24:35.611076 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Oct 13 06:24:35.611082 kernel: On node 0, zone DMA: 103 pages in unavailable ranges Oct 13 06:24:35.611088 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Oct 13 06:24:35.611094 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges Oct 13 06:24:35.611100 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges Oct 13 06:24:35.611106 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges Oct 13 06:24:35.611112 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges Oct 13 06:24:35.611118 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges Oct 13 06:24:35.611124 kernel: ACPI: PM-Timer IO Port: 0x1808 Oct 13 06:24:35.611130 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Oct 13 06:24:35.611137 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Oct 13 06:24:35.611143 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Oct 13 06:24:35.611149 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Oct 13 06:24:35.611155 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Oct 13 06:24:35.611161 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Oct 13 06:24:35.611166 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Oct 13 06:24:35.611172 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Oct 13 06:24:35.611178 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Oct 13 06:24:35.611185 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Oct 13 06:24:35.611191 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Oct 13 06:24:35.611196 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Oct 13 06:24:35.611202 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Oct 13 06:24:35.611208 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Oct 13 06:24:35.611214 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Oct 13 06:24:35.611220 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Oct 13 06:24:35.611225 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119 Oct 13 06:24:35.611232 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Oct 13 06:24:35.611238 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Oct 13 06:24:35.611244 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Oct 13 06:24:35.611250 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Oct 13 06:24:35.611256 kernel: TSC deadline timer available Oct 13 06:24:35.611262 kernel: CPU topo: Max. 
logical packages: 1 Oct 13 06:24:35.611268 kernel: CPU topo: Max. logical dies: 1 Oct 13 06:24:35.611274 kernel: CPU topo: Max. dies per package: 1 Oct 13 06:24:35.611280 kernel: CPU topo: Max. threads per core: 2 Oct 13 06:24:35.611286 kernel: CPU topo: Num. cores per package: 8 Oct 13 06:24:35.611292 kernel: CPU topo: Num. threads per package: 16 Oct 13 06:24:35.611298 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs Oct 13 06:24:35.611304 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices Oct 13 06:24:35.611310 kernel: Booting paravirtualized kernel on bare hardware Oct 13 06:24:35.611316 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Oct 13 06:24:35.611323 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Oct 13 06:24:35.611329 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Oct 13 06:24:35.611335 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Oct 13 06:24:35.611341 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Oct 13 06:24:35.611347 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=4919840803704517a91afcb9d57d99e9935244ff049349c54216d9a31bc1da5d Oct 13 06:24:35.611354 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Oct 13 06:24:35.611360 kernel: random: crng init done Oct 13 06:24:35.611366 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear) Oct 13 06:24:35.611372 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Oct 13 06:24:35.611378 kernel: Fallback order for Node 0: 0 Oct 13 06:24:35.611384 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8363245 Oct 13 06:24:35.611390 kernel: Policy zone: Normal Oct 13 06:24:35.611396 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Oct 13 06:24:35.611403 kernel: software IO TLB: area num 16. Oct 13 06:24:35.611409 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Oct 13 06:24:35.611415 kernel: ftrace: allocating 40210 entries in 158 pages Oct 13 06:24:35.611421 kernel: ftrace: allocated 158 pages with 5 groups Oct 13 06:24:35.611426 kernel: Dynamic Preempt: voluntary Oct 13 06:24:35.611433 kernel: rcu: Preemptible hierarchical RCU implementation. Oct 13 06:24:35.611439 kernel: rcu: RCU event tracing is enabled. Oct 13 06:24:35.611446 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Oct 13 06:24:35.611452 kernel: Trampoline variant of Tasks RCU enabled. Oct 13 06:24:35.611458 kernel: Rude variant of Tasks RCU enabled. Oct 13 06:24:35.611464 kernel: Tracing variant of Tasks RCU enabled. Oct 13 06:24:35.611470 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Oct 13 06:24:35.611476 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Oct 13 06:24:35.611482 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Oct 13 06:24:35.611488 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
Oct 13 06:24:35.611495 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Oct 13 06:24:35.611501 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16 Oct 13 06:24:35.611506 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Oct 13 06:24:35.611512 kernel: Console: colour VGA+ 80x25 Oct 13 06:24:35.611518 kernel: printk: legacy console [tty0] enabled Oct 13 06:24:35.611524 kernel: printk: legacy console [ttyS1] enabled Oct 13 06:24:35.611530 kernel: ACPI: Core revision 20240827 Oct 13 06:24:35.611537 kernel: hpet: HPET dysfunctional in PC10. Force disabled. Oct 13 06:24:35.611543 kernel: APIC: Switch to symmetric I/O mode setup Oct 13 06:24:35.611549 kernel: DMAR: Host address width 39 Oct 13 06:24:35.611555 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1 Oct 13 06:24:35.611561 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da Oct 13 06:24:35.611567 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff Oct 13 06:24:35.611573 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0 Oct 13 06:24:35.611580 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000 Oct 13 06:24:35.611586 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping. Oct 13 06:24:35.611592 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode Oct 13 06:24:35.611598 kernel: x2apic enabled Oct 13 06:24:35.611604 kernel: APIC: Switched APIC routing to: cluster x2apic Oct 13 06:24:35.611610 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns Oct 13 06:24:35.611616 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906) Oct 13 06:24:35.611622 kernel: CPU0: Thermal monitoring enabled (TM1) Oct 13 06:24:35.611628 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Oct 13 06:24:35.611634 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Oct 13 06:24:35.611640 kernel: process: using mwait in idle threads Oct 13 06:24:35.611645 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Oct 13 06:24:35.611651 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Oct 13 06:24:35.611657 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Oct 13 06:24:35.611662 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Oct 13 06:24:35.611668 kernel: RETBleed: Mitigation: Enhanced IBRS Oct 13 06:24:35.611674 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Oct 13 06:24:35.611680 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Oct 13 06:24:35.611686 kernel: TAA: Mitigation: TSX disabled Oct 13 06:24:35.611692 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Oct 13 06:24:35.611697 kernel: SRBDS: Mitigation: Microcode Oct 13 06:24:35.611703 kernel: GDS: Vulnerable: No microcode Oct 13 06:24:35.611709 kernel: active return thunk: its_return_thunk Oct 13 06:24:35.611715 kernel: ITS: Mitigation: Aligned branch/return thunks Oct 13 06:24:35.611720 kernel: VMSCAPE: Mitigation: IBPB before exit to userspace Oct 13 06:24:35.611726 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Oct 13 06:24:35.611732 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Oct 13 06:24:35.611738 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' 
Oct 13 06:24:35.611745 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Oct 13 06:24:35.611750 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Oct 13 06:24:35.611756 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Oct 13 06:24:35.611762 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Oct 13 06:24:35.611767 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Oct 13 06:24:35.611773 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format. Oct 13 06:24:35.611779 kernel: Freeing SMP alternatives memory: 32K Oct 13 06:24:35.611785 kernel: pid_max: default: 32768 minimum: 301 Oct 13 06:24:35.611790 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Oct 13 06:24:35.611796 kernel: landlock: Up and running. Oct 13 06:24:35.611804 kernel: SELinux: Initializing. Oct 13 06:24:35.611811 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Oct 13 06:24:35.611816 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Oct 13 06:24:35.611822 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Oct 13 06:24:35.611828 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver. Oct 13 06:24:35.611834 kernel: ... version: 4 Oct 13 06:24:35.611840 kernel: ... bit width: 48 Oct 13 06:24:35.611846 kernel: ... generic registers: 4 Oct 13 06:24:35.611852 kernel: ... value mask: 0000ffffffffffff Oct 13 06:24:35.611859 kernel: ... max period: 00007fffffffffff Oct 13 06:24:35.611865 kernel: ... fixed-purpose events: 3 Oct 13 06:24:35.611871 kernel: ... event mask: 000000070000000f Oct 13 06:24:35.611877 kernel: signal: max sigframe size: 2032 Oct 13 06:24:35.611883 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445 Oct 13 06:24:35.611889 kernel: rcu: Hierarchical SRCU implementation. Oct 13 06:24:35.611895 kernel: rcu: Max phase no-delay instances is 400. Oct 13 06:24:35.611902 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Oct 13 06:24:35.611908 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter. Oct 13 06:24:35.611914 kernel: smp: Bringing up secondary CPUs ... Oct 13 06:24:35.611920 kernel: smpboot: x86: Booting SMP configuration: Oct 13 06:24:35.611926 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15 Oct 13 06:24:35.611932 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. 
Oct 13 06:24:35.611938 kernel: smp: Brought up 1 node, 16 CPUs Oct 13 06:24:35.611945 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS) Oct 13 06:24:35.611951 kernel: Memory: 32725908K/33452980K available (14336K kernel code, 2450K rwdata, 10012K rodata, 24532K init, 1684K bss, 701792K reserved, 0K cma-reserved) Oct 13 06:24:35.611957 kernel: devtmpfs: initialized Oct 13 06:24:35.611963 kernel: x86/mm: Memory block size: 128MB Oct 13 06:24:35.611969 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8251a000-0x8251afff] (4096 bytes) Oct 13 06:24:35.611975 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes) Oct 13 06:24:35.611981 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Oct 13 06:24:35.611988 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Oct 13 06:24:35.611994 kernel: pinctrl core: initialized pinctrl subsystem Oct 13 06:24:35.612000 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Oct 13 06:24:35.612006 kernel: audit: initializing netlink subsys (disabled) Oct 13 06:24:35.612012 kernel: audit: type=2000 audit(1760336667.041:1): state=initialized audit_enabled=0 res=1 Oct 13 06:24:35.612017 kernel: thermal_sys: Registered thermal governor 'step_wise' Oct 13 06:24:35.612023 kernel: thermal_sys: Registered thermal governor 'user_space' Oct 13 06:24:35.612030 kernel: cpuidle: using governor menu Oct 13 06:24:35.612036 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Oct 13 06:24:35.612042 kernel: dca service started, version 1.12.1 Oct 13 06:24:35.612048 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Oct 13 06:24:35.612054 kernel: PCI: Using configuration type 1 for base access Oct 13 06:24:35.612060 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Oct 13 06:24:35.612066 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Oct 13 06:24:35.612073 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Oct 13 06:24:35.612079 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Oct 13 06:24:35.612085 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Oct 13 06:24:35.612090 kernel: ACPI: Added _OSI(Module Device) Oct 13 06:24:35.612096 kernel: ACPI: Added _OSI(Processor Device) Oct 13 06:24:35.612102 kernel: ACPI: Added _OSI(Processor Aggregator Device) Oct 13 06:24:35.612108 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded Oct 13 06:24:35.612115 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:24:35.612121 kernel: ACPI: SSDT 0xFFFF8D84020D2400 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527) Oct 13 06:24:35.612127 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:24:35.612133 kernel: ACPI: SSDT 0xFFFF8D840212C800 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527) Oct 13 06:24:35.612139 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:24:35.612145 kernel: ACPI: SSDT 0xFFFF8D840024AA00 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527) Oct 13 06:24:35.612151 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:24:35.612156 kernel: ACPI: SSDT 0xFFFF8D840212D000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527) Oct 13 06:24:35.612163 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:24:35.612169 kernel: ACPI: SSDT 0xFFFF8D84001A6000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527) Oct 13 06:24:35.612175 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:24:35.612181 kernel: ACPI: SSDT 0xFFFF8D84020D4C00 00030A (v02 PmRef ApCst 00003000 INTL 20160527) Oct 13 06:24:35.612186 kernel: ACPI: Interpreter enabled Oct 13 06:24:35.612192 kernel: ACPI: PM: (supports S0 S5) Oct 13 06:24:35.612198 kernel: ACPI: Using IOAPIC for interrupt routing Oct 13 06:24:35.612205 kernel: HEST: Enabling Firmware First mode for corrected errors. Oct 13 06:24:35.612211 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14. Oct 13 06:24:35.612217 kernel: HEST: Table parsing has been initialized. Oct 13 06:24:35.612223 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. 
Oct 13 06:24:35.612229 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Oct 13 06:24:35.612235 kernel: PCI: Using E820 reservations for host bridge windows Oct 13 06:24:35.612241 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F Oct 13 06:24:35.612248 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource Oct 13 06:24:35.612254 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource Oct 13 06:24:35.612260 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource Oct 13 06:24:35.612265 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource Oct 13 06:24:35.612271 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource Oct 13 06:24:35.612277 kernel: ACPI: \_TZ_.FN00: New power resource Oct 13 06:24:35.612283 kernel: ACPI: \_TZ_.FN01: New power resource Oct 13 06:24:35.612290 kernel: ACPI: \_TZ_.FN02: New power resource Oct 13 06:24:35.612296 kernel: ACPI: \_TZ_.FN03: New power resource Oct 13 06:24:35.612302 kernel: ACPI: \_TZ_.FN04: New power resource Oct 13 06:24:35.612308 kernel: ACPI: \PIN_: New power resource Oct 13 06:24:35.612314 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) Oct 13 06:24:35.612406 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Oct 13 06:24:35.612475 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER] Oct 13 06:24:35.612542 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR] Oct 13 06:24:35.612551 kernel: PCI host bridge to bus 0000:00 Oct 13 06:24:35.612617 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Oct 13 06:24:35.612684 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Oct 13 06:24:35.612774 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Oct 13 06:24:35.612935 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window] Oct 13 06:24:35.613050 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window] Oct 13 06:24:35.613129 kernel: pci_bus 0000:00: root bus resource [bus 00-fe] Oct 13 06:24:35.613238 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint Oct 13 06:24:35.613323 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port Oct 13 06:24:35.613396 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 13 06:24:35.613465 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Oct 13 06:24:35.613536 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Oct 13 06:24:35.613603 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Oct 13 06:24:35.613673 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint Oct 13 06:24:35.613739 kernel: pci 0000:00:08.0: BAR 0 [mem 0x95520000-0x95520fff 64bit] Oct 13 06:24:35.613816 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint Oct 13 06:24:35.613884 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9551f000-0x9551ffff 64bit] Oct 13 06:24:35.613955 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint Oct 13 06:24:35.614022 kernel: pci 0000:00:14.0: BAR 0 [mem 0x95500000-0x9550ffff 64bit] Oct 13 06:24:35.614087 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Oct 13 06:24:35.614154 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint Oct 13 06:24:35.614223 kernel: pci 0000:00:14.2: BAR 0 [mem 0x95512000-0x95513fff 64bit] Oct 13 
06:24:35.614294 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9551e000-0x9551efff 64bit] Oct 13 06:24:35.614367 kernel: pci 0000:00:14.5: [8086:a375] type 00 class 0x080501 conventional PCI endpoint Oct 13 06:24:35.614434 kernel: pci 0000:00:14.5: BAR 0 [mem 0x9551d000-0x9551dfff 64bit] Oct 13 06:24:35.614504 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint Oct 13 06:24:35.614571 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Oct 13 06:24:35.614642 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint Oct 13 06:24:35.614707 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit] Oct 13 06:24:35.614775 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint Oct 13 06:24:35.614848 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9551a000-0x9551afff 64bit] Oct 13 06:24:35.614917 kernel: pci 0000:00:16.0: PME# supported from D3hot Oct 13 06:24:35.614986 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint Oct 13 06:24:35.615051 kernel: pci 0000:00:16.1: BAR 0 [mem 0x95519000-0x95519fff 64bit] Oct 13 06:24:35.615118 kernel: pci 0000:00:16.1: PME# supported from D3hot Oct 13 06:24:35.615186 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint Oct 13 06:24:35.615251 kernel: pci 0000:00:16.4: BAR 0 [mem 0x95518000-0x95518fff 64bit] Oct 13 06:24:35.615317 kernel: pci 0000:00:16.4: PME# supported from D3hot Oct 13 06:24:35.615387 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint Oct 13 06:24:35.615453 kernel: pci 0000:00:17.0: BAR 0 [mem 0x95510000-0x95511fff] Oct 13 06:24:35.615520 kernel: pci 0000:00:17.0: BAR 1 [mem 0x95517000-0x955170ff] Oct 13 06:24:35.615586 kernel: pci 0000:00:17.0: BAR 2 [io 0x6050-0x6057] Oct 13 06:24:35.615651 kernel: pci 0000:00:17.0: BAR 3 [io 0x6040-0x6043] Oct 13 06:24:35.615716 kernel: pci 0000:00:17.0: BAR 4 [io 0x6020-0x603f] Oct 13 06:24:35.615780 kernel: pci 0000:00:17.0: BAR 5 [mem 0x95516000-0x955167ff] Oct 13 06:24:35.615852 kernel: pci 0000:00:17.0: PME# supported from D3hot Oct 13 06:24:35.615928 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port Oct 13 06:24:35.615995 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Oct 13 06:24:35.616060 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold Oct 13 06:24:35.616130 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port Oct 13 06:24:35.616196 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Oct 13 06:24:35.616264 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Oct 13 06:24:35.616329 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Oct 13 06:24:35.616394 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold Oct 13 06:24:35.616463 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port Oct 13 06:24:35.616528 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Oct 13 06:24:35.616593 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Oct 13 06:24:35.616660 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Oct 13 06:24:35.616727 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold Oct 13 06:24:35.616796 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port Oct 13 06:24:35.616868 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Oct 13 06:24:35.616933 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Oct 13 06:24:35.617002 
kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400 PCIe Root Port Oct 13 06:24:35.617070 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Oct 13 06:24:35.617135 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Oct 13 06:24:35.617199 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Oct 13 06:24:35.617264 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold Oct 13 06:24:35.617334 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint Oct 13 06:24:35.617417 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Oct 13 06:24:35.617485 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint Oct 13 06:24:35.617555 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint Oct 13 06:24:35.617618 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x95514000-0x955140ff 64bit] Oct 13 06:24:35.617681 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf] Oct 13 06:24:35.617749 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint Oct 13 06:24:35.617837 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff] Oct 13 06:24:35.617923 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Oct 13 06:24:35.617988 kernel: pci 0000:01:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref] Oct 13 06:24:35.618052 kernel: pci 0000:01:00.0: ROM [mem 0x95200000-0x952fffff pref] Oct 13 06:24:35.618116 kernel: pci 0000:01:00.0: PME# supported from D3cold Oct 13 06:24:35.618185 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Oct 13 06:24:35.618250 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Oct 13 06:24:35.618319 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Oct 13 06:24:35.618385 kernel: pci 0000:01:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Oct 13 06:24:35.618449 kernel: pci 0000:01:00.1: ROM [mem 0x95100000-0x951fffff pref] Oct 13 06:24:35.618513 kernel: pci 0000:01:00.1: PME# supported from D3cold Oct 13 06:24:35.618580 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Oct 13 06:24:35.618645 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Oct 13 06:24:35.618710 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 13 06:24:35.618774 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Oct 13 06:24:35.618850 kernel: pci 0000:03:00.0: working around ROM BAR overlap defect Oct 13 06:24:35.618916 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Oct 13 06:24:35.618983 kernel: pci 0000:03:00.0: BAR 0 [mem 0x95400000-0x9547ffff] Oct 13 06:24:35.619047 kernel: pci 0000:03:00.0: BAR 2 [io 0x5000-0x501f] Oct 13 06:24:35.619111 kernel: pci 0000:03:00.0: BAR 3 [mem 0x95480000-0x95483fff] Oct 13 06:24:35.619175 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Oct 13 06:24:35.619240 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Oct 13 06:24:35.619308 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Oct 13 06:24:35.619375 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Oct 13 06:24:35.619440 kernel: pci 0000:04:00.0: BAR 0 [mem 0x95300000-0x9537ffff] Oct 13 06:24:35.619504 kernel: pci 0000:04:00.0: BAR 2 [io 0x4000-0x401f] Oct 13 06:24:35.619569 kernel: pci 0000:04:00.0: BAR 3 [mem 0x95380000-0x95383fff] Oct 13 06:24:35.619633 kernel: pci 0000:04:00.0: PME# supported from D0 
D3hot D3cold Oct 13 06:24:35.619698 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Oct 13 06:24:35.619764 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Oct 13 06:24:35.619840 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Oct 13 06:24:35.619906 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Oct 13 06:24:35.619970 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Oct 13 06:24:35.620035 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Oct 13 06:24:35.620099 kernel: pci 0000:06:00.0: enabling Extended Tags Oct 13 06:24:35.620165 kernel: pci 0000:06:00.0: supports D1 D2 Oct 13 06:24:35.620230 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Oct 13 06:24:35.620294 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Oct 13 06:24:35.620361 kernel: pci_bus 0000:07: extended config space not accessible Oct 13 06:24:35.620434 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Oct 13 06:24:35.620504 kernel: pci 0000:07:00.0: BAR 0 [mem 0x94000000-0x94ffffff] Oct 13 06:24:35.620570 kernel: pci 0000:07:00.0: BAR 1 [mem 0x95000000-0x9501ffff] Oct 13 06:24:35.620636 kernel: pci 0000:07:00.0: BAR 2 [io 0x3000-0x307f] Oct 13 06:24:35.620701 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Oct 13 06:24:35.620767 kernel: pci 0000:07:00.0: supports D1 D2 Oct 13 06:24:35.620852 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Oct 13 06:24:35.620919 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Oct 13 06:24:35.620928 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Oct 13 06:24:35.620934 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Oct 13 06:24:35.620940 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Oct 13 06:24:35.620946 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Oct 13 06:24:35.620954 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Oct 13 06:24:35.620960 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Oct 13 06:24:35.620967 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Oct 13 06:24:35.620973 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Oct 13 06:24:35.620979 kernel: iommu: Default domain type: Translated Oct 13 06:24:35.620985 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Oct 13 06:24:35.620992 kernel: PCI: Using ACPI for IRQ routing Oct 13 06:24:35.620998 kernel: PCI: pci_cache_line_size set to 64 bytes Oct 13 06:24:35.621004 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Oct 13 06:24:35.621011 kernel: e820: reserve RAM buffer [mem 0x8251a000-0x83ffffff] Oct 13 06:24:35.621017 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Oct 13 06:24:35.621022 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Oct 13 06:24:35.621028 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Oct 13 06:24:35.621034 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Oct 13 06:24:35.621099 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Oct 13 06:24:35.621163 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Oct 13 06:24:35.621230 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Oct 13 06:24:35.621239 kernel: vgaarb: loaded Oct 13 06:24:35.621245 kernel: clocksource: Switched to clocksource tsc-early Oct 13 06:24:35.621252 kernel: VFS: Disk quotas dquot_6.6.0 Oct 13 06:24:35.621258 kernel: VFS: Dquot-cache 
hash table entries: 512 (order 0, 4096 bytes) Oct 13 06:24:35.621264 kernel: pnp: PnP ACPI init Oct 13 06:24:35.621328 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Oct 13 06:24:35.621394 kernel: pnp 00:02: [dma 0 disabled] Oct 13 06:24:35.621457 kernel: pnp 00:03: [dma 0 disabled] Oct 13 06:24:35.621521 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Oct 13 06:24:35.621580 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Oct 13 06:24:35.621643 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Oct 13 06:24:35.621704 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Oct 13 06:24:35.621762 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Oct 13 06:24:35.621847 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Oct 13 06:24:35.621923 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Oct 13 06:24:35.621980 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Oct 13 06:24:35.622039 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Oct 13 06:24:35.622099 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Oct 13 06:24:35.622160 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Oct 13 06:24:35.622219 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Oct 13 06:24:35.622278 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Oct 13 06:24:35.622335 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Oct 13 06:24:35.622393 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Oct 13 06:24:35.622453 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Oct 13 06:24:35.622511 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Oct 13 06:24:35.622572 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Oct 13 06:24:35.622581 kernel: pnp: PnP ACPI: found 9 devices Oct 13 06:24:35.622587 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Oct 13 06:24:35.622594 kernel: NET: Registered PF_INET protocol family Oct 13 06:24:35.622602 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Oct 13 06:24:35.622608 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Oct 13 06:24:35.622614 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Oct 13 06:24:35.622620 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Oct 13 06:24:35.622626 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Oct 13 06:24:35.622632 kernel: TCP: Hash tables configured (established 262144 bind 65536) Oct 13 06:24:35.622638 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Oct 13 06:24:35.622645 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Oct 13 06:24:35.622651 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Oct 13 06:24:35.622657 kernel: NET: Registered PF_XDP protocol family Oct 13 06:24:35.622719 kernel: pci 0000:00:15.0: BAR 0 [mem 0x95515000-0x95515fff 64bit]: assigned Oct 13 06:24:35.622782 kernel: pci 0000:00:15.1: BAR 0 [mem 0x9551b000-0x9551bfff 64bit]: assigned Oct 13 06:24:35.622848 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x9551c000-0x9551cfff 64bit]: assigned Oct 13 06:24:35.622915 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Oct 13 
06:24:35.622980 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Oct 13 06:24:35.623043 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Oct 13 06:24:35.623107 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Oct 13 06:24:35.623172 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 13 06:24:35.623235 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Oct 13 06:24:35.623298 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Oct 13 06:24:35.623361 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Oct 13 06:24:35.623424 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Oct 13 06:24:35.623486 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Oct 13 06:24:35.623548 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Oct 13 06:24:35.623613 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Oct 13 06:24:35.623674 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Oct 13 06:24:35.623736 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Oct 13 06:24:35.623799 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Oct 13 06:24:35.623907 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Oct 13 06:24:35.623971 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Oct 13 06:24:35.624035 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Oct 13 06:24:35.624097 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Oct 13 06:24:35.624160 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Oct 13 06:24:35.624223 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Oct 13 06:24:35.624280 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Oct 13 06:24:35.624337 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Oct 13 06:24:35.624392 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Oct 13 06:24:35.624448 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Oct 13 06:24:35.624504 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Oct 13 06:24:35.624562 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Oct 13 06:24:35.624625 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Oct 13 06:24:35.624683 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Oct 13 06:24:35.624744 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Oct 13 06:24:35.624805 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Oct 13 06:24:35.624873 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Oct 13 06:24:35.624931 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Oct 13 06:24:35.624994 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Oct 13 06:24:35.625052 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Oct 13 06:24:35.625112 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Oct 13 06:24:35.625172 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Oct 13 06:24:35.625182 kernel: PCI: CLS 64 bytes, default 64 Oct 13 06:24:35.625189 kernel: DMAR: No ATSR found Oct 13 06:24:35.625195 kernel: DMAR: No SATC found Oct 13 06:24:35.625201 kernel: DMAR: dmar0: Using Queued invalidation Oct 13 06:24:35.625264 kernel: pci 0000:00:00.0: Adding to iommu group 0 Oct 13 06:24:35.625327 kernel: pci 0000:00:01.0: Adding to iommu group 1 Oct 13 06:24:35.625390 kernel: pci 
0000:00:08.0: Adding to iommu group 2 Oct 13 06:24:35.625455 kernel: pci 0000:00:12.0: Adding to iommu group 3 Oct 13 06:24:35.625517 kernel: pci 0000:00:14.0: Adding to iommu group 4 Oct 13 06:24:35.625579 kernel: pci 0000:00:14.2: Adding to iommu group 4 Oct 13 06:24:35.625641 kernel: pci 0000:00:14.5: Adding to iommu group 4 Oct 13 06:24:35.625702 kernel: pci 0000:00:15.0: Adding to iommu group 5 Oct 13 06:24:35.625763 kernel: pci 0000:00:15.1: Adding to iommu group 5 Oct 13 06:24:35.625830 kernel: pci 0000:00:16.0: Adding to iommu group 6 Oct 13 06:24:35.625932 kernel: pci 0000:00:16.1: Adding to iommu group 6 Oct 13 06:24:35.625994 kernel: pci 0000:00:16.4: Adding to iommu group 6 Oct 13 06:24:35.626056 kernel: pci 0000:00:17.0: Adding to iommu group 7 Oct 13 06:24:35.626118 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Oct 13 06:24:35.626181 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Oct 13 06:24:35.626244 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Oct 13 06:24:35.626309 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Oct 13 06:24:35.626371 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Oct 13 06:24:35.626433 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Oct 13 06:24:35.626495 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Oct 13 06:24:35.626558 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Oct 13 06:24:35.626619 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Oct 13 06:24:35.626685 kernel: pci 0000:01:00.0: Adding to iommu group 1 Oct 13 06:24:35.626748 kernel: pci 0000:01:00.1: Adding to iommu group 1 Oct 13 06:24:35.626814 kernel: pci 0000:03:00.0: Adding to iommu group 15 Oct 13 06:24:35.627145 kernel: pci 0000:04:00.0: Adding to iommu group 16 Oct 13 06:24:35.627209 kernel: pci 0000:06:00.0: Adding to iommu group 17 Oct 13 06:24:35.627277 kernel: pci 0000:07:00.0: Adding to iommu group 17 Oct 13 06:24:35.627322 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Oct 13 06:24:35.627329 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Oct 13 06:24:35.627352 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Oct 13 06:24:35.627359 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Oct 13 06:24:35.627398 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Oct 13 06:24:35.627405 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Oct 13 06:24:35.627425 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Oct 13 06:24:35.627507 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Oct 13 06:24:35.627517 kernel: Initialise system trusted keyrings Oct 13 06:24:35.627523 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Oct 13 06:24:35.627529 kernel: Key type asymmetric registered Oct 13 06:24:35.627535 kernel: Asymmetric key parser 'x509' registered Oct 13 06:24:35.627541 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Oct 13 06:24:35.627547 kernel: io scheduler mq-deadline registered Oct 13 06:24:35.627555 kernel: io scheduler kyber registered Oct 13 06:24:35.627561 kernel: io scheduler bfq registered Oct 13 06:24:35.627623 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Oct 13 06:24:35.627686 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Oct 13 06:24:35.627748 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Oct 13 06:24:35.627814 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Oct 13 06:24:35.627877 kernel: pcieport 
0000:00:1c.0: PME: Signaling with IRQ 125 Oct 13 06:24:35.627941 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Oct 13 06:24:35.628010 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Oct 13 06:24:35.628019 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Oct 13 06:24:35.628026 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Oct 13 06:24:35.628032 kernel: pstore: Using crash dump compression: deflate Oct 13 06:24:35.628038 kernel: pstore: Registered erst as persistent store backend Oct 13 06:24:35.628046 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Oct 13 06:24:35.628052 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Oct 13 06:24:35.628058 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Oct 13 06:24:35.628064 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Oct 13 06:24:35.628070 kernel: hpet_acpi_add: no address or irqs in _CRS Oct 13 06:24:35.628132 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Oct 13 06:24:35.628141 kernel: i8042: PNP: No PS/2 controller found. Oct 13 06:24:35.628257 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Oct 13 06:24:35.628316 kernel: rtc_cmos rtc_cmos: registered as rtc0 Oct 13 06:24:35.628374 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-10-13T06:24:32 UTC (1760336672) Oct 13 06:24:35.628432 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Oct 13 06:24:35.628440 kernel: intel_pstate: Intel P-state driver initializing Oct 13 06:24:35.628447 kernel: intel_pstate: Disabling energy efficiency optimization Oct 13 06:24:35.628454 kernel: intel_pstate: HWP enabled Oct 13 06:24:35.628461 kernel: NET: Registered PF_INET6 protocol family Oct 13 06:24:35.628467 kernel: Segment Routing with IPv6 Oct 13 06:24:35.628473 kernel: In-situ OAM (IOAM) with IPv6 Oct 13 06:24:35.628479 kernel: NET: Registered PF_PACKET protocol family Oct 13 06:24:35.628485 kernel: Key type dns_resolver registered Oct 13 06:24:35.628491 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Oct 13 06:24:35.628497 kernel: microcode: Current revision: 0x000000f4 Oct 13 06:24:35.628504 kernel: IPI shorthand broadcast: enabled Oct 13 06:24:35.628510 kernel: sched_clock: Marking stable (2210073583, 1495597801)->(5322368330, -1616696946) Oct 13 06:24:35.628516 kernel: registered taskstats version 1 Oct 13 06:24:35.628522 kernel: Loading compiled-in X.509 certificates Oct 13 06:24:35.628528 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.51-flatcar: 9f1258ccc510afd4f2a37f4774c4b2e958d823b7' Oct 13 06:24:35.628534 kernel: Demotion targets for Node 0: null Oct 13 06:24:35.628540 kernel: Key type .fscrypt registered Oct 13 06:24:35.628548 kernel: Key type fscrypt-provisioning registered Oct 13 06:24:35.628554 kernel: ima: Allocated hash algorithm: sha1 Oct 13 06:24:35.628560 kernel: tsc: Refined TSC clocksource calibration: 3408.004 MHz Oct 13 06:24:35.628566 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd7ed0ff, max_idle_ns: 440795285512 ns Oct 13 06:24:35.628572 kernel: clocksource: Switched to clocksource tsc Oct 13 06:24:35.628578 kernel: ima: No architecture policies found Oct 13 06:24:35.628584 kernel: clk: Disabling unused clocks Oct 13 06:24:35.628591 kernel: Freeing unused kernel image (initmem) memory: 24532K Oct 13 06:24:35.628597 kernel: Write protecting the kernel read-only data: 24576k Oct 13 06:24:35.628603 kernel: Freeing unused kernel image (rodata/data 
gap) memory: 228K Oct 13 06:24:35.628609 kernel: Run /init as init process Oct 13 06:24:35.628615 kernel: with arguments: Oct 13 06:24:35.628621 kernel: /init Oct 13 06:24:35.628627 kernel: with environment: Oct 13 06:24:35.628634 kernel: HOME=/ Oct 13 06:24:35.628640 kernel: TERM=linux Oct 13 06:24:35.628646 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Oct 13 06:24:35.628651 kernel: SCSI subsystem initialized Oct 13 06:24:35.628657 kernel: libata version 3.00 loaded. Oct 13 06:24:35.628720 kernel: ahci 0000:00:17.0: version 3.0 Oct 13 06:24:35.628787 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Oct 13 06:24:35.628939 kernel: ahci 0000:00:17.0: 7/7 ports implemented (port mask 0x7f) Oct 13 06:24:35.629062 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Oct 13 06:24:35.629212 kernel: scsi host0: ahci Oct 13 06:24:35.629359 kernel: scsi host1: ahci Oct 13 06:24:35.629499 kernel: scsi host2: ahci Oct 13 06:24:35.629601 kernel: scsi host3: ahci Oct 13 06:24:35.629672 kernel: scsi host4: ahci Oct 13 06:24:35.629743 kernel: scsi host5: ahci Oct 13 06:24:35.629814 kernel: scsi host6: ahci Oct 13 06:24:35.629823 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 lpm-pol 0 Oct 13 06:24:35.629830 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 lpm-pol 0 Oct 13 06:24:35.629838 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 lpm-pol 0 Oct 13 06:24:35.629845 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 lpm-pol 0 Oct 13 06:24:35.629851 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127 lpm-pol 0 Oct 13 06:24:35.629857 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 lpm-pol 0 Oct 13 06:24:35.629864 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 lpm-pol 0 Oct 13 06:24:35.629870 kernel: ata7: SATA link down (SStatus 0 SControl 300) Oct 13 06:24:35.629876 kernel: ata5: SATA link down (SStatus 0 SControl 300) Oct 13 06:24:35.629883 kernel: ata6: SATA link down (SStatus 0 SControl 300) Oct 13 06:24:35.629889 kernel: ata4: SATA link down (SStatus 0 SControl 300) Oct 13 06:24:35.629896 kernel: ata3: SATA link down (SStatus 0 SControl 300) Oct 13 06:24:35.629902 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Oct 13 06:24:35.629908 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Oct 13 06:24:35.629915 kernel: ata1.00: Model 'Micron_5200_MTFDDAK480TDN', rev ' D1MU020', applying quirks: zeroaftertrim Oct 13 06:24:35.629921 kernel: ata1.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Oct 13 06:24:35.629928 kernel: ata2.00: Model 'Micron_5200_MTFDDAK480TDN', rev ' D1MU020', applying quirks: zeroaftertrim Oct 13 06:24:35.629934 kernel: ata2.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Oct 13 06:24:35.629941 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Oct 13 06:24:35.629947 kernel: ata1.00: Features: NCQ-prio Oct 13 06:24:35.629953 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Oct 13 06:24:35.629959 kernel: ata2.00: Features: NCQ-prio Oct 13 06:24:35.629966 kernel: ata1.00: configured for UDMA/133 Oct 13 06:24:35.630040 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Oct 13 06:24:35.630049 kernel: ata2.00: configured for UDMA/133 Oct 13 06:24:35.630120 kernel: scsi 1:0:0:0: Direct-Access ATA 
Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Oct 13 06:24:35.630130 kernel: ata1.00: Enabling discard_zeroes_data Oct 13 06:24:35.630196 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Oct 13 06:24:35.630205 kernel: ata2.00: Enabling discard_zeroes_data Oct 13 06:24:35.630273 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Oct 13 06:24:35.630341 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Oct 13 06:24:35.630410 kernel: sd 0:0:0:0: [sda] Write Protect is off Oct 13 06:24:35.630478 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks Oct 13 06:24:35.630546 kernel: sd 1:0:0:0: [sdb] Write Protect is off Oct 13 06:24:35.630614 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 Oct 13 06:24:35.630684 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Oct 13 06:24:35.630752 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Oct 13 06:24:35.630837 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Oct 13 06:24:35.630922 kernel: sd 0:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Oct 13 06:24:35.630992 kernel: sd 1:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Oct 13 06:24:35.631001 kernel: ata1.00: Enabling discard_zeroes_data Oct 13 06:24:35.631007 kernel: ata2.00: Enabling discard_zeroes_data Oct 13 06:24:35.631016 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Oct 13 06:24:35.631022 kernel: GPT:16515071 != 937703087 Oct 13 06:24:35.631028 kernel: GPT:Alternate GPT header not at the end of the disk. Oct 13 06:24:35.631097 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk Oct 13 06:24:35.631106 kernel: GPT:16515071 != 937703087 Oct 13 06:24:35.631112 kernel: GPT: Use GNU Parted to correct GPT errors. Oct 13 06:24:35.631118 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 13 06:24:35.631187 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Oct 13 06:24:35.631196 kernel: ACPI: bus type USB registered Oct 13 06:24:35.631203 kernel: usbcore: registered new interface driver usbfs Oct 13 06:24:35.631209 kernel: usbcore: registered new interface driver hub Oct 13 06:24:35.631216 kernel: usbcore: registered new device driver usb Oct 13 06:24:35.631280 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Oct 13 06:24:35.631346 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Oct 13 06:24:35.631414 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Oct 13 06:24:35.631480 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Oct 13 06:24:35.631545 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Oct 13 06:24:35.631610 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Oct 13 06:24:35.631690 kernel: hub 1-0:1.0: USB hub found Oct 13 06:24:35.631760 kernel: hub 1-0:1.0: 16 ports detected Oct 13 06:24:35.631841 kernel: hub 2-0:1.0: USB hub found Oct 13 06:24:35.631911 kernel: hub 2-0:1.0: 10 ports detected Oct 13 06:24:35.631920 kernel: sdhci: Secure Digital Host Controller Interface driver Oct 13 06:24:35.631927 kernel: sdhci: Copyright(c) Pierre Ossman Oct 13 06:24:35.631991 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:35.632000 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Oct 13 06:24:35.632008 kernel: device-mapper: uevent: version 1.0.3 Oct 13 06:24:35.632015 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Oct 13 06:24:35.632021 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Oct 13 06:24:35.632028 kernel: Invalid ELF header magic: != \u007fELF Oct 13 06:24:35.632034 kernel: Invalid ELF header magic: != \u007fELF Oct 13 06:24:35.632041 kernel: raid6: avx2x4 gen() 47339 MB/s Oct 13 06:24:35.632047 kernel: raid6: avx2x2 gen() 51808 MB/s Oct 13 06:24:35.632054 kernel: raid6: avx2x1 gen() 44211 MB/s Oct 13 06:24:35.632060 kernel: raid6: using algorithm avx2x2 gen() 51808 MB/s Oct 13 06:24:35.632066 kernel: raid6: .... xor() 33080 MB/s, rmw enabled Oct 13 06:24:35.632073 kernel: raid6: using avx2x2 recovery algorithm Oct 13 06:24:35.632079 kernel: Invalid ELF header magic: != \u007fELF Oct 13 06:24:35.632156 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Oct 13 06:24:35.632166 kernel: Invalid ELF header magic: != \u007fELF Oct 13 06:24:35.632174 kernel: Invalid ELF header magic: != \u007fELF Oct 13 06:24:35.632180 kernel: xor: automatically using best checksumming function avx Oct 13 06:24:35.632187 kernel: Invalid ELF header magic: != \u007fELF Oct 13 06:24:35.632263 kernel: hub 1-14:1.0: USB hub found Oct 13 06:24:35.632335 kernel: hub 1-14:1.0: 4 ports detected Oct 13 06:24:35.632399 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:35.632409 kernel: Btrfs loaded, zoned=no, fsverity=no Oct 13 06:24:35.632417 kernel: BTRFS: device fsid e87b15e9-127c-40e2-bae7-d0ea05b4f2e3 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (282) Oct 13 06:24:35.632424 kernel: BTRFS info (device dm-0): first mount of filesystem e87b15e9-127c-40e2-bae7-d0ea05b4f2e3 Oct 13 06:24:35.632430 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Oct 13 06:24:35.632436 kernel: BTRFS info (device dm-0): enabling ssd optimizations Oct 13 06:24:35.632443 kernel: BTRFS info (device dm-0): disabling log replay at mount time Oct 13 06:24:35.632449 kernel: BTRFS info (device dm-0): enabling free space tree Oct 13 06:24:35.632457 kernel: Invalid ELF header magic: != \u007fELF Oct 13 06:24:35.632463 kernel: loop: module loaded Oct 13 06:24:35.632469 kernel: loop0: detected capacity change from 0 to 100048 Oct 13 06:24:35.632476 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Oct 13 06:24:35.632553 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Oct 13 06:24:35.632564 systemd[1]: Successfully made /usr/ read-only. Oct 13 06:24:35.632573 systemd[1]: systemd 257.7 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 13 06:24:35.632582 systemd[1]: Detected architecture x86-64. Oct 13 06:24:35.632588 systemd[1]: Running in initrd. Oct 13 06:24:35.632595 systemd[1]: No hostname configured, using default hostname. Oct 13 06:24:35.632601 systemd[1]: Hostname set to . Oct 13 06:24:35.632666 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:35.632677 systemd[1]: Initializing machine ID from random generator. 
Oct 13 06:24:35.632684 systemd[1]: Queued start job for default target initrd.target. Oct 13 06:24:35.632691 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Oct 13 06:24:35.632698 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 13 06:24:35.632704 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 13 06:24:35.632711 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Oct 13 06:24:35.632718 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 13 06:24:35.632726 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Oct 13 06:24:35.632733 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Oct 13 06:24:35.632739 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 13 06:24:35.632746 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 13 06:24:35.632752 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Oct 13 06:24:35.632759 systemd[1]: Reached target paths.target - Path Units. Oct 13 06:24:35.632767 systemd[1]: Reached target slices.target - Slice Units. Oct 13 06:24:35.632774 systemd[1]: Reached target swap.target - Swaps. Oct 13 06:24:35.632780 systemd[1]: Reached target timers.target - Timer Units. Oct 13 06:24:35.632786 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Oct 13 06:24:35.632793 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 13 06:24:35.632800 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Oct 13 06:24:35.632814 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Oct 13 06:24:35.632823 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Oct 13 06:24:35.632830 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 13 06:24:35.632836 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 13 06:24:35.632843 systemd[1]: Reached target sockets.target - Socket Units. Oct 13 06:24:35.632850 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Oct 13 06:24:35.632856 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Oct 13 06:24:35.632864 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 13 06:24:35.632871 systemd[1]: Finished network-cleanup.service - Network Cleanup. Oct 13 06:24:35.632877 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Oct 13 06:24:35.632884 systemd[1]: Starting systemd-fsck-usr.service... Oct 13 06:24:35.632891 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 13 06:24:35.632897 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 13 06:24:35.632904 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 13 06:24:35.632924 systemd-journald[424]: Collecting audit messages is disabled. Oct 13 06:24:35.632939 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. 
Update your scripts to load br_netfilter if you need this. Oct 13 06:24:35.632946 kernel: Bridge firewalling registered Oct 13 06:24:35.632954 systemd-journald[424]: Journal started Oct 13 06:24:35.632968 systemd-journald[424]: Runtime Journal (/run/log/journal/4b572d0868ea41278fa3b5e7844959f2) is 8M, max 640.1M, 632.1M free. Oct 13 06:24:35.634064 systemd-modules-load[426]: Inserted module 'br_netfilter' Oct 13 06:24:35.641610 systemd[1]: Started systemd-journald.service - Journal Service. Oct 13 06:24:35.641748 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Oct 13 06:24:35.641870 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 13 06:24:35.641966 systemd[1]: Finished systemd-fsck-usr.service. Oct 13 06:24:35.642054 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 13 06:24:35.643072 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 13 06:24:35.643461 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 13 06:24:35.643845 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 13 06:24:35.665077 systemd-tmpfiles[438]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Oct 13 06:24:35.665427 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 13 06:24:35.674926 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 13 06:24:35.823674 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:24:35.834714 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 13 06:24:35.857569 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Oct 13 06:24:35.873720 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 13 06:24:35.892432 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 13 06:24:35.908702 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 13 06:24:35.916063 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 13 06:24:35.916971 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Oct 13 06:24:35.922992 systemd-resolved[451]: Positive Trust Anchors: Oct 13 06:24:35.922996 systemd-resolved[451]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 13 06:24:35.922998 systemd-resolved[451]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Oct 13 06:24:35.923018 systemd-resolved[451]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 13 06:24:35.934148 systemd-resolved[451]: Defaulting to hostname 'linux'. 
Oct 13 06:24:36.063929 dracut-cmdline[467]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=4919840803704517a91afcb9d57d99e9935244ff049349c54216d9a31bc1da5d Oct 13 06:24:36.117914 kernel: Loading iSCSI transport class v2.0-870. Oct 13 06:24:36.117932 kernel: iscsi: registered transport (tcp) Oct 13 06:24:35.945092 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 13 06:24:35.963010 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 13 06:24:36.143850 kernel: iscsi: registered transport (qla4xxx) Oct 13 06:24:36.143863 kernel: QLogic iSCSI HBA Driver Oct 13 06:24:36.158289 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 13 06:24:36.201945 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 13 06:24:36.213925 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 13 06:24:36.250772 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Oct 13 06:24:36.269026 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Oct 13 06:24:36.279561 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Oct 13 06:24:36.329291 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Oct 13 06:24:36.339739 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 13 06:24:36.377108 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 13 06:24:36.388127 systemd-udevd[734]: Using default interface naming scheme 'v257'. Oct 13 06:24:36.394458 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 13 06:24:36.407953 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Oct 13 06:24:36.417937 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 13 06:24:36.469522 systemd-networkd[809]: lo: Link UP Oct 13 06:24:36.469524 systemd-networkd[809]: lo: Gained carrier Oct 13 06:24:36.469828 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 13 06:24:36.499049 dracut-pre-trigger[808]: rd.md=0: removing MD RAID activation Oct 13 06:24:36.489995 systemd[1]: Reached target network.target - Network. Oct 13 06:24:36.506214 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Oct 13 06:24:36.513855 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 13 06:24:36.655089 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 13 06:24:36.659782 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Oct 13 06:24:36.732321 kernel: pps_core: LinuxPPS API ver. 1 registered Oct 13 06:24:36.732343 kernel: cryptd: max_cpu_qlen set to 1000 Oct 13 06:24:36.732358 kernel: pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti Oct 13 06:24:36.732369 kernel: hid: raw HID events driver (C) Jiri Kosina Oct 13 06:24:36.732377 kernel: PTP clock support registered Oct 13 06:24:36.732385 kernel: AES CTR mode by8 optimization enabled Oct 13 06:24:36.732392 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:36.732547 kernel: usbcore: registered new interface driver usbhid Oct 13 06:24:36.732560 kernel: usbhid: USB HID core driver Oct 13 06:24:36.714007 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5200_MTFDDAK480TDN EFI-SYSTEM. Oct 13 06:24:36.780572 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:36.780746 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Oct 13 06:24:36.780762 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Oct 13 06:24:36.780779 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Oct 13 06:24:36.781500 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5200_MTFDDAK480TDN ROOT. Oct 13 06:24:36.820741 kernel: igb 0000:03:00.0: added PHC on eth0 Oct 13 06:24:36.820890 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Oct 13 06:24:36.820991 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Oct 13 06:24:36.821072 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:36.821155 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Oct 13 06:24:36.821164 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Oct 13 06:24:36.834935 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:76:c8 Oct 13 06:24:36.846591 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:36.859433 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Oct 13 06:24:36.891617 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Oct 13 06:24:36.892070 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Oct 13 06:24:36.899776 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:36.923823 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5200_MTFDDAK480TDN OEM. Oct 13 06:24:36.981927 kernel: igb 0000:04:00.0: added PHC on eth1 Oct 13 06:24:36.982031 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Oct 13 06:24:36.982114 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:76:c9 Oct 13 06:24:36.982194 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Oct 13 06:24:36.982271 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Oct 13 06:24:36.982350 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:36.977751 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5200_MTFDDAK480TDN USR-A. Oct 13 06:24:36.992416 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Oct 13 06:24:37.012931 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. 
Oct 13 06:24:37.072923 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Oct 13 06:24:37.073034 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Oct 13 06:24:37.073123 kernel: mlx5_core 0000:01:00.0: PTM is not supported by PCIe Oct 13 06:24:37.073207 kernel: mlx5_core 0000:01:00.0: firmware version: 14.29.2002 Oct 13 06:24:37.073284 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Oct 13 06:24:37.033940 systemd-networkd[809]: eth1: Interface name change detected, renamed to eno2. Oct 13 06:24:37.040321 systemd-networkd[809]: eth0: Interface name change detected, renamed to eno1. Oct 13 06:24:37.044487 systemd-networkd[809]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 13 06:24:37.049065 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 13 06:24:37.073510 systemd-networkd[809]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 13 06:24:37.081434 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Oct 13 06:24:37.100639 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Oct 13 06:24:37.178879 disk-uuid[978]: Primary Header is updated. Oct 13 06:24:37.178879 disk-uuid[978]: Secondary Entries is updated. Oct 13 06:24:37.178879 disk-uuid[978]: Secondary Header is updated. Oct 13 06:24:37.101949 systemd-networkd[809]: eno1: Link UP Oct 13 06:24:37.102025 systemd-networkd[809]: eno2: Link UP Oct 13 06:24:37.125878 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 13 06:24:37.125926 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:24:37.151960 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Oct 13 06:24:37.173350 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 13 06:24:37.187178 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Oct 13 06:24:37.264210 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:24:37.313858 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Oct 13 06:24:37.324957 kernel: mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged Oct 13 06:24:37.561887 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Oct 13 06:24:37.562448 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:37.582543 kernel: mlx5_core 0000:01:00.1: PTM is not supported by PCIe Oct 13 06:24:37.583080 kernel: mlx5_core 0000:01:00.1: firmware version: 14.29.2002 Oct 13 06:24:37.583491 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Oct 13 06:24:37.592853 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:37.866840 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Oct 13 06:24:37.879158 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Oct 13 06:24:38.134877 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Oct 13 06:24:38.135473 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:38.150846 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:38.156615 disk-uuid[979]: Warning: The kernel is still using the old partition table. 
Oct 13 06:24:38.156615 disk-uuid[979]: The new table will be used at the next reboot or after you Oct 13 06:24:38.156615 disk-uuid[979]: run partprobe(8) or kpartx(8) Oct 13 06:24:38.156615 disk-uuid[979]: The operation has completed successfully. Oct 13 06:24:38.198068 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Oct 13 06:24:38.198173 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0 Oct 13 06:24:38.158817 systemd-networkd[809]: eth1: Interface name change detected, renamed to enp1s0f1np1. Oct 13 06:24:38.166613 systemd-networkd[809]: eth0: Interface name change detected, renamed to enp1s0f0np0. Oct 13 06:24:38.172293 systemd-networkd[809]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 13 06:24:38.268889 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1045) Oct 13 06:24:38.268903 kernel: BTRFS info (device sda6): first mount of filesystem 56bbaf92-79f4-4948-a1fd-5992c383eba8 Oct 13 06:24:38.268912 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 13 06:24:38.268919 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 13 06:24:38.172459 systemd[1]: disk-uuid.service: Deactivated successfully. Oct 13 06:24:38.304042 kernel: BTRFS info (device sda6): turning on async discard Oct 13 06:24:38.304055 kernel: BTRFS info (device sda6): enabling free space tree Oct 13 06:24:38.304062 kernel: BTRFS info (device sda6): last unmount of filesystem 56bbaf92-79f4-4948-a1fd-5992c383eba8 Oct 13 06:24:38.172511 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Oct 13 06:24:38.193611 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Oct 13 06:24:38.300820 systemd[1]: Finished ignition-setup.service - Ignition (setup). Oct 13 06:24:38.355002 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Oct 13 06:24:38.313791 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Oct 13 06:24:38.333085 systemd-networkd[809]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 13 06:24:38.443540 ignition[1065]: Ignition 2.22.0 Oct 13 06:24:38.443547 ignition[1065]: Stage: fetch-offline Oct 13 06:24:38.447064 unknown[1065]: fetched base config from "system" Oct 13 06:24:38.443576 ignition[1065]: no configs at "/usr/lib/ignition/base.d" Oct 13 06:24:38.447068 unknown[1065]: fetched user config from "system" Oct 13 06:24:38.443585 ignition[1065]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:24:38.448168 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Oct 13 06:24:38.443646 ignition[1065]: parsed url from cmdline: "" Oct 13 06:24:38.449315 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Oct 13 06:24:38.443649 ignition[1065]: no config URL provided Oct 13 06:24:38.449852 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Oct 13 06:24:38.546000 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Oct 13 06:24:38.443654 ignition[1065]: reading system config file "/usr/lib/ignition/user.ign" Oct 13 06:24:38.542458 systemd-networkd[809]: enp1s0f0np0: Link UP Oct 13 06:24:38.443698 ignition[1065]: parsing config with SHA512: 70d87483f46bfad21c7b752a5d89dd41a1acacf6c6457e073e83766cd11c63fa1976503f0ab8f7c38eaa9b98c17f113c6d97da201995f71de71fee94a33c96aa Oct 13 06:24:38.542765 systemd-networkd[809]: enp1s0f1np1: Link UP Oct 13 06:24:38.447247 ignition[1065]: fetch-offline: fetch-offline passed Oct 13 06:24:38.543095 systemd-networkd[809]: enp1s0f0np0: Gained carrier Oct 13 06:24:38.447251 ignition[1065]: POST message to Packet Timeline Oct 13 06:24:38.554949 systemd-networkd[809]: enp1s0f1np1: Gained carrier Oct 13 06:24:38.447253 ignition[1065]: POST Status error: resource requires networking Oct 13 06:24:38.584022 systemd-networkd[809]: enp1s0f0np0: DHCPv4 address 139.178.94.13/31, gateway 139.178.94.12 acquired from 145.40.83.140 Oct 13 06:24:38.447285 ignition[1065]: Ignition finished successfully Oct 13 06:24:38.503961 ignition[1080]: Ignition 2.22.0 Oct 13 06:24:38.503966 ignition[1080]: Stage: kargs Oct 13 06:24:38.504075 ignition[1080]: no configs at "/usr/lib/ignition/base.d" Oct 13 06:24:38.504082 ignition[1080]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:24:38.504695 ignition[1080]: kargs: kargs passed Oct 13 06:24:38.504698 ignition[1080]: POST message to Packet Timeline Oct 13 06:24:38.504709 ignition[1080]: GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:24:38.505275 ignition[1080]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:49992->[::1]:53: read: connection refused Oct 13 06:24:38.706521 ignition[1080]: GET https://metadata.packet.net/metadata: attempt #2 Oct 13 06:24:38.707762 ignition[1080]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:34539->[::1]:53: read: connection refused Oct 13 06:24:39.108951 ignition[1080]: GET https://metadata.packet.net/metadata: attempt #3 Oct 13 06:24:39.110098 ignition[1080]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:45912->[::1]:53: read: connection refused Oct 13 06:24:39.910448 ignition[1080]: GET https://metadata.packet.net/metadata: attempt #4 Oct 13 06:24:39.911832 ignition[1080]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:37997->[::1]:53: read: connection refused Oct 13 06:24:40.105461 systemd-networkd[809]: enp1s0f1np1: Gained IPv6LL Oct 13 06:24:40.553323 systemd-networkd[809]: enp1s0f0np0: Gained IPv6LL Oct 13 06:24:41.512982 ignition[1080]: GET https://metadata.packet.net/metadata: attempt #5 Oct 13 06:24:41.514593 ignition[1080]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:42491->[::1]:53: read: connection refused Oct 13 06:24:44.717903 ignition[1080]: GET https://metadata.packet.net/metadata: attempt #6 Oct 13 06:24:45.845505 ignition[1080]: GET result: OK Oct 13 06:24:46.470766 ignition[1080]: Ignition finished successfully Oct 13 06:24:46.472870 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Oct 13 06:24:46.487540 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Oct 13 06:24:46.517595 ignition[1098]: Ignition 2.22.0 Oct 13 06:24:46.517600 ignition[1098]: Stage: disks Oct 13 06:24:46.517675 ignition[1098]: no configs at "/usr/lib/ignition/base.d" Oct 13 06:24:46.517680 ignition[1098]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:24:46.518129 ignition[1098]: disks: disks passed Oct 13 06:24:46.518131 ignition[1098]: POST message to Packet Timeline Oct 13 06:24:46.518139 ignition[1098]: GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:24:47.432744 ignition[1098]: GET result: OK Oct 13 06:24:48.426514 ignition[1098]: Ignition finished successfully Oct 13 06:24:48.430700 systemd[1]: Finished ignition-disks.service - Ignition (disks). Oct 13 06:24:48.456004 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:48.448220 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Oct 13 06:24:48.465982 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Oct 13 06:24:48.485038 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 13 06:24:48.505117 systemd[1]: Reached target sysinit.target - System Initialization. Oct 13 06:24:48.522116 systemd[1]: Reached target basic.target - Basic System. Oct 13 06:24:48.541855 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Oct 13 06:24:48.597634 systemd-fsck[1120]: ROOT: clean, 15/456736 files, 38230/456704 blocks Oct 13 06:24:48.606326 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Oct 13 06:24:48.621393 systemd[1]: Mounting sysroot.mount - /sysroot... Oct 13 06:24:48.736861 kernel: EXT4-fs (sda9): mounted filesystem c7d6ef00-6dd1-40b4-91f2-c4c5965e3cac r/w with ordered data mode. Quota mode: none. Oct 13 06:24:48.736978 systemd[1]: Mounted sysroot.mount - /sysroot. Oct 13 06:24:48.745216 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Oct 13 06:24:48.770429 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 13 06:24:48.778678 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Oct 13 06:24:48.811811 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1129) Oct 13 06:24:48.813267 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Oct 13 06:24:48.867041 kernel: BTRFS info (device sda6): first mount of filesystem 56bbaf92-79f4-4948-a1fd-5992c383eba8 Oct 13 06:24:48.867054 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 13 06:24:48.867062 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 13 06:24:48.867072 kernel: BTRFS info (device sda6): turning on async discard Oct 13 06:24:48.867080 kernel: BTRFS info (device sda6): enabling free space tree Oct 13 06:24:48.857430 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... Oct 13 06:24:48.867138 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Oct 13 06:24:48.867157 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Oct 13 06:24:48.896862 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Oct 13 06:24:48.920342 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Oct 13 06:24:48.936885 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Oct 13 06:24:48.969926 coreos-metadata[1131]: Oct 13 06:24:48.959 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Oct 13 06:24:48.982897 coreos-metadata[1147]: Oct 13 06:24:48.960 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Oct 13 06:24:49.005033 initrd-setup-root[1161]: cut: /sysroot/etc/passwd: No such file or directory Oct 13 06:24:49.014916 initrd-setup-root[1168]: cut: /sysroot/etc/group: No such file or directory Oct 13 06:24:49.023906 initrd-setup-root[1175]: cut: /sysroot/etc/shadow: No such file or directory Oct 13 06:24:49.032900 initrd-setup-root[1182]: cut: /sysroot/etc/gshadow: No such file or directory Oct 13 06:24:49.075927 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Oct 13 06:24:49.085811 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Oct 13 06:24:49.111186 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Oct 13 06:24:49.127671 systemd[1]: sysroot-oem.mount: Deactivated successfully. Oct 13 06:24:49.142026 kernel: BTRFS info (device sda6): last unmount of filesystem 56bbaf92-79f4-4948-a1fd-5992c383eba8 Oct 13 06:24:49.153273 ignition[1250]: INFO : Ignition 2.22.0 Oct 13 06:24:49.153273 ignition[1250]: INFO : Stage: mount Oct 13 06:24:49.166014 ignition[1250]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 13 06:24:49.166014 ignition[1250]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:24:49.166014 ignition[1250]: INFO : mount: mount passed Oct 13 06:24:49.166014 ignition[1250]: INFO : POST message to Packet Timeline Oct 13 06:24:49.166014 ignition[1250]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:24:49.163010 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Oct 13 06:24:49.992641 coreos-metadata[1131]: Oct 13 06:24:49.992 INFO Fetch successful Oct 13 06:24:50.056380 coreos-metadata[1147]: Oct 13 06:24:50.056 INFO Fetch successful Oct 13 06:24:50.073227 coreos-metadata[1131]: Oct 13 06:24:50.073 INFO wrote hostname ci-4487.0.0-a-becc29ce89 to /sysroot/etc/hostname Oct 13 06:24:50.074616 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Oct 13 06:24:50.098782 systemd[1]: flatcar-static-network.service: Deactivated successfully. Oct 13 06:24:50.098881 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. Oct 13 06:24:50.136690 ignition[1250]: INFO : GET result: OK Oct 13 06:24:50.484070 ignition[1250]: INFO : Ignition finished successfully Oct 13 06:24:50.488459 systemd[1]: Finished ignition-mount.service - Ignition (mount). Oct 13 06:24:50.504596 systemd[1]: Starting ignition-files.service - Ignition (files)... Oct 13 06:24:50.543766 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 13 06:24:50.593847 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1277) Oct 13 06:24:50.611293 kernel: BTRFS info (device sda6): first mount of filesystem 56bbaf92-79f4-4948-a1fd-5992c383eba8 Oct 13 06:24:50.611313 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 13 06:24:50.626792 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 13 06:24:50.626839 kernel: BTRFS info (device sda6): turning on async discard Oct 13 06:24:50.632944 kernel: BTRFS info (device sda6): enabling free space tree Oct 13 06:24:50.634609 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Oct 13 06:24:50.671753 ignition[1294]: INFO : Ignition 2.22.0 Oct 13 06:24:50.671753 ignition[1294]: INFO : Stage: files Oct 13 06:24:50.684068 ignition[1294]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 13 06:24:50.684068 ignition[1294]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:24:50.684068 ignition[1294]: DEBUG : files: compiled without relabeling support, skipping Oct 13 06:24:50.684068 ignition[1294]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Oct 13 06:24:50.684068 ignition[1294]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Oct 13 06:24:50.684068 ignition[1294]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Oct 13 06:24:50.684068 ignition[1294]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Oct 13 06:24:50.684068 ignition[1294]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Oct 13 06:24:50.684068 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Oct 13 06:24:50.684068 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Oct 13 06:24:50.676608 unknown[1294]: wrote ssh authorized keys file for user: core Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Oct 13 06:24:50.808983 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Oct 13 06:24:51.047034 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw: attempt #1 Oct 13 06:24:51.245369 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Oct 13 06:24:51.513347 ignition[1294]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Oct 13 06:24:51.513347 ignition[1294]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Oct 13 06:24:51.541023 ignition[1294]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Oct 13 06:24:51.541023 ignition[1294]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Oct 13 06:24:51.541023 ignition[1294]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Oct 13 06:24:51.541023 ignition[1294]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Oct 13 06:24:51.541023 ignition[1294]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Oct 13 06:24:51.541023 ignition[1294]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Oct 13 06:24:51.541023 ignition[1294]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Oct 13 06:24:51.541023 ignition[1294]: INFO : files: files passed Oct 13 06:24:51.541023 ignition[1294]: INFO : POST message to Packet Timeline Oct 13 06:24:51.541023 ignition[1294]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:24:52.958182 ignition[1294]: INFO : GET result: OK Oct 13 06:24:53.385702 ignition[1294]: INFO : Ignition finished successfully Oct 13 06:24:53.389409 systemd[1]: Finished ignition-files.service - Ignition (files). Oct 13 06:24:53.406296 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Oct 13 06:24:53.421450 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Oct 13 06:24:53.438824 systemd[1]: ignition-quench.service: Deactivated successfully. Oct 13 06:24:53.438918 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Oct 13 06:24:53.472355 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Oct 13 06:24:53.486170 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Oct 13 06:24:53.506959 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
Oct 13 06:24:53.534091 initrd-setup-root-after-ignition[1337]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Oct 13 06:24:53.534091 initrd-setup-root-after-ignition[1337]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Oct 13 06:24:53.560038 initrd-setup-root-after-ignition[1341]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Oct 13 06:24:53.613905 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Oct 13 06:24:53.613954 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Oct 13 06:24:53.631192 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Oct 13 06:24:53.656998 systemd[1]: Reached target initrd.target - Initrd Default Target. Oct 13 06:24:53.666423 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Oct 13 06:24:53.666946 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Oct 13 06:24:53.721622 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Oct 13 06:24:53.732825 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Oct 13 06:24:53.773628 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Oct 13 06:24:53.773706 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Oct 13 06:24:53.792433 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 13 06:24:53.812517 systemd[1]: Stopped target timers.target - Timer Units. Oct 13 06:24:53.830452 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Oct 13 06:24:53.830893 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Oct 13 06:24:53.866212 systemd[1]: Stopped target initrd.target - Initrd Default Target. Oct 13 06:24:53.875413 systemd[1]: Stopped target basic.target - Basic System. Oct 13 06:24:53.893420 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Oct 13 06:24:53.910410 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Oct 13 06:24:53.930416 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Oct 13 06:24:53.951410 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Oct 13 06:24:53.971414 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Oct 13 06:24:53.989530 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Oct 13 06:24:54.008577 systemd[1]: Stopped target sysinit.target - System Initialization. Oct 13 06:24:54.028562 systemd[1]: Stopped target local-fs.target - Local File Systems. Oct 13 06:24:54.047551 systemd[1]: Stopped target swap.target - Swaps. Oct 13 06:24:54.064076 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Oct 13 06:24:54.064162 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Oct 13 06:24:54.097971 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Oct 13 06:24:54.107102 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 13 06:24:54.126161 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Oct 13 06:24:54.126520 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 13 06:24:54.146294 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Oct 13 06:24:54.146698 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Oct 13 06:24:54.176514 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Oct 13 06:24:54.177002 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Oct 13 06:24:54.194546 systemd[1]: Stopped target paths.target - Path Units. Oct 13 06:24:54.210262 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Oct 13 06:24:54.210703 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 13 06:24:54.230413 systemd[1]: Stopped target slices.target - Slice Units. Oct 13 06:24:54.247421 systemd[1]: Stopped target sockets.target - Socket Units. Oct 13 06:24:54.264393 systemd[1]: iscsid.socket: Deactivated successfully. Oct 13 06:24:54.264690 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Oct 13 06:24:54.282567 systemd[1]: iscsiuio.socket: Deactivated successfully. Oct 13 06:24:54.282890 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 13 06:24:54.303686 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Oct 13 06:24:54.304132 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Oct 13 06:24:54.321403 systemd[1]: ignition-files.service: Deactivated successfully. Oct 13 06:24:54.441069 ignition[1361]: INFO : Ignition 2.22.0 Oct 13 06:24:54.441069 ignition[1361]: INFO : Stage: umount Oct 13 06:24:54.441069 ignition[1361]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 13 06:24:54.441069 ignition[1361]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:24:54.441069 ignition[1361]: INFO : umount: umount passed Oct 13 06:24:54.441069 ignition[1361]: INFO : POST message to Packet Timeline Oct 13 06:24:54.441069 ignition[1361]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:24:54.321761 systemd[1]: Stopped ignition-files.service - Ignition (files). Oct 13 06:24:54.337530 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Oct 13 06:24:54.337960 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Oct 13 06:24:54.356894 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Oct 13 06:24:54.369005 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Oct 13 06:24:54.369080 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Oct 13 06:24:54.392419 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Oct 13 06:24:54.406042 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Oct 13 06:24:54.406216 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 13 06:24:54.434019 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Oct 13 06:24:54.434115 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Oct 13 06:24:54.452092 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Oct 13 06:24:54.452174 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Oct 13 06:24:54.486686 systemd[1]: sysroot-boot.mount: Deactivated successfully. Oct 13 06:24:54.487914 systemd[1]: sysroot-boot.service: Deactivated successfully. Oct 13 06:24:54.488035 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Oct 13 06:24:54.506241 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Oct 13 06:24:54.506288 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Oct 13 06:24:55.474734 ignition[1361]: INFO : GET result: OK Oct 13 06:24:55.862897 ignition[1361]: INFO : Ignition finished successfully Oct 13 06:24:55.867599 systemd[1]: ignition-mount.service: Deactivated successfully. Oct 13 06:24:55.867949 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Oct 13 06:24:55.880083 systemd[1]: Stopped target network.target - Network. Oct 13 06:24:55.894125 systemd[1]: ignition-disks.service: Deactivated successfully. Oct 13 06:24:55.894340 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Oct 13 06:24:55.912170 systemd[1]: ignition-kargs.service: Deactivated successfully. Oct 13 06:24:55.912321 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Oct 13 06:24:55.928208 systemd[1]: ignition-setup.service: Deactivated successfully. Oct 13 06:24:55.928399 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Oct 13 06:24:55.944339 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Oct 13 06:24:55.944518 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Oct 13 06:24:55.961323 systemd[1]: initrd-setup-root.service: Deactivated successfully. Oct 13 06:24:55.961520 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Oct 13 06:24:55.978581 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Oct 13 06:24:55.995204 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Oct 13 06:24:56.002235 systemd[1]: systemd-resolved.service: Deactivated successfully. Oct 13 06:24:56.002288 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Oct 13 06:24:56.031125 systemd[1]: systemd-networkd.service: Deactivated successfully. Oct 13 06:24:56.031385 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Oct 13 06:24:56.050837 systemd[1]: Stopped target network-pre.target - Preparation for Network. Oct 13 06:24:56.065009 systemd[1]: systemd-networkd.socket: Deactivated successfully. Oct 13 06:24:56.065035 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Oct 13 06:24:56.093061 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Oct 13 06:24:56.101035 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Oct 13 06:24:56.101115 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 13 06:24:56.108149 systemd[1]: systemd-sysctl.service: Deactivated successfully. Oct 13 06:24:56.108225 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Oct 13 06:24:56.136041 systemd[1]: systemd-modules-load.service: Deactivated successfully. Oct 13 06:24:56.136082 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Oct 13 06:24:56.154060 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 13 06:24:56.173491 systemd[1]: systemd-udevd.service: Deactivated successfully. Oct 13 06:24:56.173643 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 13 06:24:56.195172 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Oct 13 06:24:56.195351 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Oct 13 06:24:56.211043 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Oct 13 06:24:56.211064 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. 
Oct 13 06:24:56.227137 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Oct 13 06:24:56.227197 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Oct 13 06:24:56.271962 systemd[1]: dracut-cmdline.service: Deactivated successfully. Oct 13 06:24:56.272030 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Oct 13 06:24:56.307991 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Oct 13 06:24:56.308178 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 13 06:24:56.338329 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Oct 13 06:24:56.361880 systemd[1]: systemd-network-generator.service: Deactivated successfully. Oct 13 06:24:56.361916 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Oct 13 06:24:56.361972 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Oct 13 06:24:56.361999 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 13 06:24:56.392094 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 13 06:24:56.392231 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:24:56.413072 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Oct 13 06:24:56.413290 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Oct 13 06:24:56.624995 systemd-journald[424]: Received SIGTERM from PID 1 (systemd). Oct 13 06:24:56.486663 systemd[1]: network-cleanup.service: Deactivated successfully. Oct 13 06:24:56.486950 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Oct 13 06:24:56.503977 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Oct 13 06:24:56.525059 systemd[1]: Starting initrd-switch-root.service - Switch Root... Oct 13 06:24:56.575838 systemd[1]: Switching root. Oct 13 06:24:56.666020 systemd-journald[424]: Journal stopped Oct 13 06:24:58.474974 kernel: SELinux: policy capability network_peer_controls=1 Oct 13 06:24:58.474990 kernel: SELinux: policy capability open_perms=1 Oct 13 06:24:58.474999 kernel: SELinux: policy capability extended_socket_class=1 Oct 13 06:24:58.475005 kernel: SELinux: policy capability always_check_network=0 Oct 13 06:24:58.475011 kernel: SELinux: policy capability cgroup_seclabel=1 Oct 13 06:24:58.475017 kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 13 06:24:58.475023 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Oct 13 06:24:58.475030 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Oct 13 06:24:58.475036 kernel: SELinux: policy capability userspace_initial_context=0 Oct 13 06:24:58.475042 kernel: audit: type=1403 audit(1760336696.859:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Oct 13 06:24:58.475050 systemd[1]: Successfully loaded SELinux policy in 93.837ms. Oct 13 06:24:58.475057 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.998ms. Oct 13 06:24:58.475065 systemd[1]: systemd 257.7 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 13 06:24:58.475072 systemd[1]: Detected architecture x86-64. 
Oct 13 06:24:58.475080 systemd[1]: Detected first boot. Oct 13 06:24:58.475086 systemd[1]: Hostname set to <ci-4487.0.0-a-becc29ce89>. Oct 13 06:24:58.475093 systemd[1]: Initializing machine ID from random generator. Oct 13 06:24:58.475101 zram_generator::config[1416]: No configuration found. Oct 13 06:24:58.475108 systemd[1]: Populated /etc with preset unit settings. Oct 13 06:24:58.475115 systemd[1]: initrd-switch-root.service: Deactivated successfully. Oct 13 06:24:58.475122 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Oct 13 06:24:58.475129 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Oct 13 06:24:58.475136 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Oct 13 06:24:58.475144 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Oct 13 06:24:58.475151 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Oct 13 06:24:58.475158 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Oct 13 06:24:58.475165 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Oct 13 06:24:58.475172 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Oct 13 06:24:58.475179 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Oct 13 06:24:58.475187 systemd[1]: Created slice user.slice - User and Session Slice. Oct 13 06:24:58.475194 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 13 06:24:58.475201 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 13 06:24:58.475208 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Oct 13 06:24:58.475215 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Oct 13 06:24:58.475222 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Oct 13 06:24:58.475231 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 13 06:24:58.475238 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1... Oct 13 06:24:58.475245 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 13 06:24:58.475252 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 13 06:24:58.475259 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Oct 13 06:24:58.475267 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Oct 13 06:24:58.475274 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Oct 13 06:24:58.475282 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Oct 13 06:24:58.475290 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 13 06:24:58.475297 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 13 06:24:58.475304 systemd[1]: Reached target slices.target - Slice Units. Oct 13 06:24:58.475311 systemd[1]: Reached target swap.target - Swaps. Oct 13 06:24:58.475318 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Oct 13 06:24:58.475326 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Oct 13 06:24:58.475333 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. 
Oct 13 06:24:58.475341 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Oct 13 06:24:58.475348 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 13 06:24:58.475356 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 13 06:24:58.475363 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Oct 13 06:24:58.475371 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Oct 13 06:24:58.475378 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Oct 13 06:24:58.475385 systemd[1]: Mounting media.mount - External Media Directory... Oct 13 06:24:58.475392 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:24:58.475400 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Oct 13 06:24:58.475407 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Oct 13 06:24:58.475415 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Oct 13 06:24:58.475423 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Oct 13 06:24:58.475430 systemd[1]: Reached target machines.target - Containers. Oct 13 06:24:58.475437 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Oct 13 06:24:58.475445 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 13 06:24:58.475453 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 13 06:24:58.475461 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Oct 13 06:24:58.475468 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 13 06:24:58.475475 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 13 06:24:58.475482 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 13 06:24:58.475489 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Oct 13 06:24:58.475496 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 13 06:24:58.475505 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Oct 13 06:24:58.475513 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Oct 13 06:24:58.475520 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Oct 13 06:24:58.475527 kernel: ACPI: bus type drm_connector registered Oct 13 06:24:58.475534 kernel: fuse: init (API version 7.41) Oct 13 06:24:58.475540 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Oct 13 06:24:58.475547 systemd[1]: Stopped systemd-fsck-usr.service. Oct 13 06:24:58.475556 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 13 06:24:58.475563 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 13 06:24:58.475571 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Oct 13 06:24:58.475578 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 13 06:24:58.475585 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Oct 13 06:24:58.475601 systemd-journald[1520]: Collecting audit messages is disabled. Oct 13 06:24:58.475618 systemd-journald[1520]: Journal started Oct 13 06:24:58.475632 systemd-journald[1520]: Runtime Journal (/run/log/journal/828a255675194b1782f6c5ced674f388) is 8M, max 640.1M, 632.1M free. Oct 13 06:24:57.337885 systemd[1]: Queued start job for default target multi-user.target. Oct 13 06:24:57.348588 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Oct 13 06:24:57.348910 systemd[1]: systemd-journald.service: Deactivated successfully. Oct 13 06:24:58.490876 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Oct 13 06:24:58.511866 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 13 06:24:58.536846 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:24:58.544853 systemd[1]: Started systemd-journald.service - Journal Service. Oct 13 06:24:58.554462 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Oct 13 06:24:58.562990 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Oct 13 06:24:58.571928 systemd[1]: Mounted media.mount - External Media Directory. Oct 13 06:24:58.581101 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Oct 13 06:24:58.591075 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Oct 13 06:24:58.600076 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Oct 13 06:24:58.610155 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Oct 13 06:24:58.620159 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 13 06:24:58.630115 systemd[1]: modprobe@configfs.service: Deactivated successfully. Oct 13 06:24:58.630222 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Oct 13 06:24:58.640137 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 13 06:24:58.640262 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 13 06:24:58.650214 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 13 06:24:58.650368 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 13 06:24:58.660301 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 13 06:24:58.660497 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 13 06:24:58.672394 systemd[1]: modprobe@fuse.service: Deactivated successfully. Oct 13 06:24:58.672670 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Oct 13 06:24:58.682681 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 13 06:24:58.683154 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 13 06:24:58.692819 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 13 06:24:58.703972 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 13 06:24:58.717507 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. 
Oct 13 06:24:58.729995 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Oct 13 06:24:58.742467 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 13 06:24:58.775700 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 13 06:24:58.786384 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Oct 13 06:24:58.800650 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Oct 13 06:24:58.822232 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Oct 13 06:24:58.831006 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Oct 13 06:24:58.831036 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 13 06:24:58.831858 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Oct 13 06:24:58.851094 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 13 06:24:58.852674 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Oct 13 06:24:58.871537 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Oct 13 06:24:58.881921 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 13 06:24:58.892025 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Oct 13 06:24:58.894519 systemd-journald[1520]: Time spent on flushing to /var/log/journal/828a255675194b1782f6c5ced674f388 is 12.283ms for 1402 entries. Oct 13 06:24:58.894519 systemd-journald[1520]: System Journal (/var/log/journal/828a255675194b1782f6c5ced674f388) is 8M, max 163.5M, 155.5M free. Oct 13 06:24:58.916718 systemd-journald[1520]: Received client request to flush runtime journal. Oct 13 06:24:58.908931 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 13 06:24:58.919082 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 13 06:24:58.936105 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Oct 13 06:24:58.964175 systemd[1]: Starting systemd-sysusers.service - Create System Users... Oct 13 06:24:58.975870 kernel: loop1: detected capacity change from 0 to 110984 Oct 13 06:24:58.979933 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Oct 13 06:24:58.989982 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Oct 13 06:24:59.000465 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Oct 13 06:24:59.013846 kernel: loop2: detected capacity change from 0 to 8 Oct 13 06:24:59.016054 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Oct 13 06:24:59.026138 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 13 06:24:59.035140 systemd[1]: Finished systemd-sysusers.service - Create System Users. Oct 13 06:24:59.045306 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Oct 13 06:24:59.054888 kernel: loop3: detected capacity change from 0 to 128048 Oct 13 06:24:59.061561 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... 
Oct 13 06:24:59.081311 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 13 06:24:59.090535 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 13 06:24:59.100857 kernel: loop4: detected capacity change from 0 to 219144 Oct 13 06:24:59.107199 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Oct 13 06:24:59.118504 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Oct 13 06:24:59.122904 systemd-tmpfiles[1573]: ACLs are not supported, ignoring. Oct 13 06:24:59.122913 systemd-tmpfiles[1573]: ACLs are not supported, ignoring. Oct 13 06:24:59.127255 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 13 06:24:59.138242 systemd[1]: Started systemd-userdbd.service - User Database Manager. Oct 13 06:24:59.155875 kernel: loop5: detected capacity change from 0 to 110984 Oct 13 06:24:59.170093 systemd-resolved[1572]: Positive Trust Anchors: Oct 13 06:24:59.170102 systemd-resolved[1572]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 13 06:24:59.170104 systemd-resolved[1572]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Oct 13 06:24:59.170126 systemd-resolved[1572]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 13 06:24:59.171865 kernel: loop6: detected capacity change from 0 to 8 Oct 13 06:24:59.178885 kernel: loop7: detected capacity change from 0 to 128048 Oct 13 06:24:59.211251 systemd-resolved[1572]: Using system hostname 'ci-4487.0.0-a-becc29ce89'. Oct 13 06:24:59.212052 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 13 06:24:59.226152 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 13 06:24:59.226836 kernel: loop1: detected capacity change from 0 to 219144 Oct 13 06:24:59.236075 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Oct 13 06:24:59.239770 (sd-merge)[1585]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-packet.raw'. Oct 13 06:24:59.241538 (sd-merge)[1585]: Merged extensions into '/usr'. Oct 13 06:24:59.248607 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 13 06:24:59.259146 systemd[1]: Reload requested from client PID 1556 ('systemd-sysext') (unit systemd-sysext.service)... Oct 13 06:24:59.259154 systemd[1]: Reloading... Oct 13 06:24:59.286842 systemd-udevd[1587]: Using default interface naming scheme 'v257'. Oct 13 06:24:59.291882 zram_generator::config[1614]: No configuration found. 
Oct 13 06:24:59.339815 kernel: mousedev: PS/2 mouse device common for all mice Oct 13 06:24:59.339872 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Oct 13 06:24:59.348812 kernel: ACPI: button: Sleep Button [SLPB] Oct 13 06:24:59.352817 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.353030 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Oct 13 06:24:59.376817 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Oct 13 06:24:59.377124 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Oct 13 06:24:59.377440 kernel: ACPI: button: Power Button [PWRF] Oct 13 06:24:59.377456 kernel: IPMI message handler: version 39.2 Oct 13 06:24:59.377467 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.409816 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.422820 kernel: ipmi device interface Oct 13 06:24:59.422903 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Oct 13 06:24:59.435859 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Oct 13 06:24:59.436439 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.443826 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.484815 kernel: iTCO_vendor_support: vendor-support=0 Oct 13 06:24:59.485786 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Oct 13 06:24:59.485920 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped. Oct 13 06:24:59.486027 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5200_MTFDDAK480TDN OEM. Oct 13 06:24:59.493648 kernel: ipmi_si: IPMI System Interface driver Oct 13 06:24:59.493675 kernel: MACsec IEEE 802.1AE Oct 13 06:24:59.493688 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Oct 13 06:24:59.500848 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Oct 13 06:24:59.500873 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Oct 13 06:24:59.500886 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Oct 13 06:24:59.501003 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.539252 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Oct 13 06:24:59.549367 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Oct 13 06:24:59.556446 kernel: ipmi_si: Adding ACPI-specified kcs state machine Oct 13 06:24:59.567683 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Oct 13 06:24:59.567708 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.578207 systemd[1]: Reloading finished in 318 ms. Oct 13 06:24:59.597891 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Oct 13 06:24:59.598133 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0) Oct 13 06:24:59.598811 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.606839 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. 
Oct 13 06:24:59.640837 kernel: intel_rapl_common: Found RAPL domain package Oct 13 06:24:59.640888 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.641011 kernel: intel_rapl_common: Found RAPL domain core Oct 13 06:24:59.641026 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Oct 13 06:24:59.646549 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 13 06:24:59.663815 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Oct 13 06:24:59.664158 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.664431 kernel: intel_rapl_common: Found RAPL domain dram Oct 13 06:24:59.683836 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:24:59.699810 kernel: ipmi_ssif: IPMI SSIF Interface driver Oct 13 06:24:59.705125 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Oct 13 06:24:59.736681 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Oct 13 06:24:59.761470 systemd[1]: Starting ensure-sysext.service... Oct 13 06:24:59.767428 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Oct 13 06:24:59.780762 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 13 06:24:59.790071 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 13 06:24:59.791322 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 13 06:24:59.802212 systemd[1]: Reload requested from client PID 1776 ('systemctl') (unit ensure-sysext.service)... Oct 13 06:24:59.802226 systemd[1]: Reloading... Oct 13 06:24:59.819421 systemd-tmpfiles[1780]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Oct 13 06:24:59.819439 systemd-tmpfiles[1780]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Oct 13 06:24:59.819598 systemd-tmpfiles[1780]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Oct 13 06:24:59.819756 systemd-tmpfiles[1780]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Oct 13 06:24:59.820276 systemd-tmpfiles[1780]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Oct 13 06:24:59.820437 systemd-tmpfiles[1780]: ACLs are not supported, ignoring. Oct 13 06:24:59.820471 systemd-tmpfiles[1780]: ACLs are not supported, ignoring. Oct 13 06:24:59.823308 systemd-tmpfiles[1780]: Detected autofs mount point /boot during canonicalization of boot. Oct 13 06:24:59.823312 systemd-tmpfiles[1780]: Skipping /boot Oct 13 06:24:59.827116 systemd-tmpfiles[1780]: Detected autofs mount point /boot during canonicalization of boot. Oct 13 06:24:59.827120 systemd-tmpfiles[1780]: Skipping /boot Oct 13 06:24:59.841613 systemd-networkd[1778]: lo: Link UP Oct 13 06:24:59.841617 systemd-networkd[1778]: lo: Gained carrier Oct 13 06:24:59.846820 zram_generator::config[1817]: No configuration found. Oct 13 06:24:59.849013 systemd-networkd[1778]: enp1s0f0np0: Configuring with /etc/systemd/network/10-b8:59:9f:e1:64:26.network. Oct 13 06:24:59.861630 systemd-networkd[1778]: enp1s0f1np1: Configuring with /etc/systemd/network/10-b8:59:9f:e1:64:27.network. 
Oct 13 06:24:59.862331 systemd-networkd[1778]: bond0: netdev ready Oct 13 06:25:00.313961 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Oct 13 06:25:00.326847 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Oct 13 06:25:00.489921 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Oct 13 06:25:00.501731 systemd-networkd[1778]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Oct 13 06:25:00.501894 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Oct 13 06:25:00.513866 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Oct 13 06:25:00.513827 systemd-networkd[1778]: enp1s0f0np0: Link UP Oct 13 06:25:00.517846 systemd-networkd[1778]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-b8:59:9f:e1:64:26.network. Oct 13 06:25:00.518261 systemd-networkd[1778]: enp1s0f1np1: Link UP Oct 13 06:25:00.519055 systemd-networkd[1778]: enp1s0f0np0: Gained carrier Oct 13 06:25:00.532217 systemd-networkd[1778]: enp1s0f1np1: Gained carrier Oct 13 06:25:00.542197 systemd-networkd[1778]: bond0: Link UP Oct 13 06:25:00.542677 systemd-networkd[1778]: bond0: Gained carrier Oct 13 06:25:00.607342 systemd[1]: Reloading finished in 804 ms. Oct 13 06:25:00.623178 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 25000 Mbps full duplex Oct 13 06:25:00.623203 kernel: bond0: active interface up! Oct 13 06:25:00.633424 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 13 06:25:00.654058 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Oct 13 06:25:00.664212 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 13 06:25:00.674929 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:25:00.688940 systemd[1]: Reached target network.target - Network. Oct 13 06:25:00.696636 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 13 06:25:00.713106 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Oct 13 06:25:00.724686 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Oct 13 06:25:00.734765 augenrules[1902]: No rules Oct 13 06:25:00.739852 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Oct 13 06:25:00.755128 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Oct 13 06:25:00.772212 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Oct 13 06:25:00.782621 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Oct 13 06:25:00.806196 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Oct 13 06:25:00.817080 systemd[1]: audit-rules.service: Deactivated successfully. Oct 13 06:25:00.817199 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 13 06:25:00.826168 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Oct 13 06:25:00.836177 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Oct 13 06:25:00.849432 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Oct 13 06:25:00.849599 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 13 06:25:00.858417 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 13 06:25:00.876178 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 13 06:25:00.896124 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 13 06:25:00.904936 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 13 06:25:00.905056 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 13 06:25:00.905170 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 13 06:25:00.905265 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:25:00.914960 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Oct 13 06:25:00.925398 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 13 06:25:00.925498 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 13 06:25:00.935229 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 13 06:25:00.935323 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 13 06:25:00.945218 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 13 06:25:00.945308 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 13 06:25:00.954962 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Oct 13 06:25:00.967664 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:25:00.967777 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 13 06:25:00.968563 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 13 06:25:00.988136 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 13 06:25:01.015089 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 13 06:25:01.023940 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 13 06:25:01.024010 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 13 06:25:01.024073 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 13 06:25:01.024121 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Oct 13 06:25:01.024900 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 13 06:25:01.025000 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 13 06:25:01.035148 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 13 06:25:01.035242 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 13 06:25:01.045109 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 13 06:25:01.045198 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 13 06:25:01.056492 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:25:01.057248 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 13 06:25:01.063982 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 13 06:25:01.079265 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 13 06:25:01.085399 augenrules[1931]: /sbin/augenrules: No change Oct 13 06:25:01.088593 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 13 06:25:01.088642 augenrules[1949]: No rules Oct 13 06:25:01.097530 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 13 06:25:01.107502 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 13 06:25:01.115967 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 13 06:25:01.116037 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 13 06:25:01.116113 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 13 06:25:01.116166 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:25:01.117157 systemd[1]: audit-rules.service: Deactivated successfully. Oct 13 06:25:01.117269 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 13 06:25:01.126138 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 13 06:25:01.126230 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 13 06:25:01.126428 ldconfig[1895]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Oct 13 06:25:01.137170 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Oct 13 06:25:01.146092 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 13 06:25:01.146185 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 13 06:25:01.155086 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 13 06:25:01.155175 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 13 06:25:01.165079 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 13 06:25:01.165168 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 13 06:25:01.175089 systemd[1]: Finished ensure-sysext.service. 
Oct 13 06:25:01.183773 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 13 06:25:01.183799 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 13 06:25:01.184694 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Oct 13 06:25:01.193477 systemd[1]: Starting systemd-update-done.service - Update is Completed... Oct 13 06:25:01.219272 systemd[1]: Finished systemd-update-done.service - Update is Completed. Oct 13 06:25:01.241465 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Oct 13 06:25:01.252001 systemd[1]: Reached target sysinit.target - System Initialization. Oct 13 06:25:01.260947 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Oct 13 06:25:01.271889 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Oct 13 06:25:01.281882 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Oct 13 06:25:01.291898 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Oct 13 06:25:01.301876 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Oct 13 06:25:01.301900 systemd[1]: Reached target paths.target - Path Units. Oct 13 06:25:01.308879 systemd[1]: Reached target time-set.target - System Time Set. Oct 13 06:25:01.317951 systemd[1]: Started logrotate.timer - Daily rotation of log files. Oct 13 06:25:01.326932 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Oct 13 06:25:01.336875 systemd[1]: Reached target timers.target - Timer Units. Oct 13 06:25:01.344513 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Oct 13 06:25:01.355550 systemd[1]: Starting docker.socket - Docker Socket for the API... Oct 13 06:25:01.363758 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Oct 13 06:25:01.380134 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Oct 13 06:25:01.389120 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Oct 13 06:25:01.399227 systemd[1]: Listening on docker.socket - Docker Socket for the API. Oct 13 06:25:01.408460 systemd[1]: Reached target sockets.target - Socket Units. Oct 13 06:25:01.416879 systemd[1]: Reached target basic.target - Basic System. Oct 13 06:25:01.423906 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Oct 13 06:25:01.423928 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Oct 13 06:25:01.424524 systemd[1]: Starting containerd.service - containerd container runtime... Oct 13 06:25:01.443232 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Oct 13 06:25:01.460024 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Oct 13 06:25:01.467465 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Oct 13 06:25:01.472094 coreos-metadata[1973]: Oct 13 06:25:01.472 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Oct 13 06:25:01.485108 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... 
Oct 13 06:25:01.508007 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Oct 13 06:25:01.509991 jq[1979]: false Oct 13 06:25:01.516918 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Oct 13 06:25:01.517509 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Oct 13 06:25:01.522466 extend-filesystems[1980]: Found /dev/sda6 Oct 13 06:25:01.525999 extend-filesystems[1980]: Found /dev/sda9 Oct 13 06:25:01.525999 extend-filesystems[1980]: Checking size of /dev/sda9 Oct 13 06:25:01.559013 kernel: EXT4-fs (sda9): resizing filesystem from 456704 to 115622609 blocks Oct 13 06:25:01.553456 oslogin_cache_refresh[1981]: Refreshing passwd entry cache Oct 13 06:25:01.559168 extend-filesystems[1980]: Resized partition /dev/sda9 Oct 13 06:25:01.526519 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Oct 13 06:25:01.568037 google_oslogin_nss_cache[1981]: oslogin_cache_refresh[1981]: Refreshing passwd entry cache Oct 13 06:25:01.568165 extend-filesystems[1992]: resize2fs 1.47.3 (8-Jul-2025) Oct 13 06:25:01.549508 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Oct 13 06:25:01.559595 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Oct 13 06:25:01.584453 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Oct 13 06:25:01.602314 systemd[1]: Starting systemd-logind.service - User Login Management... Oct 13 06:25:01.611491 systemd[1]: Starting tcsd.service - TCG Core Services Daemon... Oct 13 06:25:01.620223 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Oct 13 06:25:01.620598 systemd[1]: Starting update-engine.service - Update Engine... Oct 13 06:25:01.628423 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Oct 13 06:25:01.635903 update_engine[2017]: I20251013 06:25:01.635818 2017 main.cc:92] Flatcar Update Engine starting Oct 13 06:25:01.638351 systemd-logind[2006]: Watching system buttons on /dev/input/event3 (Power Button) Oct 13 06:25:01.638362 systemd-logind[2006]: Watching system buttons on /dev/input/event2 (Sleep Button) Oct 13 06:25:01.638373 systemd-logind[2006]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Oct 13 06:25:01.638608 systemd-logind[2006]: New seat seat0. Oct 13 06:25:01.639347 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Oct 13 06:25:01.640234 jq[2018]: true Oct 13 06:25:01.649931 systemd[1]: Started systemd-logind.service - User Login Management. Oct 13 06:25:01.659070 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Oct 13 06:25:01.659176 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Oct 13 06:25:01.659316 systemd[1]: motdgen.service: Deactivated successfully. Oct 13 06:25:01.659416 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Oct 13 06:25:01.669411 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Oct 13 06:25:01.669514 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Oct 13 06:25:01.689871 jq[2021]: true Oct 13 06:25:01.690244 (ntainerd)[2022]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Oct 13 06:25:01.708388 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Oct 13 06:25:01.708503 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped. Oct 13 06:25:01.708671 tar[2020]: linux-amd64/LICENSE Oct 13 06:25:01.708820 tar[2020]: linux-amd64/helm Oct 13 06:25:01.741316 dbus-daemon[1974]: [system] SELinux support is enabled Oct 13 06:25:01.741481 systemd[1]: Started dbus.service - D-Bus System Message Bus. Oct 13 06:25:01.741739 bash[2049]: Updated "/home/core/.ssh/authorized_keys" Oct 13 06:25:01.743374 update_engine[2017]: I20251013 06:25:01.743346 2017 update_check_scheduler.cc:74] Next update check in 11m1s Oct 13 06:25:01.751590 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Oct 13 06:25:01.762961 dbus-daemon[1974]: [system] Successfully activated service 'org.freedesktop.systemd1' Oct 13 06:25:01.763312 systemd[1]: Starting sshkeys.service... Oct 13 06:25:01.768875 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Oct 13 06:25:01.768894 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Oct 13 06:25:01.778892 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Oct 13 06:25:01.778909 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Oct 13 06:25:01.792917 systemd[1]: Started update-engine.service - Update Engine. Oct 13 06:25:01.802939 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Oct 13 06:25:01.813887 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Oct 13 06:25:01.833274 sshd_keygen[2009]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Oct 13 06:25:01.841627 systemd[1]: Started locksmithd.service - Cluster reboot manager. Oct 13 06:25:01.851757 coreos-metadata[2059]: Oct 13 06:25:01.851 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Oct 13 06:25:01.856955 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Oct 13 06:25:01.868104 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Oct 13 06:25:01.869944 containerd[2022]: time="2025-10-13T06:25:01Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Oct 13 06:25:01.870262 containerd[2022]: time="2025-10-13T06:25:01.870247734Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Oct 13 06:25:01.875771 containerd[2022]: time="2025-10-13T06:25:01.875732239Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="16.765µs" Oct 13 06:25:01.875841 containerd[2022]: time="2025-10-13T06:25:01.875813115Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Oct 13 06:25:01.875841 containerd[2022]: time="2025-10-13T06:25:01.875836298Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Oct 13 06:25:01.875971 containerd[2022]: time="2025-10-13T06:25:01.875960723Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Oct 13 06:25:01.875995 containerd[2022]: time="2025-10-13T06:25:01.875972494Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Oct 13 06:25:01.876010 containerd[2022]: time="2025-10-13T06:25:01.875993603Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876046 containerd[2022]: time="2025-10-13T06:25:01.876037757Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876067 containerd[2022]: time="2025-10-13T06:25:01.876046283Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876281 containerd[2022]: time="2025-10-13T06:25:01.876264109Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876281 containerd[2022]: time="2025-10-13T06:25:01.876275322Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876333 containerd[2022]: time="2025-10-13T06:25:01.876282922Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876333 containerd[2022]: time="2025-10-13T06:25:01.876289198Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876378 containerd[2022]: time="2025-10-13T06:25:01.876340191Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876537 containerd[2022]: time="2025-10-13T06:25:01.876528238Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876555 containerd[2022]: time="2025-10-13T06:25:01.876546288Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" 
id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Oct 13 06:25:01.876555 containerd[2022]: time="2025-10-13T06:25:01.876552265Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Oct 13 06:25:01.876585 containerd[2022]: time="2025-10-13T06:25:01.876571441Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Oct 13 06:25:01.876726 containerd[2022]: time="2025-10-13T06:25:01.876717829Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Oct 13 06:25:01.876758 containerd[2022]: time="2025-10-13T06:25:01.876751286Z" level=info msg="metadata content store policy set" policy=shared Oct 13 06:25:01.881100 locksmithd[2066]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Oct 13 06:25:01.887137 containerd[2022]: time="2025-10-13T06:25:01.887076586Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Oct 13 06:25:01.887137 containerd[2022]: time="2025-10-13T06:25:01.887106800Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Oct 13 06:25:01.887137 containerd[2022]: time="2025-10-13T06:25:01.887119418Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Oct 13 06:25:01.887137 containerd[2022]: time="2025-10-13T06:25:01.887131424Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Oct 13 06:25:01.887201 containerd[2022]: time="2025-10-13T06:25:01.887140259Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Oct 13 06:25:01.887201 containerd[2022]: time="2025-10-13T06:25:01.887146241Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Oct 13 06:25:01.887201 containerd[2022]: time="2025-10-13T06:25:01.887156035Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Oct 13 06:25:01.887201 containerd[2022]: time="2025-10-13T06:25:01.887163551Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Oct 13 06:25:01.887201 containerd[2022]: time="2025-10-13T06:25:01.887172776Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Oct 13 06:25:01.887201 containerd[2022]: time="2025-10-13T06:25:01.887183289Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Oct 13 06:25:01.887201 containerd[2022]: time="2025-10-13T06:25:01.887189223Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Oct 13 06:25:01.887201 containerd[2022]: time="2025-10-13T06:25:01.887199008Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Oct 13 06:25:01.887301 containerd[2022]: time="2025-10-13T06:25:01.887261167Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Oct 13 06:25:01.887301 containerd[2022]: time="2025-10-13T06:25:01.887280676Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Oct 13 06:25:01.887301 containerd[2022]: time="2025-10-13T06:25:01.887295737Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content 
type=io.containerd.grpc.v1 Oct 13 06:25:01.887337 containerd[2022]: time="2025-10-13T06:25:01.887302986Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Oct 13 06:25:01.887337 containerd[2022]: time="2025-10-13T06:25:01.887308580Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Oct 13 06:25:01.887337 containerd[2022]: time="2025-10-13T06:25:01.887313919Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Oct 13 06:25:01.887337 containerd[2022]: time="2025-10-13T06:25:01.887319551Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Oct 13 06:25:01.887337 containerd[2022]: time="2025-10-13T06:25:01.887324797Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Oct 13 06:25:01.887337 containerd[2022]: time="2025-10-13T06:25:01.887331531Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Oct 13 06:25:01.887415 containerd[2022]: time="2025-10-13T06:25:01.887338071Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Oct 13 06:25:01.887415 containerd[2022]: time="2025-10-13T06:25:01.887343574Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Oct 13 06:25:01.887415 containerd[2022]: time="2025-10-13T06:25:01.887384428Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Oct 13 06:25:01.887415 containerd[2022]: time="2025-10-13T06:25:01.887392430Z" level=info msg="Start snapshots syncer" Oct 13 06:25:01.887415 containerd[2022]: time="2025-10-13T06:25:01.887405800Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Oct 13 06:25:01.887611 containerd[2022]: time="2025-10-13T06:25:01.887564003Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Oct 13 06:25:01.887611 containerd[2022]: time="2025-10-13T06:25:01.887594036Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Oct 13 06:25:01.887690 containerd[2022]: time="2025-10-13T06:25:01.887635548Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Oct 13 06:25:01.887705 containerd[2022]: time="2025-10-13T06:25:01.887686394Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Oct 13 06:25:01.887718 containerd[2022]: time="2025-10-13T06:25:01.887708357Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Oct 13 06:25:01.887718 containerd[2022]: time="2025-10-13T06:25:01.887715359Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Oct 13 06:25:01.887745 containerd[2022]: time="2025-10-13T06:25:01.887723675Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Oct 13 06:25:01.887745 containerd[2022]: time="2025-10-13T06:25:01.887734100Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Oct 13 06:25:01.887745 containerd[2022]: time="2025-10-13T06:25:01.887740504Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Oct 13 06:25:01.887784 containerd[2022]: time="2025-10-13T06:25:01.887748475Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Oct 13 06:25:01.887784 containerd[2022]: time="2025-10-13T06:25:01.887763369Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Oct 13 06:25:01.887784 containerd[2022]: 
time="2025-10-13T06:25:01.887773252Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Oct 13 06:25:01.887828 containerd[2022]: time="2025-10-13T06:25:01.887785367Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Oct 13 06:25:01.887828 containerd[2022]: time="2025-10-13T06:25:01.887802230Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 13 06:25:01.887828 containerd[2022]: time="2025-10-13T06:25:01.887818008Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 13 06:25:01.887828 containerd[2022]: time="2025-10-13T06:25:01.887823776Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 13 06:25:01.887882 containerd[2022]: time="2025-10-13T06:25:01.887829199Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 13 06:25:01.887882 containerd[2022]: time="2025-10-13T06:25:01.887833596Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Oct 13 06:25:01.887882 containerd[2022]: time="2025-10-13T06:25:01.887838875Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Oct 13 06:25:01.887882 containerd[2022]: time="2025-10-13T06:25:01.887844126Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Oct 13 06:25:01.887882 containerd[2022]: time="2025-10-13T06:25:01.887855599Z" level=info msg="runtime interface created" Oct 13 06:25:01.887882 containerd[2022]: time="2025-10-13T06:25:01.887858859Z" level=info msg="created NRI interface" Oct 13 06:25:01.887882 containerd[2022]: time="2025-10-13T06:25:01.887868247Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Oct 13 06:25:01.887882 containerd[2022]: time="2025-10-13T06:25:01.887876614Z" level=info msg="Connect containerd service" Oct 13 06:25:01.887983 containerd[2022]: time="2025-10-13T06:25:01.887895376Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Oct 13 06:25:01.888364 containerd[2022]: time="2025-10-13T06:25:01.888325057Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Oct 13 06:25:01.890010 tar[2020]: linux-amd64/README.md Oct 13 06:25:01.894618 systemd[1]: issuegen.service: Deactivated successfully. Oct 13 06:25:01.894737 systemd[1]: Finished issuegen.service - Generate /run/issue. Oct 13 06:25:01.904273 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Oct 13 06:25:01.915128 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Oct 13 06:25:01.929125 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Oct 13 06:25:01.940174 systemd[1]: Started getty@tty1.service - Getty on tty1. Oct 13 06:25:01.948576 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Oct 13 06:25:01.960011 systemd[1]: Reached target getty.target - Login Prompts. 
Oct 13 06:25:01.992596 containerd[2022]: time="2025-10-13T06:25:01.992543269Z" level=info msg="Start subscribing containerd event" Oct 13 06:25:01.992596 containerd[2022]: time="2025-10-13T06:25:01.992577439Z" level=info msg="Start recovering state" Oct 13 06:25:01.992669 containerd[2022]: time="2025-10-13T06:25:01.992605149Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Oct 13 06:25:01.992669 containerd[2022]: time="2025-10-13T06:25:01.992633905Z" level=info msg="Start event monitor" Oct 13 06:25:01.992669 containerd[2022]: time="2025-10-13T06:25:01.992644586Z" level=info msg="Start cni network conf syncer for default" Oct 13 06:25:01.992669 containerd[2022]: time="2025-10-13T06:25:01.992648993Z" level=info msg="Start streaming server" Oct 13 06:25:01.992669 containerd[2022]: time="2025-10-13T06:25:01.992635147Z" level=info msg=serving... address=/run/containerd/containerd.sock Oct 13 06:25:01.992669 containerd[2022]: time="2025-10-13T06:25:01.992656560Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Oct 13 06:25:01.992753 containerd[2022]: time="2025-10-13T06:25:01.992675146Z" level=info msg="runtime interface starting up..." Oct 13 06:25:01.992753 containerd[2022]: time="2025-10-13T06:25:01.992679076Z" level=info msg="starting plugins..." Oct 13 06:25:01.992753 containerd[2022]: time="2025-10-13T06:25:01.992688201Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Oct 13 06:25:01.992790 containerd[2022]: time="2025-10-13T06:25:01.992753732Z" level=info msg="containerd successfully booted in 0.123056s" Oct 13 06:25:01.992821 systemd[1]: Started containerd.service - containerd container runtime. Oct 13 06:25:02.046836 kernel: EXT4-fs (sda9): resized filesystem to 115622609 Oct 13 06:25:02.073220 extend-filesystems[1992]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Oct 13 06:25:02.073220 extend-filesystems[1992]: old_desc_blocks = 1, new_desc_blocks = 56 Oct 13 06:25:02.073220 extend-filesystems[1992]: The filesystem on /dev/sda9 is now 115622609 (4k) blocks long. Oct 13 06:25:02.109886 extend-filesystems[1980]: Resized filesystem in /dev/sda9 Oct 13 06:25:02.073930 systemd[1]: extend-filesystems.service: Deactivated successfully. Oct 13 06:25:02.074062 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Oct 13 06:25:02.312870 systemd-networkd[1778]: bond0: Gained IPv6LL Oct 13 06:25:02.314280 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Oct 13 06:25:02.324405 systemd[1]: Reached target network-online.target - Network is Online. Oct 13 06:25:02.333933 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:25:02.350111 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Oct 13 06:25:02.370138 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Oct 13 06:25:03.067679 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
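
The EXT4 messages above are the extend-filesystems unit growing the root filesystem online to fill /dev/sda9 (115622609 4k blocks). A rough manual equivalent, assuming the partition itself has already been enlarged:

    # Online grow of the mounted ext4 root to the full partition size,
    # followed by a read-back of the new capacity.
    sudo resize2fs /dev/sda9
    df -h /
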
Oct 13 06:25:03.078324 (kubelet)[2133]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 13 06:25:03.296838 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Oct 13 06:25:03.297000 kernel: mlx5_core 0000:01:00.0: shared_fdb:0 mode:queue_affinity Oct 13 06:25:03.365812 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Oct 13 06:25:03.455570 kubelet[2133]: E1013 06:25:03.455544 2133 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 13 06:25:03.456552 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 13 06:25:03.456635 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 13 06:25:03.456856 systemd[1]: kubelet.service: Consumed 551ms CPU time, 262.5M memory peak. Oct 13 06:25:04.923219 systemd-resolved[1572]: Clock change detected. Flushing caches. Oct 13 06:25:04.923429 systemd-timesyncd[1964]: Contacted time server 67.205.162.81:123 (0.flatcar.pool.ntp.org). Oct 13 06:25:04.923570 systemd-timesyncd[1964]: Initial clock synchronization to Mon 2025-10-13 06:25:04.923069 UTC. Oct 13 06:25:05.648282 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Oct 13 06:25:05.658161 systemd[1]: Started sshd@0-139.178.94.13:22-139.178.68.195:51490.service - OpenSSH per-connection server daemon (139.178.68.195:51490). Oct 13 06:25:05.751004 sshd[2155]: Accepted publickey for core from 139.178.68.195 port 51490 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:05.752056 sshd-session[2155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:05.755549 coreos-metadata[2059]: Oct 13 06:25:05.755 INFO Fetch successful Oct 13 06:25:05.759282 systemd-logind[2006]: New session 1 of user core. Oct 13 06:25:05.760395 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Oct 13 06:25:05.770008 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Oct 13 06:25:05.771975 coreos-metadata[1973]: Oct 13 06:25:05.771 INFO Fetch successful Oct 13 06:25:05.790094 unknown[2059]: wrote ssh authorized keys file for user: core Oct 13 06:25:05.800651 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Oct 13 06:25:05.804358 update-ssh-keys[2159]: Updated "/home/core/.ssh/authorized_keys" Oct 13 06:25:05.811353 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Oct 13 06:25:05.826007 systemd[1]: Finished sshkeys.service. Oct 13 06:25:05.835165 systemd[1]: Starting user@500.service - User Manager for UID 500... Oct 13 06:25:05.842801 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Oct 13 06:25:05.852975 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Oct 13 06:25:05.853632 (systemd)[2168]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Oct 13 06:25:05.863823 systemd-logind[2006]: New session c1 of user core. 
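
The kubelet exit above is the normal pre-bootstrap failure loop: /var/lib/kubelet/config.yaml does not exist until "kubeadm init" or "kubeadm join" writes it, so systemd keeps restarting the unit. For orientation only, a minimal KubeletConfiguration of the general shape kubeadm generates; the cluster DNS address below is a placeholder, not a value read from this host:

    # Sketch only: kubeadm normally writes this file during init/join.
    sudo mkdir -p /var/lib/kubelet
    cat <<'EOF' | sudo tee /var/lib/kubelet/config.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    clusterDomain: cluster.local
    clusterDNS:
      - 10.96.0.10
    EOF
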
Oct 13 06:25:05.963250 google_oslogin_nss_cache[1981]: oslogin_cache_refresh[1981]: Failure getting users, quitting Oct 13 06:25:05.963250 google_oslogin_nss_cache[1981]: oslogin_cache_refresh[1981]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Oct 13 06:25:05.963250 google_oslogin_nss_cache[1981]: oslogin_cache_refresh[1981]: Refreshing group entry cache Oct 13 06:25:05.963188 oslogin_cache_refresh[1981]: Failure getting users, quitting Oct 13 06:25:05.963199 oslogin_cache_refresh[1981]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Oct 13 06:25:05.963228 oslogin_cache_refresh[1981]: Refreshing group entry cache Oct 13 06:25:05.963800 google_oslogin_nss_cache[1981]: oslogin_cache_refresh[1981]: Failure getting groups, quitting Oct 13 06:25:05.963800 google_oslogin_nss_cache[1981]: oslogin_cache_refresh[1981]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Oct 13 06:25:05.963770 oslogin_cache_refresh[1981]: Failure getting groups, quitting Oct 13 06:25:05.963774 oslogin_cache_refresh[1981]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Oct 13 06:25:05.964329 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Oct 13 06:25:05.964444 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Oct 13 06:25:05.969746 systemd[2168]: Queued start job for default target default.target. Oct 13 06:25:05.983920 systemd[2168]: Created slice app.slice - User Application Slice. Oct 13 06:25:05.983934 systemd[2168]: Reached target paths.target - Paths. Oct 13 06:25:05.983954 systemd[2168]: Reached target timers.target - Timers. Oct 13 06:25:05.984583 systemd[2168]: Starting dbus.socket - D-Bus User Message Bus Socket... Oct 13 06:25:05.990315 systemd[2168]: Listening on dbus.socket - D-Bus User Message Bus Socket. Oct 13 06:25:05.990343 systemd[2168]: Reached target sockets.target - Sockets. Oct 13 06:25:05.990366 systemd[2168]: Reached target basic.target - Basic System. Oct 13 06:25:05.990387 systemd[2168]: Reached target default.target - Main User Target. Oct 13 06:25:05.990401 systemd[2168]: Startup finished in 121ms. Oct 13 06:25:05.990474 systemd[1]: Started user@500.service - User Manager for UID 500. Oct 13 06:25:06.001211 systemd[1]: Started session-1.scope - Session 1 of User core. Oct 13 06:25:06.069919 systemd[1]: Started sshd@1-139.178.94.13:22-139.178.68.195:51504.service - OpenSSH per-connection server daemon (139.178.68.195:51504). Oct 13 06:25:06.125611 sshd[2182]: Accepted publickey for core from 139.178.68.195 port 51504 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:06.126644 sshd-session[2182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:06.130495 systemd-logind[2006]: New session 2 of user core. Oct 13 06:25:06.136712 systemd[1]: Started session-2.scope - Session 2 of User core. Oct 13 06:25:06.205509 sshd[2185]: Connection closed by 139.178.68.195 port 51504 Oct 13 06:25:06.205649 sshd-session[2182]: pam_unix(sshd:session): session closed for user core Oct 13 06:25:06.227993 systemd[1]: sshd@1-139.178.94.13:22-139.178.68.195:51504.service: Deactivated successfully. Oct 13 06:25:06.229096 systemd[1]: session-2.scope: Deactivated successfully. Oct 13 06:25:06.229739 systemd-logind[2006]: Session 2 logged out. Waiting for processes to exit. 
Oct 13 06:25:06.231115 systemd[1]: Started sshd@2-139.178.94.13:22-139.178.68.195:51512.service - OpenSSH per-connection server daemon (139.178.68.195:51512). Oct 13 06:25:06.242010 systemd-logind[2006]: Removed session 2. Oct 13 06:25:06.298038 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Oct 13 06:25:06.298364 sshd[2191]: Accepted publickey for core from 139.178.68.195 port 51512 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:06.298959 sshd-session[2191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:06.309017 systemd[1]: Reached target multi-user.target - Multi-User System. Oct 13 06:25:06.317537 systemd[1]: Startup finished in 4.413s (kernel) + 21.987s (initrd) + 9.144s (userspace) = 35.546s. Oct 13 06:25:06.319749 systemd-logind[2006]: New session 3 of user core. Oct 13 06:25:06.340618 systemd[1]: Started session-3.scope - Session 3 of User core. Oct 13 06:25:06.395346 sshd[2196]: Connection closed by 139.178.68.195 port 51512 Oct 13 06:25:06.395489 sshd-session[2191]: pam_unix(sshd:session): session closed for user core Oct 13 06:25:06.397092 systemd[1]: sshd@2-139.178.94.13:22-139.178.68.195:51512.service: Deactivated successfully. Oct 13 06:25:06.397988 systemd[1]: session-3.scope: Deactivated successfully. Oct 13 06:25:06.398678 systemd-logind[2006]: Session 3 logged out. Waiting for processes to exit. Oct 13 06:25:06.399330 systemd-logind[2006]: Removed session 3. Oct 13 06:25:06.674474 login[2105]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 13 06:25:06.677550 systemd-logind[2006]: New session 4 of user core. Oct 13 06:25:06.678160 systemd[1]: Started session-4.scope - Session 4 of User core. Oct 13 06:25:06.685392 login[2104]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 13 06:25:06.688496 systemd-logind[2006]: New session 5 of user core. Oct 13 06:25:06.689033 systemd[1]: Started session-5.scope - Session 5 of User core. Oct 13 06:25:09.248632 systemd[1]: Started sshd@3-139.178.94.13:22-193.46.255.20:21860.service - OpenSSH per-connection server daemon (193.46.255.20:21860). Oct 13 06:25:10.546644 sshd-session[2230]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:12.625120 sshd[2227]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:12.971149 sshd-session[2231]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:14.106350 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Oct 13 06:25:14.109098 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:25:14.401541 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
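
The "Startup finished" line above gives only the coarse kernel/initrd/userspace split. For a finer breakdown of the roughly 35 s boot, systemd-analyze on the booted host provides it:

    systemd-analyze time             # same totals as the "Startup finished" line
    systemd-analyze blame | head     # slowest units first
    systemd-analyze critical-chain   # the dependency chain that gated boot
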
Oct 13 06:25:14.410169 (kubelet)[2240]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 13 06:25:14.453159 kubelet[2240]: E1013 06:25:14.453103 2240 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 13 06:25:14.455750 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 13 06:25:14.455848 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 13 06:25:14.456048 systemd[1]: kubelet.service: Consumed 196ms CPU time, 116.4M memory peak. Oct 13 06:25:14.658158 sshd[2227]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:15.004325 sshd-session[2256]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:16.415887 systemd[1]: Started sshd@4-139.178.94.13:22-139.178.68.195:41724.service - OpenSSH per-connection server daemon (139.178.68.195:41724). Oct 13 06:25:16.461497 sshd[2258]: Accepted publickey for core from 139.178.68.195 port 41724 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:16.462254 sshd-session[2258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:16.465800 systemd-logind[2006]: New session 6 of user core. Oct 13 06:25:16.480657 systemd[1]: Started session-6.scope - Session 6 of User core. Oct 13 06:25:16.534891 sshd[2261]: Connection closed by 139.178.68.195 port 41724 Oct 13 06:25:16.535027 sshd-session[2258]: pam_unix(sshd:session): session closed for user core Oct 13 06:25:16.545264 systemd[1]: sshd@4-139.178.94.13:22-139.178.68.195:41724.service: Deactivated successfully. Oct 13 06:25:16.546033 systemd[1]: session-6.scope: Deactivated successfully. Oct 13 06:25:16.546550 systemd-logind[2006]: Session 6 logged out. Waiting for processes to exit. Oct 13 06:25:16.547809 systemd[1]: Started sshd@5-139.178.94.13:22-139.178.68.195:41726.service - OpenSSH per-connection server daemon (139.178.68.195:41726). Oct 13 06:25:16.548140 systemd-logind[2006]: Removed session 6. Oct 13 06:25:16.582560 sshd[2267]: Accepted publickey for core from 139.178.68.195 port 41726 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:16.583310 sshd-session[2267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:16.587014 systemd-logind[2006]: New session 7 of user core. Oct 13 06:25:16.607725 systemd[1]: Started session-7.scope - Session 7 of User core. Oct 13 06:25:16.662756 sshd[2270]: Connection closed by 139.178.68.195 port 41726 Oct 13 06:25:16.663443 sshd-session[2267]: pam_unix(sshd:session): session closed for user core Oct 13 06:25:16.685525 systemd[1]: sshd@5-139.178.94.13:22-139.178.68.195:41726.service: Deactivated successfully. Oct 13 06:25:16.689355 systemd[1]: session-7.scope: Deactivated successfully. Oct 13 06:25:16.691572 systemd-logind[2006]: Session 7 logged out. Waiting for processes to exit. Oct 13 06:25:16.697185 systemd[1]: Started sshd@6-139.178.94.13:22-139.178.68.195:41728.service - OpenSSH per-connection server daemon (139.178.68.195:41728). Oct 13 06:25:16.699053 systemd-logind[2006]: Removed session 7. 
Oct 13 06:25:16.790344 sshd[2276]: Accepted publickey for core from 139.178.68.195 port 41728 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:16.791592 sshd-session[2276]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:16.797098 systemd-logind[2006]: New session 8 of user core. Oct 13 06:25:16.812696 systemd[1]: Started session-8.scope - Session 8 of User core. Oct 13 06:25:16.881861 sshd[2281]: Connection closed by 139.178.68.195 port 41728 Oct 13 06:25:16.882556 sshd-session[2276]: pam_unix(sshd:session): session closed for user core Oct 13 06:25:16.902391 systemd[1]: sshd@6-139.178.94.13:22-139.178.68.195:41728.service: Deactivated successfully. Oct 13 06:25:16.903097 systemd[1]: session-8.scope: Deactivated successfully. Oct 13 06:25:16.903625 systemd-logind[2006]: Session 8 logged out. Waiting for processes to exit. Oct 13 06:25:16.904516 systemd[1]: Started sshd@7-139.178.94.13:22-139.178.68.195:41730.service - OpenSSH per-connection server daemon (139.178.68.195:41730). Oct 13 06:25:16.905073 systemd-logind[2006]: Removed session 8. Oct 13 06:25:16.938183 sshd[2287]: Accepted publickey for core from 139.178.68.195 port 41730 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:16.938916 sshd-session[2287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:16.942316 systemd-logind[2006]: New session 9 of user core. Oct 13 06:25:16.954714 systemd[1]: Started session-9.scope - Session 9 of User core. Oct 13 06:25:17.022421 sudo[2291]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Oct 13 06:25:17.022554 sudo[2291]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 13 06:25:17.038760 sudo[2291]: pam_unix(sudo:session): session closed for user root Oct 13 06:25:17.039872 sshd[2290]: Connection closed by 139.178.68.195 port 41730 Oct 13 06:25:17.040097 sshd-session[2287]: pam_unix(sshd:session): session closed for user core Oct 13 06:25:17.051919 systemd[1]: sshd@7-139.178.94.13:22-139.178.68.195:41730.service: Deactivated successfully. Oct 13 06:25:17.052973 systemd[1]: session-9.scope: Deactivated successfully. Oct 13 06:25:17.053681 systemd-logind[2006]: Session 9 logged out. Waiting for processes to exit. Oct 13 06:25:17.055325 systemd[1]: Started sshd@8-139.178.94.13:22-139.178.68.195:49118.service - OpenSSH per-connection server daemon (139.178.68.195:49118). Oct 13 06:25:17.055885 systemd-logind[2006]: Removed session 9. Oct 13 06:25:17.091711 sshd[2297]: Accepted publickey for core from 139.178.68.195 port 49118 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:17.092521 sshd-session[2297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:17.096043 systemd-logind[2006]: New session 10 of user core. Oct 13 06:25:17.109496 systemd[1]: Started session-10.scope - Session 10 of User core. 
Oct 13 06:25:17.167727 sudo[2302]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Oct 13 06:25:17.167865 sudo[2302]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 13 06:25:17.170311 sudo[2302]: pam_unix(sudo:session): session closed for user root Oct 13 06:25:17.173486 sudo[2301]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Oct 13 06:25:17.173619 sudo[2301]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 13 06:25:17.179213 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 13 06:25:17.221922 augenrules[2324]: No rules Oct 13 06:25:17.222958 systemd[1]: audit-rules.service: Deactivated successfully. Oct 13 06:25:17.223346 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 13 06:25:17.224672 sudo[2301]: pam_unix(sudo:session): session closed for user root Oct 13 06:25:17.226793 sshd[2300]: Connection closed by 139.178.68.195 port 49118 Oct 13 06:25:17.227355 sshd-session[2297]: pam_unix(sshd:session): session closed for user core Oct 13 06:25:17.254455 systemd[1]: sshd@8-139.178.94.13:22-139.178.68.195:49118.service: Deactivated successfully. Oct 13 06:25:17.258145 systemd[1]: session-10.scope: Deactivated successfully. Oct 13 06:25:17.260447 systemd-logind[2006]: Session 10 logged out. Waiting for processes to exit. Oct 13 06:25:17.265776 systemd[1]: Started sshd@9-139.178.94.13:22-139.178.68.195:49122.service - OpenSSH per-connection server daemon (139.178.68.195:49122). Oct 13 06:25:17.267648 systemd-logind[2006]: Removed session 10. Oct 13 06:25:17.297585 sshd[2227]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:17.356103 sshd[2333]: Accepted publickey for core from 139.178.68.195 port 49122 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:25:17.357382 sshd-session[2333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:25:17.362628 systemd-logind[2006]: New session 11 of user core. Oct 13 06:25:17.375677 systemd[1]: Started session-11.scope - Session 11 of User core. Oct 13 06:25:17.447295 sudo[2337]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Oct 13 06:25:17.448027 sudo[2337]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 13 06:25:17.469582 sshd[2227]: Received disconnect from 193.46.255.20 port 21860:11: [preauth] Oct 13 06:25:17.469582 sshd[2227]: Disconnected from authenticating user root 193.46.255.20 port 21860 [preauth] Oct 13 06:25:17.474940 systemd[1]: sshd@3-139.178.94.13:22-193.46.255.20:21860.service: Deactivated successfully. Oct 13 06:25:17.649746 systemd[1]: Started sshd@10-139.178.94.13:22-193.46.255.20:21876.service - OpenSSH per-connection server daemon (193.46.255.20:21876). Oct 13 06:25:17.852732 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Oct 13 06:25:17.865609 (dockerd)[2371]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Oct 13 06:25:18.067817 dockerd[2371]: time="2025-10-13T06:25:18.067764162Z" level=info msg="Starting up" Oct 13 06:25:18.068233 dockerd[2371]: time="2025-10-13T06:25:18.068225224Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Oct 13 06:25:18.074759 dockerd[2371]: time="2025-10-13T06:25:18.074739843Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Oct 13 06:25:18.095791 dockerd[2371]: time="2025-10-13T06:25:18.095732936Z" level=info msg="Loading containers: start." Oct 13 06:25:18.109283 kernel: Initializing XFRM netlink socket Oct 13 06:25:18.287501 systemd-networkd[1778]: docker0: Link UP Oct 13 06:25:18.288803 dockerd[2371]: time="2025-10-13T06:25:18.288760122Z" level=info msg="Loading containers: done." Oct 13 06:25:18.295481 dockerd[2371]: time="2025-10-13T06:25:18.295435623Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Oct 13 06:25:18.295481 dockerd[2371]: time="2025-10-13T06:25:18.295475300Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Oct 13 06:25:18.295566 dockerd[2371]: time="2025-10-13T06:25:18.295514487Z" level=info msg="Initializing buildkit" Oct 13 06:25:18.306549 dockerd[2371]: time="2025-10-13T06:25:18.306495693Z" level=info msg="Completed buildkit initialization" Oct 13 06:25:18.309840 dockerd[2371]: time="2025-10-13T06:25:18.309796513Z" level=info msg="Daemon has completed initialization" Oct 13 06:25:18.309840 dockerd[2371]: time="2025-10-13T06:25:18.309819955Z" level=info msg="API listen on /run/docker.sock" Oct 13 06:25:18.309962 systemd[1]: Started docker.service - Docker Application Container Engine. Oct 13 06:25:18.885460 containerd[2022]: time="2025-10-13T06:25:18.885403505Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.1\"" Oct 13 06:25:18.939193 sshd-session[2607]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:19.084342 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3646442902-merged.mount: Deactivated successfully. Oct 13 06:25:19.508431 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2270430299.mount: Deactivated successfully. 
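
The dockerd startup above reports Docker 28.0.4 on the overlay2 storage driver with the containerd snapshotter disabled, plus a harmless native-diff warning. A few quick checks that confirm the same facts from a shell on the host:

    docker version --format '{{.Server.Version}}'   # 28.0.4 per the log above
    docker info --format '{{.Driver}}'              # overlay2
    systemctl status docker.service --no-pager
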
Oct 13 06:25:20.211117 containerd[2022]: time="2025-10-13T06:25:20.211054827Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:20.211346 containerd[2022]: time="2025-10-13T06:25:20.211225785Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.1: active requests=0, bytes read=27065392" Oct 13 06:25:20.211665 containerd[2022]: time="2025-10-13T06:25:20.211625365Z" level=info msg="ImageCreate event name:\"sha256:c3994bc6961024917ec0aeee02e62828108c21a52d87648e30f3080d9cbadc97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:20.212989 containerd[2022]: time="2025-10-13T06:25:20.212950813Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:b9d7c117f8ac52bed4b13aeed973dc5198f9d93a926e6fe9e0b384f155baa902\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:20.213515 containerd[2022]: time="2025-10-13T06:25:20.213471733Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.1\" with image id \"sha256:c3994bc6961024917ec0aeee02e62828108c21a52d87648e30f3080d9cbadc97\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.1\", repo digest \"registry.k8s.io/kube-apiserver@sha256:b9d7c117f8ac52bed4b13aeed973dc5198f9d93a926e6fe9e0b384f155baa902\", size \"27061991\" in 1.328041434s" Oct 13 06:25:20.213515 containerd[2022]: time="2025-10-13T06:25:20.213489537Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.1\" returns image reference \"sha256:c3994bc6961024917ec0aeee02e62828108c21a52d87648e30f3080d9cbadc97\"" Oct 13 06:25:20.213845 containerd[2022]: time="2025-10-13T06:25:20.213804286Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.1\"" Oct 13 06:25:21.209510 containerd[2022]: time="2025-10-13T06:25:21.209453958Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:21.209730 containerd[2022]: time="2025-10-13T06:25:21.209686015Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.1: active requests=0, bytes read=21159757" Oct 13 06:25:21.210052 containerd[2022]: time="2025-10-13T06:25:21.210040981Z" level=info msg="ImageCreate event name:\"sha256:c80c8dbafe7dd71fc21527912a6dd20ccd1b71f3e561a5c28337388d0619538f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:21.211284 containerd[2022]: time="2025-10-13T06:25:21.211251111Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:2bf47c1b01f51e8963bf2327390883c9fa4ed03ea1b284500a2cba17ce303e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:21.211788 containerd[2022]: time="2025-10-13T06:25:21.211747771Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.1\" with image id \"sha256:c80c8dbafe7dd71fc21527912a6dd20ccd1b71f3e561a5c28337388d0619538f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.1\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:2bf47c1b01f51e8963bf2327390883c9fa4ed03ea1b284500a2cba17ce303e89\", size \"22820214\" in 997.926117ms" Oct 13 06:25:21.211788 containerd[2022]: time="2025-10-13T06:25:21.211763022Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.1\" returns image reference \"sha256:c80c8dbafe7dd71fc21527912a6dd20ccd1b71f3e561a5c28337388d0619538f\"" Oct 13 06:25:21.212043 
containerd[2022]: time="2025-10-13T06:25:21.212009693Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.1\"" Oct 13 06:25:21.447624 sshd[2351]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:21.791824 sshd-session[2675]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:22.058947 containerd[2022]: time="2025-10-13T06:25:22.058856111Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:22.059089 containerd[2022]: time="2025-10-13T06:25:22.059049634Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.1: active requests=0, bytes read=15725093" Oct 13 06:25:22.059486 containerd[2022]: time="2025-10-13T06:25:22.059443743Z" level=info msg="ImageCreate event name:\"sha256:7dd6aaa1717ab7eaae4578503e4c4d9965fcf5a249e8155fe16379ee9b6cb813\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:22.060816 containerd[2022]: time="2025-10-13T06:25:22.060776767Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:6e9fbc4e25a576483e6a233976353a66e4d77eb5d0530e9118e94b7d46fb3500\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:22.061710 containerd[2022]: time="2025-10-13T06:25:22.061670183Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.1\" with image id \"sha256:7dd6aaa1717ab7eaae4578503e4c4d9965fcf5a249e8155fe16379ee9b6cb813\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.1\", repo digest \"registry.k8s.io/kube-scheduler@sha256:6e9fbc4e25a576483e6a233976353a66e4d77eb5d0530e9118e94b7d46fb3500\", size \"17385568\" in 849.645471ms" Oct 13 06:25:22.061710 containerd[2022]: time="2025-10-13T06:25:22.061685058Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.1\" returns image reference \"sha256:7dd6aaa1717ab7eaae4578503e4c4d9965fcf5a249e8155fe16379ee9b6cb813\"" Oct 13 06:25:22.061998 containerd[2022]: time="2025-10-13T06:25:22.061946299Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.1\"" Oct 13 06:25:22.960702 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2771072730.mount: Deactivated successfully. 
Oct 13 06:25:23.099793 containerd[2022]: time="2025-10-13T06:25:23.099742341Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:23.100011 containerd[2022]: time="2025-10-13T06:25:23.099971350Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.1: active requests=0, bytes read=25964699" Oct 13 06:25:23.100349 containerd[2022]: time="2025-10-13T06:25:23.100312955Z" level=info msg="ImageCreate event name:\"sha256:fc25172553d79197ecd840ec8dba1fba68330079355e974b04c1a441e6a4a0b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:23.101218 containerd[2022]: time="2025-10-13T06:25:23.101182549Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:913cc83ca0b5588a81d86ce8eedeb3ed1e9c1326e81852a1ea4f622b74ff749a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:23.101418 containerd[2022]: time="2025-10-13T06:25:23.101382317Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.1\" with image id \"sha256:fc25172553d79197ecd840ec8dba1fba68330079355e974b04c1a441e6a4a0b7\", repo tag \"registry.k8s.io/kube-proxy:v1.34.1\", repo digest \"registry.k8s.io/kube-proxy@sha256:913cc83ca0b5588a81d86ce8eedeb3ed1e9c1326e81852a1ea4f622b74ff749a\", size \"25963718\" in 1.039421465s" Oct 13 06:25:23.101418 containerd[2022]: time="2025-10-13T06:25:23.101398650Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.1\" returns image reference \"sha256:fc25172553d79197ecd840ec8dba1fba68330079355e974b04c1a441e6a4a0b7\"" Oct 13 06:25:23.101705 containerd[2022]: time="2025-10-13T06:25:23.101667754Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Oct 13 06:25:23.377905 sshd[2351]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:23.631683 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1297098447.mount: Deactivated successfully. 
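
These image pulls go through containerd's CRI plugin rather than through Docker, so they are visible to crictl but not to "docker images". Pointing crictl at the containerd socket named in the CRI config dump earlier in this log:

    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock images
    # Re-pulling one of the images from the log above by hand:
    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock \
        pull registry.k8s.io/kube-proxy:v1.34.1
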
Oct 13 06:25:23.720518 sshd-session[2688]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:24.216996 containerd[2022]: time="2025-10-13T06:25:24.216941060Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:24.217209 containerd[2022]: time="2025-10-13T06:25:24.217129659Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388007" Oct 13 06:25:24.217554 containerd[2022]: time="2025-10-13T06:25:24.217511403Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:24.218989 containerd[2022]: time="2025-10-13T06:25:24.218946741Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:24.219568 containerd[2022]: time="2025-10-13T06:25:24.219525045Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.117841045s" Oct 13 06:25:24.219568 containerd[2022]: time="2025-10-13T06:25:24.219541720Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Oct 13 06:25:24.219869 containerd[2022]: time="2025-10-13T06:25:24.219830331Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Oct 13 06:25:24.604840 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Oct 13 06:25:24.605921 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:25:24.881964 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:25:24.884029 (kubelet)[2749]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 13 06:25:24.908508 kubelet[2749]: E1013 06:25:24.908439 2749 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 13 06:25:24.909560 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 13 06:25:24.909638 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 13 06:25:24.909818 systemd[1]: kubelet.service: Consumed 105ms CPU time, 113.3M memory peak. Oct 13 06:25:24.921800 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount987687161.mount: Deactivated successfully. 
Oct 13 06:25:24.922898 containerd[2022]: time="2025-10-13T06:25:24.922879540Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:24.923081 containerd[2022]: time="2025-10-13T06:25:24.923054522Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321218" Oct 13 06:25:24.923542 containerd[2022]: time="2025-10-13T06:25:24.923529023Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:24.924538 containerd[2022]: time="2025-10-13T06:25:24.924499090Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:24.925150 containerd[2022]: time="2025-10-13T06:25:24.925116511Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 705.273002ms" Oct 13 06:25:24.925150 containerd[2022]: time="2025-10-13T06:25:24.925129474Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Oct 13 06:25:24.925581 containerd[2022]: time="2025-10-13T06:25:24.925532324Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Oct 13 06:25:25.251177 sshd[2351]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:25.422737 sshd[2351]: Received disconnect from 193.46.255.20 port 21876:11: [preauth] Oct 13 06:25:25.422737 sshd[2351]: Disconnected from authenticating user root 193.46.255.20 port 21876 [preauth] Oct 13 06:25:25.427350 systemd[1]: sshd@10-139.178.94.13:22-193.46.255.20:21876.service: Deactivated successfully. Oct 13 06:25:25.611804 systemd[1]: Started sshd@11-139.178.94.13:22-193.46.255.20:23518.service - OpenSSH per-connection server daemon (193.46.255.20:23518). 
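
The repeated "PAM: Permission denied for root" and "authentication failure ... user=root" lines are an external host (193.46.255.20) brute-forcing root over SSH and being rejected. If the policy should be made explicit, a drop-in like the following is the usual step, assuming this sshd build includes /etc/ssh/sshd_config.d/ (recent OpenSSH does by default); because connections here are handled by per-connection sshd@ units, new sessions pick the setting up without a daemon reload:

    cat <<'EOF' | sudo tee /etc/ssh/sshd_config.d/90-no-root-login.conf
    PermitRootLogin no
    PasswordAuthentication no
    EOF
    sudo sshd -t    # syntax-check the combined configuration
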
Oct 13 06:25:26.865370 containerd[2022]: time="2025-10-13T06:25:26.865314476Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:26.865596 containerd[2022]: time="2025-10-13T06:25:26.865533018Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=73514593" Oct 13 06:25:26.865875 containerd[2022]: time="2025-10-13T06:25:26.865839757Z" level=info msg="ImageCreate event name:\"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:26.867291 containerd[2022]: time="2025-10-13T06:25:26.867261040Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:26.867954 containerd[2022]: time="2025-10-13T06:25:26.867908855Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"74311308\" in 1.942346388s" Oct 13 06:25:26.867954 containerd[2022]: time="2025-10-13T06:25:26.867931170Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\"" Oct 13 06:25:26.878772 sshd-session[2817]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:28.820593 sshd[2810]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:28.953761 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:25:28.953865 systemd[1]: kubelet.service: Consumed 105ms CPU time, 113.3M memory peak. Oct 13 06:25:28.955120 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:25:28.969613 systemd[1]: Reload requested from client PID 2869 ('systemctl') (unit session-11.scope)... Oct 13 06:25:28.969620 systemd[1]: Reloading... Oct 13 06:25:29.009250 zram_generator::config[2917]: No configuration found. Oct 13 06:25:29.157920 systemd[1]: Reloading finished in 188 ms. Oct 13 06:25:29.164333 sshd-session[2893]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:29.208860 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Oct 13 06:25:29.208903 systemd[1]: kubelet.service: Failed with result 'signal'. Oct 13 06:25:29.209032 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:25:29.210181 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:25:29.487143 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:25:29.513440 (kubelet)[2983]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 13 06:25:29.533981 kubelet[2983]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
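
Unlike the earlier attempts, this kubelet run (PID 2983) finds its config and only warns about the unset KUBELET_EXTRA_ARGS variable and deprecated flags. Those flags normally come from environment files sourced by the kubeadm systemd drop-in; inspecting them shows where each argument originates (the kubeadm-flags.env path is the conventional kubeadm location, assumed here rather than read from this host):

    systemctl cat kubelet.service               # unit file plus any drop-ins
    cat /var/lib/kubelet/kubeadm-flags.env      # usually defines KUBELET_KUBEADM_ARGS
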
Oct 13 06:25:29.533981 kubelet[2983]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 06:25:29.534214 kubelet[2983]: I1013 06:25:29.533979 2983 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 13 06:25:29.716135 kubelet[2983]: I1013 06:25:29.716096 2983 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Oct 13 06:25:29.716135 kubelet[2983]: I1013 06:25:29.716105 2983 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 13 06:25:29.716891 kubelet[2983]: I1013 06:25:29.716861 2983 watchdog_linux.go:95] "Systemd watchdog is not enabled" Oct 13 06:25:29.716891 kubelet[2983]: I1013 06:25:29.716886 2983 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Oct 13 06:25:29.717043 kubelet[2983]: I1013 06:25:29.717008 2983 server.go:956] "Client rotation is on, will bootstrap in background" Oct 13 06:25:29.720380 kubelet[2983]: E1013 06:25:29.720310 2983 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://139.178.94.13:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.94.13:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Oct 13 06:25:29.720588 kubelet[2983]: I1013 06:25:29.720550 2983 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 13 06:25:29.722756 kubelet[2983]: I1013 06:25:29.722748 2983 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 13 06:25:29.730943 kubelet[2983]: I1013 06:25:29.730906 2983 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Oct 13 06:25:29.731534 kubelet[2983]: I1013 06:25:29.731495 2983 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 13 06:25:29.731624 kubelet[2983]: I1013 06:25:29.731507 2983 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4487.0.0-a-becc29ce89","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 13 06:25:29.731624 kubelet[2983]: I1013 06:25:29.731593 2983 topology_manager.go:138] "Creating topology manager with none policy" Oct 13 06:25:29.731624 kubelet[2983]: I1013 06:25:29.731598 2983 container_manager_linux.go:306] "Creating device plugin manager" Oct 13 06:25:29.731723 kubelet[2983]: I1013 06:25:29.731641 2983 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Oct 13 06:25:29.732908 kubelet[2983]: I1013 06:25:29.732887 2983 state_mem.go:36] "Initialized new in-memory state store" Oct 13 06:25:29.733318 kubelet[2983]: I1013 06:25:29.733128 2983 kubelet.go:475] "Attempting to sync node with API server" Oct 13 06:25:29.733318 kubelet[2983]: I1013 06:25:29.733153 2983 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 13 06:25:29.733318 kubelet[2983]: I1013 06:25:29.733190 2983 kubelet.go:387] "Adding apiserver pod source" Oct 13 06:25:29.733318 kubelet[2983]: I1013 06:25:29.733230 2983 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 13 06:25:29.733794 kubelet[2983]: E1013 06:25:29.733781 2983 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.94.13:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4487.0.0-a-becc29ce89&limit=500&resourceVersion=0\": dial tcp 139.178.94.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Oct 13 06:25:29.733794 kubelet[2983]: E1013 06:25:29.733782 2983 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://139.178.94.13:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.94.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Oct 13 06:25:29.735784 kubelet[2983]: I1013 06:25:29.735741 2983 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 13 06:25:29.736108 kubelet[2983]: I1013 06:25:29.736078 2983 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Oct 13 06:25:29.736136 kubelet[2983]: I1013 06:25:29.736111 2983 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Oct 13 06:25:29.736136 kubelet[2983]: W1013 06:25:29.736134 2983 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Oct 13 06:25:29.738195 kubelet[2983]: I1013 06:25:29.738158 2983 server.go:1262] "Started kubelet" Oct 13 06:25:29.738242 kubelet[2983]: I1013 06:25:29.738221 2983 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Oct 13 06:25:29.738290 kubelet[2983]: I1013 06:25:29.738233 2983 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 13 06:25:29.738318 kubelet[2983]: I1013 06:25:29.738308 2983 server_v1.go:49] "podresources" method="list" useActivePods=true Oct 13 06:25:29.738614 kubelet[2983]: I1013 06:25:29.738602 2983 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 13 06:25:29.738798 kubelet[2983]: I1013 06:25:29.738789 2983 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 13 06:25:29.738826 kubelet[2983]: I1013 06:25:29.738814 2983 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 13 06:25:29.738842 kubelet[2983]: I1013 06:25:29.738831 2983 volume_manager.go:313] "Starting Kubelet Volume Manager" Oct 13 06:25:29.738901 kubelet[2983]: I1013 06:25:29.738895 2983 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 13 06:25:29.738935 kubelet[2983]: E1013 06:25:29.738926 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:29.738963 kubelet[2983]: I1013 06:25:29.738949 2983 reconciler.go:29] "Reconciler: start to sync state" Oct 13 06:25:29.739143 kubelet[2983]: I1013 06:25:29.739134 2983 server.go:310] "Adding debug handlers to kubelet server" Oct 13 06:25:29.739188 kubelet[2983]: E1013 06:25:29.739158 2983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.13:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4487.0.0-a-becc29ce89?timeout=10s\": dial tcp 139.178.94.13:6443: connect: connection refused" interval="200ms" Oct 13 06:25:29.739246 kubelet[2983]: E1013 06:25:29.739130 2983 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.94.13:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.94.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.CSIDriver" Oct 13 06:25:29.739277 kubelet[2983]: I1013 06:25:29.739263 2983 factory.go:223] Registration of the systemd container factory successfully Oct 13 06:25:29.739322 kubelet[2983]: E1013 06:25:29.739312 2983 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Oct 13 06:25:29.739322 kubelet[2983]: I1013 06:25:29.739310 2983 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 13 06:25:29.742211 kubelet[2983]: I1013 06:25:29.742192 2983 factory.go:223] Registration of the containerd container factory successfully Oct 13 06:25:29.743980 kubelet[2983]: E1013 06:25:29.742871 2983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.94.13:6443/api/v1/namespaces/default/events\": dial tcp 139.178.94.13:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4487.0.0-a-becc29ce89.186df8edd51d6b7f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4487.0.0-a-becc29ce89,UID:ci-4487.0.0-a-becc29ce89,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4487.0.0-a-becc29ce89,},FirstTimestamp:2025-10-13 06:25:29.738144639 +0000 UTC m=+0.222823036,LastTimestamp:2025-10-13 06:25:29.738144639 +0000 UTC m=+0.222823036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4487.0.0-a-becc29ce89,}" Oct 13 06:25:29.748433 kubelet[2983]: I1013 06:25:29.748423 2983 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 13 06:25:29.748433 kubelet[2983]: I1013 06:25:29.748431 2983 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 13 06:25:29.748541 kubelet[2983]: I1013 06:25:29.748441 2983 state_mem.go:36] "Initialized new in-memory state store" Oct 13 06:25:29.749395 kubelet[2983]: I1013 06:25:29.749388 2983 policy_none.go:49] "None policy: Start" Oct 13 06:25:29.749423 kubelet[2983]: I1013 06:25:29.749397 2983 memory_manager.go:187] "Starting memorymanager" policy="None" Oct 13 06:25:29.749423 kubelet[2983]: I1013 06:25:29.749403 2983 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Oct 13 06:25:29.749937 kubelet[2983]: I1013 06:25:29.749931 2983 policy_none.go:47] "Start" Oct 13 06:25:29.751200 kubelet[2983]: I1013 06:25:29.751187 2983 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Oct 13 06:25:29.751730 kubelet[2983]: I1013 06:25:29.751721 2983 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Oct 13 06:25:29.751763 kubelet[2983]: I1013 06:25:29.751733 2983 status_manager.go:244] "Starting to sync pod status with apiserver" Oct 13 06:25:29.751763 kubelet[2983]: I1013 06:25:29.751750 2983 kubelet.go:2427] "Starting kubelet main sync loop" Oct 13 06:25:29.751816 kubelet[2983]: E1013 06:25:29.751777 2983 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 13 06:25:29.751854 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Oct 13 06:25:29.752079 kubelet[2983]: E1013 06:25:29.752064 2983 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.94.13:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.94.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Oct 13 06:25:29.770028 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Oct 13 06:25:29.772012 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Oct 13 06:25:29.781955 kubelet[2983]: E1013 06:25:29.781912 2983 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Oct 13 06:25:29.782058 kubelet[2983]: I1013 06:25:29.782046 2983 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 13 06:25:29.782110 kubelet[2983]: I1013 06:25:29.782056 2983 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 13 06:25:29.782194 kubelet[2983]: I1013 06:25:29.782184 2983 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 13 06:25:29.782621 kubelet[2983]: E1013 06:25:29.782573 2983 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Oct 13 06:25:29.782621 kubelet[2983]: E1013 06:25:29.782596 2983 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:29.862330 systemd[1]: Created slice kubepods-burstable-podf5cbea7c8ee68c74c150ca61efdb0b33.slice - libcontainer container kubepods-burstable-podf5cbea7c8ee68c74c150ca61efdb0b33.slice. Oct 13 06:25:29.883790 kubelet[2983]: I1013 06:25:29.883743 2983 kubelet_node_status.go:75] "Attempting to register node" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:29.884002 kubelet[2983]: E1013 06:25:29.883960 2983 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.13:6443/api/v1/nodes\": dial tcp 139.178.94.13:6443: connect: connection refused" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:29.891285 kubelet[2983]: E1013 06:25:29.891241 2983 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:29.893372 systemd[1]: Created slice kubepods-burstable-podfee0fd67d59afcfa4f9efd0c6357c6a9.slice - libcontainer container kubepods-burstable-podfee0fd67d59afcfa4f9efd0c6357c6a9.slice. Oct 13 06:25:29.907338 kubelet[2983]: E1013 06:25:29.907298 2983 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:29.909686 systemd[1]: Created slice kubepods-burstable-podac50836ebba401cb4102e7bef9eb0483.slice - libcontainer container kubepods-burstable-podac50836ebba401cb4102e7bef9eb0483.slice. 
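
The "connection refused" errors against https://139.178.94.13:6443 are expected at this stage: the kubelet itself has to start the control plane from the static pod manifests under /etc/kubernetes/manifests (the static pod path it logged above), and the slices created here correspond to those static pods. Watching the bootstrap complete from the host:

    ls /etc/kubernetes/manifests    # kubeadm places the control-plane manifests here
    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock ps
    # Any HTTP response, even 401/403, shows the API server is finally listening:
    curl -k https://139.178.94.13:6443/healthz
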
Oct 13 06:25:29.911163 kubelet[2983]: E1013 06:25:29.911123 2983 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:29.939921 kubelet[2983]: I1013 06:25:29.939810 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f5cbea7c8ee68c74c150ca61efdb0b33-kubeconfig\") pod \"kube-scheduler-ci-4487.0.0-a-becc29ce89\" (UID: \"f5cbea7c8ee68c74c150ca61efdb0b33\") " pod="kube-system/kube-scheduler-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:29.940435 kubelet[2983]: E1013 06:25:29.940308 2983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.13:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4487.0.0-a-becc29ce89?timeout=10s\": dial tcp 139.178.94.13:6443: connect: connection refused" interval="400ms" Oct 13 06:25:30.040442 kubelet[2983]: I1013 06:25:30.040198 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-flexvolume-dir\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.040442 kubelet[2983]: I1013 06:25:30.040353 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fee0fd67d59afcfa4f9efd0c6357c6a9-ca-certs\") pod \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" (UID: \"fee0fd67d59afcfa4f9efd0c6357c6a9\") " pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.040442 kubelet[2983]: I1013 06:25:30.040422 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fee0fd67d59afcfa4f9efd0c6357c6a9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" (UID: \"fee0fd67d59afcfa4f9efd0c6357c6a9\") " pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.040812 kubelet[2983]: I1013 06:25:30.040469 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-ca-certs\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.040812 kubelet[2983]: I1013 06:25:30.040541 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-k8s-certs\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.040812 kubelet[2983]: I1013 06:25:30.040583 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-kubeconfig\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " 
pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.040812 kubelet[2983]: I1013 06:25:30.040626 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.040812 kubelet[2983]: I1013 06:25:30.040721 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fee0fd67d59afcfa4f9efd0c6357c6a9-k8s-certs\") pod \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" (UID: \"fee0fd67d59afcfa4f9efd0c6357c6a9\") " pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.087936 kubelet[2983]: I1013 06:25:30.087830 2983 kubelet_node_status.go:75] "Attempting to register node" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.088691 kubelet[2983]: E1013 06:25:30.088583 2983 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.13:6443/api/v1/nodes\": dial tcp 139.178.94.13:6443: connect: connection refused" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.195599 containerd[2022]: time="2025-10-13T06:25:30.195546617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4487.0.0-a-becc29ce89,Uid:f5cbea7c8ee68c74c150ca61efdb0b33,Namespace:kube-system,Attempt:0,}" Oct 13 06:25:30.208646 containerd[2022]: time="2025-10-13T06:25:30.208630473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4487.0.0-a-becc29ce89,Uid:fee0fd67d59afcfa4f9efd0c6357c6a9,Namespace:kube-system,Attempt:0,}" Oct 13 06:25:30.212748 containerd[2022]: time="2025-10-13T06:25:30.212730268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4487.0.0-a-becc29ce89,Uid:ac50836ebba401cb4102e7bef9eb0483,Namespace:kube-system,Attempt:0,}" Oct 13 06:25:30.342190 kubelet[2983]: E1013 06:25:30.341928 2983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.13:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4487.0.0-a-becc29ce89?timeout=10s\": dial tcp 139.178.94.13:6443: connect: connection refused" interval="800ms" Oct 13 06:25:30.491508 kubelet[2983]: I1013 06:25:30.491476 2983 kubelet_node_status.go:75] "Attempting to register node" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.491834 kubelet[2983]: E1013 06:25:30.491807 2983 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.13:6443/api/v1/nodes\": dial tcp 139.178.94.13:6443: connect: connection refused" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:30.719311 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1668332154.mount: Deactivated successfully. 
Oct 13 06:25:30.720812 containerd[2022]: time="2025-10-13T06:25:30.720795157Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 13 06:25:30.721861 containerd[2022]: time="2025-10-13T06:25:30.721832443Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Oct 13 06:25:30.722205 containerd[2022]: time="2025-10-13T06:25:30.722192913Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 13 06:25:30.722855 containerd[2022]: time="2025-10-13T06:25:30.722841678Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 13 06:25:30.722955 containerd[2022]: time="2025-10-13T06:25:30.722944434Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Oct 13 06:25:30.723444 containerd[2022]: time="2025-10-13T06:25:30.723430693Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 13 06:25:30.723497 containerd[2022]: time="2025-10-13T06:25:30.723486912Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Oct 13 06:25:30.724714 containerd[2022]: time="2025-10-13T06:25:30.724702767Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 527.754583ms" Oct 13 06:25:30.724910 containerd[2022]: time="2025-10-13T06:25:30.724900079Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 13 06:25:30.725933 containerd[2022]: time="2025-10-13T06:25:30.725919328Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 516.224693ms" Oct 13 06:25:30.726418 containerd[2022]: time="2025-10-13T06:25:30.726406935Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 512.816764ms" Oct 13 06:25:30.733296 containerd[2022]: time="2025-10-13T06:25:30.733240988Z" level=info msg="connecting to shim eaa710c21edd0688dd875a2068a97e92e532ded3ba744780572f599a7cd83c85" address="unix:///run/containerd/s/b6ee974b63220aa9df528e69a8a3ebe75be31dff1ff500e6518a363857a3e8ea" namespace=k8s.io protocol=ttrpc version=3 Oct 
13 06:25:30.734482 containerd[2022]: time="2025-10-13T06:25:30.734450686Z" level=info msg="connecting to shim f697c50be01fcd626705c1b957ee2ae490c7375f14b20bba4816a467dc07be8a" address="unix:///run/containerd/s/de3f19401e5c69eeed72927a3e58966fa7b1fb8681d214417ea0083c0a33e8d2" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:25:30.734610 containerd[2022]: time="2025-10-13T06:25:30.734529364Z" level=info msg="connecting to shim ca8d8218fcd459a51df3f602959450678068b9e81c85bac0601a2ea36ac16298" address="unix:///run/containerd/s/776b42806e55745e3822a535af392265056a30371ff4af0b278032a0d0283928" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:25:30.742651 kubelet[2983]: E1013 06:25:30.742632 2983 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.94.13:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4487.0.0-a-becc29ce89&limit=500&resourceVersion=0\": dial tcp 139.178.94.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Oct 13 06:25:30.752561 systemd[1]: Started cri-containerd-ca8d8218fcd459a51df3f602959450678068b9e81c85bac0601a2ea36ac16298.scope - libcontainer container ca8d8218fcd459a51df3f602959450678068b9e81c85bac0601a2ea36ac16298. Oct 13 06:25:30.753455 systemd[1]: Started cri-containerd-eaa710c21edd0688dd875a2068a97e92e532ded3ba744780572f599a7cd83c85.scope - libcontainer container eaa710c21edd0688dd875a2068a97e92e532ded3ba744780572f599a7cd83c85. Oct 13 06:25:30.754282 systemd[1]: Started cri-containerd-f697c50be01fcd626705c1b957ee2ae490c7375f14b20bba4816a467dc07be8a.scope - libcontainer container f697c50be01fcd626705c1b957ee2ae490c7375f14b20bba4816a467dc07be8a. Oct 13 06:25:30.779420 containerd[2022]: time="2025-10-13T06:25:30.779392644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4487.0.0-a-becc29ce89,Uid:ac50836ebba401cb4102e7bef9eb0483,Namespace:kube-system,Attempt:0,} returns sandbox id \"f697c50be01fcd626705c1b957ee2ae490c7375f14b20bba4816a467dc07be8a\"" Oct 13 06:25:30.780246 containerd[2022]: time="2025-10-13T06:25:30.780223553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4487.0.0-a-becc29ce89,Uid:f5cbea7c8ee68c74c150ca61efdb0b33,Namespace:kube-system,Attempt:0,} returns sandbox id \"eaa710c21edd0688dd875a2068a97e92e532ded3ba744780572f599a7cd83c85\"" Oct 13 06:25:30.780478 containerd[2022]: time="2025-10-13T06:25:30.780467924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4487.0.0-a-becc29ce89,Uid:fee0fd67d59afcfa4f9efd0c6357c6a9,Namespace:kube-system,Attempt:0,} returns sandbox id \"ca8d8218fcd459a51df3f602959450678068b9e81c85bac0601a2ea36ac16298\"" Oct 13 06:25:30.781590 containerd[2022]: time="2025-10-13T06:25:30.781576988Z" level=info msg="CreateContainer within sandbox \"f697c50be01fcd626705c1b957ee2ae490c7375f14b20bba4816a467dc07be8a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Oct 13 06:25:30.781990 containerd[2022]: time="2025-10-13T06:25:30.781978351Z" level=info msg="CreateContainer within sandbox \"eaa710c21edd0688dd875a2068a97e92e532ded3ba744780572f599a7cd83c85\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Oct 13 06:25:30.782648 containerd[2022]: time="2025-10-13T06:25:30.782634568Z" level=info msg="CreateContainer within sandbox \"ca8d8218fcd459a51df3f602959450678068b9e81c85bac0601a2ea36ac16298\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Oct 13 06:25:30.786042 
containerd[2022]: time="2025-10-13T06:25:30.786019852Z" level=info msg="Container 80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:25:30.786750 containerd[2022]: time="2025-10-13T06:25:30.786726056Z" level=info msg="Container 9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:25:30.787598 containerd[2022]: time="2025-10-13T06:25:30.787586703Z" level=info msg="Container 9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:25:30.790199 containerd[2022]: time="2025-10-13T06:25:30.790187152Z" level=info msg="CreateContainer within sandbox \"ca8d8218fcd459a51df3f602959450678068b9e81c85bac0601a2ea36ac16298\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa\"" Oct 13 06:25:30.790575 containerd[2022]: time="2025-10-13T06:25:30.790565743Z" level=info msg="StartContainer for \"9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa\"" Oct 13 06:25:30.790743 containerd[2022]: time="2025-10-13T06:25:30.790731751Z" level=info msg="CreateContainer within sandbox \"f697c50be01fcd626705c1b957ee2ae490c7375f14b20bba4816a467dc07be8a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096\"" Oct 13 06:25:30.790885 containerd[2022]: time="2025-10-13T06:25:30.790873372Z" level=info msg="StartContainer for \"80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096\"" Oct 13 06:25:30.791085 containerd[2022]: time="2025-10-13T06:25:30.791073985Z" level=info msg="CreateContainer within sandbox \"eaa710c21edd0688dd875a2068a97e92e532ded3ba744780572f599a7cd83c85\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf\"" Oct 13 06:25:30.791211 containerd[2022]: time="2025-10-13T06:25:30.791203040Z" level=info msg="StartContainer for \"9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf\"" Oct 13 06:25:30.791296 containerd[2022]: time="2025-10-13T06:25:30.791258571Z" level=info msg="connecting to shim 9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa" address="unix:///run/containerd/s/776b42806e55745e3822a535af392265056a30371ff4af0b278032a0d0283928" protocol=ttrpc version=3 Oct 13 06:25:30.791618 containerd[2022]: time="2025-10-13T06:25:30.791582002Z" level=info msg="connecting to shim 80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096" address="unix:///run/containerd/s/de3f19401e5c69eeed72927a3e58966fa7b1fb8681d214417ea0083c0a33e8d2" protocol=ttrpc version=3 Oct 13 06:25:30.791745 containerd[2022]: time="2025-10-13T06:25:30.791708663Z" level=info msg="connecting to shim 9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf" address="unix:///run/containerd/s/b6ee974b63220aa9df528e69a8a3ebe75be31dff1ff500e6518a363857a3e8ea" protocol=ttrpc version=3 Oct 13 06:25:30.809555 systemd[1]: Started cri-containerd-80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096.scope - libcontainer container 80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096. 
Oct 13 06:25:30.810275 systemd[1]: Started cri-containerd-9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf.scope - libcontainer container 9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf. Oct 13 06:25:30.810968 systemd[1]: Started cri-containerd-9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa.scope - libcontainer container 9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa. Oct 13 06:25:30.838339 containerd[2022]: time="2025-10-13T06:25:30.838316947Z" level=info msg="StartContainer for \"80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096\" returns successfully" Oct 13 06:25:30.847541 containerd[2022]: time="2025-10-13T06:25:30.847505457Z" level=info msg="StartContainer for \"9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf\" returns successfully" Oct 13 06:25:30.847541 containerd[2022]: time="2025-10-13T06:25:30.847541270Z" level=info msg="StartContainer for \"9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa\" returns successfully" Oct 13 06:25:30.859408 kubelet[2983]: E1013 06:25:30.859361 2983 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.94.13:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.94.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Oct 13 06:25:31.045510 sshd[2810]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:31.295894 kubelet[2983]: I1013 06:25:31.295412 2983 kubelet_node_status.go:75] "Attempting to register node" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:31.387407 kubelet[2983]: E1013 06:25:31.387376 2983 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:31.389138 sshd-session[3306]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20 user=root Oct 13 06:25:31.491530 kubelet[2983]: I1013 06:25:31.491512 2983 kubelet_node_status.go:78] "Successfully registered node" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:31.491530 kubelet[2983]: E1013 06:25:31.491533 2983 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4487.0.0-a-becc29ce89\": node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:31.495688 kubelet[2983]: E1013 06:25:31.495672 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:31.596357 kubelet[2983]: E1013 06:25:31.596281 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:31.696988 kubelet[2983]: E1013 06:25:31.696937 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:31.758029 kubelet[2983]: E1013 06:25:31.758015 2983 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:31.758526 kubelet[2983]: E1013 06:25:31.758519 2983 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:31.759262 kubelet[2983]: 
E1013 06:25:31.759254 2983 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:31.797561 kubelet[2983]: E1013 06:25:31.797518 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:31.898240 kubelet[2983]: E1013 06:25:31.898193 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:31.998452 kubelet[2983]: E1013 06:25:31.998347 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.098609 kubelet[2983]: E1013 06:25:32.098479 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.199551 kubelet[2983]: E1013 06:25:32.199330 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.300200 kubelet[2983]: E1013 06:25:32.300084 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.401306 kubelet[2983]: E1013 06:25:32.401209 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.502440 kubelet[2983]: E1013 06:25:32.502278 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.603224 kubelet[2983]: E1013 06:25:32.603124 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.704352 kubelet[2983]: E1013 06:25:32.704273 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.766743 kubelet[2983]: E1013 06:25:32.766590 2983 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:32.767504 kubelet[2983]: E1013 06:25:32.766881 2983 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4487.0.0-a-becc29ce89\" not found" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:32.805554 kubelet[2983]: E1013 06:25:32.805477 2983 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:32.939941 kubelet[2983]: I1013 06:25:32.939838 2983 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:32.950969 kubelet[2983]: I1013 06:25:32.950926 2983 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:32.951209 kubelet[2983]: I1013 06:25:32.951178 2983 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:32.957038 kubelet[2983]: I1013 06:25:32.956959 2983 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a 
DNS label is recommended: [must not contain dots]" Oct 13 06:25:32.957258 kubelet[2983]: I1013 06:25:32.957128 2983 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:32.963566 kubelet[2983]: I1013 06:25:32.963485 2983 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:33.241172 kubelet[2983]: I1013 06:25:33.241128 2983 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:33.248001 kubelet[2983]: I1013 06:25:33.247936 2983 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:33.248201 kubelet[2983]: E1013 06:25:33.248040 2983 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" already exists" pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:33.682630 sshd[2810]: PAM: Permission denied for root from 193.46.255.20 Oct 13 06:25:33.735800 kubelet[2983]: I1013 06:25:33.735700 2983 apiserver.go:52] "Watching apiserver" Oct 13 06:25:33.765480 kubelet[2983]: I1013 06:25:33.765396 2983 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:33.773214 kubelet[2983]: I1013 06:25:33.773123 2983 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:33.773910 kubelet[2983]: E1013 06:25:33.773229 2983 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" already exists" pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:33.840221 kubelet[2983]: I1013 06:25:33.840109 2983 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 13 06:25:33.855270 sshd[2810]: Received disconnect from 193.46.255.20 port 23518:11: [preauth] Oct 13 06:25:33.855270 sshd[2810]: Disconnected from authenticating user root 193.46.255.20 port 23518 [preauth] Oct 13 06:25:33.859574 systemd[1]: sshd@11-139.178.94.13:22-193.46.255.20:23518.service: Deactivated successfully. Oct 13 06:25:33.945443 systemd[1]: Reload requested from client PID 3316 ('systemctl') (unit session-11.scope)... Oct 13 06:25:33.945450 systemd[1]: Reloading... Oct 13 06:25:33.984329 zram_generator::config[3362]: No configuration found. Oct 13 06:25:34.142187 systemd[1]: Reloading finished in 196 ms. Oct 13 06:25:34.170574 kubelet[2983]: I1013 06:25:34.170508 2983 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 13 06:25:34.170539 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:25:34.188627 systemd[1]: kubelet.service: Deactivated successfully. Oct 13 06:25:34.188747 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:25:34.188771 systemd[1]: kubelet.service: Consumed 615ms CPU time, 131.2M memory peak. Oct 13 06:25:34.190004 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:25:34.495255 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
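The "Failed creating a mirror pod ... already exists" entries are benign: the mirror pods for the static manifests are already present in the API from the earlier create, so the second attempt collides. A hedged client-go sketch of that create-if-absent pattern follows, treating AlreadyExists as a no-op; the pod spec and the kubeconfig path are placeholders for illustration, not the kubelet's real mirror-pod construction.

```go
// mirrorpod.go — a sketch of create-if-absent with AlreadyExists treated as
// success, the condition logged above. Spec and kubeconfig path are placeholders.
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf") // assumed path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Placeholder spec: just enough to demonstrate the AlreadyExists handling.
	pod := &corev1.Pod{
		ObjectMeta: metav1.ObjectMeta{
			Name:      "kube-controller-manager-ci-4487.0.0-a-becc29ce89",
			Namespace: "kube-system",
		},
		Spec: corev1.PodSpec{
			Containers: []corev1.Container{{Name: "kube-controller-manager", Image: "registry.k8s.io/pause:3.10"}},
		},
	}
	_, err = cs.CoreV1().Pods(pod.Namespace).Create(context.Background(), pod, metav1.CreateOptions{})
	switch {
	case apierrors.IsAlreadyExists(err):
		fmt.Println("mirror pod already exists, nothing to do")
	case err != nil:
		panic(err)
	default:
		fmt.Println("mirror pod created")
	}
}
```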
Oct 13 06:25:34.501431 (kubelet)[3426]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 13 06:25:34.539567 kubelet[3426]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Oct 13 06:25:34.539567 kubelet[3426]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 06:25:34.539890 kubelet[3426]: I1013 06:25:34.539602 3426 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 13 06:25:34.546522 kubelet[3426]: I1013 06:25:34.546496 3426 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Oct 13 06:25:34.546522 kubelet[3426]: I1013 06:25:34.546517 3426 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 13 06:25:34.546726 kubelet[3426]: I1013 06:25:34.546551 3426 watchdog_linux.go:95] "Systemd watchdog is not enabled" Oct 13 06:25:34.546726 kubelet[3426]: I1013 06:25:34.546563 3426 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Oct 13 06:25:34.546890 kubelet[3426]: I1013 06:25:34.546873 3426 server.go:956] "Client rotation is on, will bootstrap in background" Oct 13 06:25:34.548087 kubelet[3426]: I1013 06:25:34.548068 3426 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Oct 13 06:25:34.549896 kubelet[3426]: I1013 06:25:34.549885 3426 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 13 06:25:34.551345 kubelet[3426]: I1013 06:25:34.551335 3426 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 13 06:25:34.559163 kubelet[3426]: I1013 06:25:34.559112 3426 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Oct 13 06:25:34.559304 kubelet[3426]: I1013 06:25:34.559231 3426 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 13 06:25:34.559333 kubelet[3426]: I1013 06:25:34.559249 3426 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4487.0.0-a-becc29ce89","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 13 06:25:34.559385 kubelet[3426]: I1013 06:25:34.559333 3426 topology_manager.go:138] "Creating topology manager with none policy" Oct 13 06:25:34.559385 kubelet[3426]: I1013 06:25:34.559339 3426 container_manager_linux.go:306] "Creating device plugin manager" Oct 13 06:25:34.559385 kubelet[3426]: I1013 06:25:34.559353 3426 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Oct 13 06:25:34.559700 kubelet[3426]: I1013 06:25:34.559665 3426 state_mem.go:36] "Initialized new in-memory state store" Oct 13 06:25:34.559813 kubelet[3426]: I1013 06:25:34.559781 3426 kubelet.go:475] "Attempting to sync node with API server" Oct 13 06:25:34.559813 kubelet[3426]: I1013 06:25:34.559789 3426 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 13 06:25:34.559813 kubelet[3426]: I1013 06:25:34.559801 3426 kubelet.go:387] "Adding apiserver pod source" Oct 13 06:25:34.559813 kubelet[3426]: I1013 06:25:34.559814 3426 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 13 06:25:34.560308 kubelet[3426]: I1013 06:25:34.560297 3426 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 13 06:25:34.560596 kubelet[3426]: I1013 06:25:34.560587 3426 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Oct 13 06:25:34.560622 kubelet[3426]: I1013 06:25:34.560606 3426 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Oct 13 
06:25:34.561664 kubelet[3426]: I1013 06:25:34.561655 3426 server.go:1262] "Started kubelet" Oct 13 06:25:34.561713 kubelet[3426]: I1013 06:25:34.561690 3426 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Oct 13 06:25:34.561743 kubelet[3426]: I1013 06:25:34.561702 3426 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 13 06:25:34.561772 kubelet[3426]: I1013 06:25:34.561749 3426 server_v1.go:49] "podresources" method="list" useActivePods=true Oct 13 06:25:34.561982 kubelet[3426]: I1013 06:25:34.561970 3426 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 13 06:25:34.562130 kubelet[3426]: I1013 06:25:34.562121 3426 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 13 06:25:34.562168 kubelet[3426]: I1013 06:25:34.562150 3426 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 13 06:25:34.562276 kubelet[3426]: I1013 06:25:34.562232 3426 volume_manager.go:313] "Starting Kubelet Volume Manager" Oct 13 06:25:34.562276 kubelet[3426]: E1013 06:25:34.562245 3426 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4487.0.0-a-becc29ce89\" not found" Oct 13 06:25:34.562835 kubelet[3426]: I1013 06:25:34.562454 3426 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 13 06:25:34.562835 kubelet[3426]: I1013 06:25:34.562595 3426 reconciler.go:29] "Reconciler: start to sync state" Oct 13 06:25:34.562835 kubelet[3426]: I1013 06:25:34.562702 3426 server.go:310] "Adding debug handlers to kubelet server" Oct 13 06:25:34.562835 kubelet[3426]: I1013 06:25:34.562734 3426 factory.go:223] Registration of the systemd container factory successfully Oct 13 06:25:34.562835 kubelet[3426]: I1013 06:25:34.562805 3426 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 13 06:25:34.564436 kubelet[3426]: E1013 06:25:34.564413 3426 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Oct 13 06:25:34.565187 kubelet[3426]: I1013 06:25:34.565173 3426 factory.go:223] Registration of the containerd container factory successfully Oct 13 06:25:34.569712 kubelet[3426]: I1013 06:25:34.569693 3426 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Oct 13 06:25:34.570194 kubelet[3426]: I1013 06:25:34.570184 3426 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Oct 13 06:25:34.570194 kubelet[3426]: I1013 06:25:34.570194 3426 status_manager.go:244] "Starting to sync pod status with apiserver" Oct 13 06:25:34.570281 kubelet[3426]: I1013 06:25:34.570208 3426 kubelet.go:2427] "Starting kubelet main sync loop" Oct 13 06:25:34.570281 kubelet[3426]: E1013 06:25:34.570231 3426 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 13 06:25:34.579362 kubelet[3426]: I1013 06:25:34.579347 3426 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 13 06:25:34.579362 kubelet[3426]: I1013 06:25:34.579357 3426 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 13 06:25:34.579362 kubelet[3426]: I1013 06:25:34.579369 3426 state_mem.go:36] "Initialized new in-memory state store" Oct 13 06:25:34.579492 kubelet[3426]: I1013 06:25:34.579442 3426 state_mem.go:88] "Updated default CPUSet" cpuSet="" Oct 13 06:25:34.579492 kubelet[3426]: I1013 06:25:34.579449 3426 state_mem.go:96] "Updated CPUSet assignments" assignments={} Oct 13 06:25:34.579492 kubelet[3426]: I1013 06:25:34.579460 3426 policy_none.go:49] "None policy: Start" Oct 13 06:25:34.579492 kubelet[3426]: I1013 06:25:34.579466 3426 memory_manager.go:187] "Starting memorymanager" policy="None" Oct 13 06:25:34.579492 kubelet[3426]: I1013 06:25:34.579471 3426 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Oct 13 06:25:34.579578 kubelet[3426]: I1013 06:25:34.579523 3426 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Oct 13 06:25:34.579578 kubelet[3426]: I1013 06:25:34.579528 3426 policy_none.go:47] "Start" Oct 13 06:25:34.581420 kubelet[3426]: E1013 06:25:34.581375 3426 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Oct 13 06:25:34.581507 kubelet[3426]: I1013 06:25:34.581476 3426 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 13 06:25:34.581507 kubelet[3426]: I1013 06:25:34.581483 3426 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 13 06:25:34.581573 kubelet[3426]: I1013 06:25:34.581565 3426 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 13 06:25:34.581847 kubelet[3426]: E1013 06:25:34.581834 3426 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Oct 13 06:25:34.672646 kubelet[3426]: I1013 06:25:34.672538 3426 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.672646 kubelet[3426]: I1013 06:25:34.672637 3426 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.673020 kubelet[3426]: I1013 06:25:34.672538 3426 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.680762 kubelet[3426]: I1013 06:25:34.680705 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:34.680963 kubelet[3426]: E1013 06:25:34.680859 3426 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4487.0.0-a-becc29ce89\" already exists" pod="kube-system/kube-scheduler-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.681603 kubelet[3426]: I1013 06:25:34.681556 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:34.681603 kubelet[3426]: I1013 06:25:34.681589 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:34.681829 kubelet[3426]: E1013 06:25:34.681685 3426 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" already exists" pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.681829 kubelet[3426]: E1013 06:25:34.681700 3426 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" already exists" pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.686068 kubelet[3426]: I1013 06:25:34.685982 3426 kubelet_node_status.go:75] "Attempting to register node" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.695664 kubelet[3426]: I1013 06:25:34.695618 3426 kubelet_node_status.go:124] "Node was previously registered" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.695846 kubelet[3426]: I1013 06:25:34.695748 3426 kubelet_node_status.go:78] "Successfully registered node" node="ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.863752 kubelet[3426]: I1013 06:25:34.863619 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-k8s-certs\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.863752 kubelet[3426]: I1013 06:25:34.863723 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-kubeconfig\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.864278 kubelet[3426]: I1013 06:25:34.863787 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/f5cbea7c8ee68c74c150ca61efdb0b33-kubeconfig\") pod \"kube-scheduler-ci-4487.0.0-a-becc29ce89\" (UID: \"f5cbea7c8ee68c74c150ca61efdb0b33\") " pod="kube-system/kube-scheduler-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.864278 kubelet[3426]: I1013 06:25:34.863835 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fee0fd67d59afcfa4f9efd0c6357c6a9-ca-certs\") pod \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" (UID: \"fee0fd67d59afcfa4f9efd0c6357c6a9\") " pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.864278 kubelet[3426]: I1013 06:25:34.863895 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fee0fd67d59afcfa4f9efd0c6357c6a9-k8s-certs\") pod \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" (UID: \"fee0fd67d59afcfa4f9efd0c6357c6a9\") " pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.864278 kubelet[3426]: I1013 06:25:34.863952 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fee0fd67d59afcfa4f9efd0c6357c6a9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" (UID: \"fee0fd67d59afcfa4f9efd0c6357c6a9\") " pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.864278 kubelet[3426]: I1013 06:25:34.864078 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-ca-certs\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.864835 kubelet[3426]: I1013 06:25:34.864265 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-flexvolume-dir\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:34.864835 kubelet[3426]: I1013 06:25:34.864371 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ac50836ebba401cb4102e7bef9eb0483-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" (UID: \"ac50836ebba401cb4102e7bef9eb0483\") " pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:35.560619 kubelet[3426]: I1013 06:25:35.560597 3426 apiserver.go:52] "Watching apiserver" Oct 13 06:25:35.563104 kubelet[3426]: I1013 06:25:35.563081 3426 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 13 06:25:35.576067 kubelet[3426]: I1013 06:25:35.576050 3426 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:35.576203 kubelet[3426]: I1013 06:25:35.576154 3426 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:35.576271 kubelet[3426]: I1013 06:25:35.576217 3426 kubelet.go:3219] "Creating a mirror pod 
for static pod" pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:35.597728 kubelet[3426]: I1013 06:25:35.597702 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:35.597838 kubelet[3426]: E1013 06:25:35.597745 3426 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4487.0.0-a-becc29ce89\" already exists" pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:35.597838 kubelet[3426]: I1013 06:25:35.597758 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:35.597838 kubelet[3426]: E1013 06:25:35.597789 3426 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4487.0.0-a-becc29ce89\" already exists" pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:35.597965 kubelet[3426]: I1013 06:25:35.597749 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Oct 13 06:25:35.597965 kubelet[3426]: E1013 06:25:35.597896 3426 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4487.0.0-a-becc29ce89\" already exists" pod="kube-system/kube-scheduler-ci-4487.0.0-a-becc29ce89" Oct 13 06:25:35.603396 kubelet[3426]: I1013 06:25:35.603337 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4487.0.0-a-becc29ce89" podStartSLOduration=3.60331992 podStartE2EDuration="3.60331992s" podCreationTimestamp="2025-10-13 06:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:25:35.597541688 +0000 UTC m=+1.091894837" watchObservedRunningTime="2025-10-13 06:25:35.60331992 +0000 UTC m=+1.097673073" Oct 13 06:25:35.609045 kubelet[3426]: I1013 06:25:35.608997 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4487.0.0-a-becc29ce89" podStartSLOduration=3.60898074 podStartE2EDuration="3.60898074s" podCreationTimestamp="2025-10-13 06:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:25:35.603470669 +0000 UTC m=+1.097823825" watchObservedRunningTime="2025-10-13 06:25:35.60898074 +0000 UTC m=+1.103333890" Oct 13 06:25:35.609149 kubelet[3426]: I1013 06:25:35.609097 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4487.0.0-a-becc29ce89" podStartSLOduration=3.6090874299999998 podStartE2EDuration="3.60908743s" podCreationTimestamp="2025-10-13 06:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:25:35.608809171 +0000 UTC m=+1.103162330" watchObservedRunningTime="2025-10-13 06:25:35.60908743 +0000 UTC m=+1.103440579" Oct 13 06:25:40.870557 systemd[1]: Created slice kubepods-besteffort-pod7bd24d85_b71c_4ae0_a8fe_5306ee02617d.slice - libcontainer container kubepods-besteffort-pod7bd24d85_b71c_4ae0_a8fe_5306ee02617d.slice. 
Oct 13 06:25:40.900928 kubelet[3426]: I1013 06:25:40.900825 3426 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Oct 13 06:25:40.901769 containerd[2022]: time="2025-10-13T06:25:40.901501305Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Oct 13 06:25:40.902401 kubelet[3426]: I1013 06:25:40.901890 3426 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Oct 13 06:25:40.902401 kubelet[3426]: I1013 06:25:40.902160 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bd24d85-b71c-4ae0-a8fe-5306ee02617d-lib-modules\") pod \"kube-proxy-c9ccx\" (UID: \"7bd24d85-b71c-4ae0-a8fe-5306ee02617d\") " pod="kube-system/kube-proxy-c9ccx" Oct 13 06:25:40.902401 kubelet[3426]: I1013 06:25:40.902275 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7bd24d85-b71c-4ae0-a8fe-5306ee02617d-kube-proxy\") pod \"kube-proxy-c9ccx\" (UID: \"7bd24d85-b71c-4ae0-a8fe-5306ee02617d\") " pod="kube-system/kube-proxy-c9ccx" Oct 13 06:25:40.902401 kubelet[3426]: I1013 06:25:40.902364 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7bd24d85-b71c-4ae0-a8fe-5306ee02617d-xtables-lock\") pod \"kube-proxy-c9ccx\" (UID: \"7bd24d85-b71c-4ae0-a8fe-5306ee02617d\") " pod="kube-system/kube-proxy-c9ccx" Oct 13 06:25:40.902797 kubelet[3426]: I1013 06:25:40.902457 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjh77\" (UniqueName: \"kubernetes.io/projected/7bd24d85-b71c-4ae0-a8fe-5306ee02617d-kube-api-access-xjh77\") pod \"kube-proxy-c9ccx\" (UID: \"7bd24d85-b71c-4ae0-a8fe-5306ee02617d\") " pod="kube-system/kube-proxy-c9ccx" Oct 13 06:25:41.016830 kubelet[3426]: E1013 06:25:41.016720 3426 projected.go:291] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Oct 13 06:25:41.016830 kubelet[3426]: E1013 06:25:41.016783 3426 projected.go:196] Error preparing data for projected volume kube-api-access-xjh77 for pod kube-system/kube-proxy-c9ccx: configmap "kube-root-ca.crt" not found Oct 13 06:25:41.017164 kubelet[3426]: E1013 06:25:41.016931 3426 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7bd24d85-b71c-4ae0-a8fe-5306ee02617d-kube-api-access-xjh77 podName:7bd24d85-b71c-4ae0-a8fe-5306ee02617d nodeName:}" failed. No retries permitted until 2025-10-13 06:25:41.516867517 +0000 UTC m=+7.011220732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xjh77" (UniqueName: "kubernetes.io/projected/7bd24d85-b71c-4ae0-a8fe-5306ee02617d-kube-api-access-xjh77") pod "kube-proxy-c9ccx" (UID: "7bd24d85-b71c-4ae0-a8fe-5306ee02617d") : configmap "kube-root-ca.crt" not found Oct 13 06:25:41.607940 kubelet[3426]: E1013 06:25:41.607838 3426 projected.go:291] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Oct 13 06:25:41.607940 kubelet[3426]: E1013 06:25:41.607901 3426 projected.go:196] Error preparing data for projected volume kube-api-access-xjh77 for pod kube-system/kube-proxy-c9ccx: configmap "kube-root-ca.crt" not found Oct 13 06:25:41.608368 kubelet[3426]: E1013 06:25:41.608035 3426 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7bd24d85-b71c-4ae0-a8fe-5306ee02617d-kube-api-access-xjh77 podName:7bd24d85-b71c-4ae0-a8fe-5306ee02617d nodeName:}" failed. No retries permitted until 2025-10-13 06:25:42.607981761 +0000 UTC m=+8.102334961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xjh77" (UniqueName: "kubernetes.io/projected/7bd24d85-b71c-4ae0-a8fe-5306ee02617d-kube-api-access-xjh77") pod "kube-proxy-c9ccx" (UID: "7bd24d85-b71c-4ae0-a8fe-5306ee02617d") : configmap "kube-root-ca.crt" not found Oct 13 06:25:42.065545 systemd[1]: Created slice kubepods-besteffort-pod365d90f9_fb54_459a_9583_a0aa3967e6db.slice - libcontainer container kubepods-besteffort-pod365d90f9_fb54_459a_9583_a0aa3967e6db.slice. Oct 13 06:25:42.111106 kubelet[3426]: I1013 06:25:42.111024 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq7sj\" (UniqueName: \"kubernetes.io/projected/365d90f9-fb54-459a-9583-a0aa3967e6db-kube-api-access-dq7sj\") pod \"tigera-operator-db78d5bd4-jmtgn\" (UID: \"365d90f9-fb54-459a-9583-a0aa3967e6db\") " pod="tigera-operator/tigera-operator-db78d5bd4-jmtgn" Oct 13 06:25:42.111106 kubelet[3426]: I1013 06:25:42.111104 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/365d90f9-fb54-459a-9583-a0aa3967e6db-var-lib-calico\") pod \"tigera-operator-db78d5bd4-jmtgn\" (UID: \"365d90f9-fb54-459a-9583-a0aa3967e6db\") " pod="tigera-operator/tigera-operator-db78d5bd4-jmtgn" Oct 13 06:25:42.370944 containerd[2022]: time="2025-10-13T06:25:42.370858075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-db78d5bd4-jmtgn,Uid:365d90f9-fb54-459a-9583-a0aa3967e6db,Namespace:tigera-operator,Attempt:0,}" Oct 13 06:25:42.378418 containerd[2022]: time="2025-10-13T06:25:42.378348429Z" level=info msg="connecting to shim 77892dca9f489efaee76162acafe398f9c8b2b0728533e01df17dba1da0f8f9e" address="unix:///run/containerd/s/78aa5ad9d601b220f5b635a2c65fa36c9a729ee3ec8f8494fa8e972e54db11e2" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:25:42.394552 systemd[1]: Started cri-containerd-77892dca9f489efaee76162acafe398f9c8b2b0728533e01df17dba1da0f8f9e.scope - libcontainer container 77892dca9f489efaee76162acafe398f9c8b2b0728533e01df17dba1da0f8f9e. 
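The MountVolume.SetUp failures for kube-api-access-xjh77 are not connectivity problems on the kubelet side: the projected service-account volume bundles the "kube-root-ca.crt" ConfigMap from the pod's namespace, and that ConfigMap is only published once the controller responsible for it (running in kube-controller-manager) has done so, so the mount is retried (500ms, then 1s) until it appears. A hedged client-go sketch of the lookup that keeps failing is below; the kubeconfig path is an assumption.

```go
// rootca.go — a hedged sketch of the lookup behind the retries above: the
// kube-api-access-* projected volume needs the "kube-root-ca.crt" ConfigMap
// in the pod's namespace. The kubeconfig path is an assumption.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf") // assumed path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	cm, err := cs.CoreV1().ConfigMaps("kube-system").Get(context.Background(), "kube-root-ca.crt", metav1.GetOptions{})
	if err != nil {
		fmt.Println("not published yet:", err) // the state the volume mount keeps hitting
		return
	}
	fmt.Printf("kube-root-ca.crt present with %d key(s)\n", len(cm.Data))
}
```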
Oct 13 06:25:42.424864 containerd[2022]: time="2025-10-13T06:25:42.424838681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-db78d5bd4-jmtgn,Uid:365d90f9-fb54-459a-9583-a0aa3967e6db,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"77892dca9f489efaee76162acafe398f9c8b2b0728533e01df17dba1da0f8f9e\"" Oct 13 06:25:42.425804 containerd[2022]: time="2025-10-13T06:25:42.425741395Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Oct 13 06:25:42.688258 containerd[2022]: time="2025-10-13T06:25:42.688162212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c9ccx,Uid:7bd24d85-b71c-4ae0-a8fe-5306ee02617d,Namespace:kube-system,Attempt:0,}" Oct 13 06:25:42.695502 containerd[2022]: time="2025-10-13T06:25:42.695480225Z" level=info msg="connecting to shim 8dfd41b64ebec994397295617479b23eff1a864f0f04ba82cf39d4400590edfc" address="unix:///run/containerd/s/f9be7b1de7f3520bd08599472c67f4ca5aa42c4b204b3509dccf71d9d6db0c19" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:25:42.721512 systemd[1]: Started cri-containerd-8dfd41b64ebec994397295617479b23eff1a864f0f04ba82cf39d4400590edfc.scope - libcontainer container 8dfd41b64ebec994397295617479b23eff1a864f0f04ba82cf39d4400590edfc. Oct 13 06:25:42.737091 containerd[2022]: time="2025-10-13T06:25:42.737071633Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c9ccx,Uid:7bd24d85-b71c-4ae0-a8fe-5306ee02617d,Namespace:kube-system,Attempt:0,} returns sandbox id \"8dfd41b64ebec994397295617479b23eff1a864f0f04ba82cf39d4400590edfc\"" Oct 13 06:25:42.738999 containerd[2022]: time="2025-10-13T06:25:42.738984786Z" level=info msg="CreateContainer within sandbox \"8dfd41b64ebec994397295617479b23eff1a864f0f04ba82cf39d4400590edfc\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Oct 13 06:25:42.742902 containerd[2022]: time="2025-10-13T06:25:42.742858902Z" level=info msg="Container f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:25:42.746068 containerd[2022]: time="2025-10-13T06:25:42.746026663Z" level=info msg="CreateContainer within sandbox \"8dfd41b64ebec994397295617479b23eff1a864f0f04ba82cf39d4400590edfc\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3\"" Oct 13 06:25:42.746341 containerd[2022]: time="2025-10-13T06:25:42.746281713Z" level=info msg="StartContainer for \"f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3\"" Oct 13 06:25:42.747022 containerd[2022]: time="2025-10-13T06:25:42.746978619Z" level=info msg="connecting to shim f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3" address="unix:///run/containerd/s/f9be7b1de7f3520bd08599472c67f4ca5aa42c4b204b3509dccf71d9d6db0c19" protocol=ttrpc version=3 Oct 13 06:25:42.760407 systemd[1]: Started cri-containerd-f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3.scope - libcontainer container f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3. 
Oct 13 06:25:42.781163 containerd[2022]: time="2025-10-13T06:25:42.781143059Z" level=info msg="StartContainer for \"f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3\" returns successfully" Oct 13 06:25:43.604993 kubelet[3426]: I1013 06:25:43.604908 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-c9ccx" podStartSLOduration=3.604893722 podStartE2EDuration="3.604893722s" podCreationTimestamp="2025-10-13 06:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:25:43.604781215 +0000 UTC m=+9.099134358" watchObservedRunningTime="2025-10-13 06:25:43.604893722 +0000 UTC m=+9.099246862" Oct 13 06:25:43.675907 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1646571042.mount: Deactivated successfully. Oct 13 06:25:43.953172 containerd[2022]: time="2025-10-13T06:25:43.953087181Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:43.953389 containerd[2022]: time="2025-10-13T06:25:43.953244610Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Oct 13 06:25:43.953724 containerd[2022]: time="2025-10-13T06:25:43.953683884Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:43.954581 containerd[2022]: time="2025-10-13T06:25:43.954541715Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:43.954947 containerd[2022]: time="2025-10-13T06:25:43.954902586Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.529145076s" Oct 13 06:25:43.954947 containerd[2022]: time="2025-10-13T06:25:43.954917739Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Oct 13 06:25:43.956411 containerd[2022]: time="2025-10-13T06:25:43.956381549Z" level=info msg="CreateContainer within sandbox \"77892dca9f489efaee76162acafe398f9c8b2b0728533e01df17dba1da0f8f9e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Oct 13 06:25:43.958931 containerd[2022]: time="2025-10-13T06:25:43.958919542Z" level=info msg="Container a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:25:43.960924 containerd[2022]: time="2025-10-13T06:25:43.960909007Z" level=info msg="CreateContainer within sandbox \"77892dca9f489efaee76162acafe398f9c8b2b0728533e01df17dba1da0f8f9e\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d\"" Oct 13 06:25:43.961169 containerd[2022]: time="2025-10-13T06:25:43.961156874Z" level=info msg="StartContainer for \"a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d\"" Oct 13 06:25:43.961595 containerd[2022]: 
time="2025-10-13T06:25:43.961583005Z" level=info msg="connecting to shim a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d" address="unix:///run/containerd/s/78aa5ad9d601b220f5b635a2c65fa36c9a729ee3ec8f8494fa8e972e54db11e2" protocol=ttrpc version=3 Oct 13 06:25:43.985522 systemd[1]: Started cri-containerd-a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d.scope - libcontainer container a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d. Oct 13 06:25:43.999605 containerd[2022]: time="2025-10-13T06:25:43.999582083Z" level=info msg="StartContainer for \"a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d\" returns successfully" Oct 13 06:25:44.824060 kubelet[3426]: I1013 06:25:44.824017 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-db78d5bd4-jmtgn" podStartSLOduration=1.294236363 podStartE2EDuration="2.824006124s" podCreationTimestamp="2025-10-13 06:25:42 +0000 UTC" firstStartedPulling="2025-10-13 06:25:42.425573304 +0000 UTC m=+7.919926442" lastFinishedPulling="2025-10-13 06:25:43.955343061 +0000 UTC m=+9.449696203" observedRunningTime="2025-10-13 06:25:44.606461652 +0000 UTC m=+10.100814795" watchObservedRunningTime="2025-10-13 06:25:44.824006124 +0000 UTC m=+10.318359263" Oct 13 06:25:47.774594 update_engine[2017]: I20251013 06:25:47.774447 2017 update_attempter.cc:509] Updating boot flags... Oct 13 06:25:48.519138 sudo[2337]: pam_unix(sudo:session): session closed for user root Oct 13 06:25:48.519900 sshd[2336]: Connection closed by 139.178.68.195 port 49122 Oct 13 06:25:48.520090 sshd-session[2333]: pam_unix(sshd:session): session closed for user core Oct 13 06:25:48.522507 systemd[1]: sshd@9-139.178.94.13:22-139.178.68.195:49122.service: Deactivated successfully. Oct 13 06:25:48.523728 systemd[1]: session-11.scope: Deactivated successfully. Oct 13 06:25:48.523859 systemd[1]: session-11.scope: Consumed 3.831s CPU time, 236.7M memory peak. Oct 13 06:25:48.524902 systemd-logind[2006]: Session 11 logged out. Waiting for processes to exit. Oct 13 06:25:48.525516 systemd-logind[2006]: Removed session 11. Oct 13 06:25:50.857408 systemd[1]: Created slice kubepods-besteffort-pod62ed53da_4de6_49b2_bd2f_794794e98338.slice - libcontainer container kubepods-besteffort-pod62ed53da_4de6_49b2_bd2f_794794e98338.slice. 
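The podStartSLOduration / podStartE2EDuration pair in the tigera-operator pod_startup_latency_tracker entry above fits the usual relation: the SLO figure is the end-to-end startup time with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted. A quick arithmetic check against the values in the log, as an illustration rather than part of the log itself:

package main

import "fmt"

func main() {
	// Values copied from the tigera-operator pod_startup_latency_tracker entry.
	e2e := 2.824006124                  // podStartE2EDuration in seconds
	pull := 43.955343061 - 42.425573304 // lastFinishedPulling - firstStartedPulling (same minute)
	// Prints ~1.294236367 s, agreeing with the reported podStartSLOduration
	// of 1.294236363 to within rounding.
	fmt.Printf("e2e - pull = %.9f s\n", e2e-pull)
}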
Oct 13 06:25:50.868267 kubelet[3426]: I1013 06:25:50.868246 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xltr6\" (UniqueName: \"kubernetes.io/projected/62ed53da-4de6-49b2-bd2f-794794e98338-kube-api-access-xltr6\") pod \"calico-typha-6fc8898dc4-zcvcv\" (UID: \"62ed53da-4de6-49b2-bd2f-794794e98338\") " pod="calico-system/calico-typha-6fc8898dc4-zcvcv" Oct 13 06:25:50.868267 kubelet[3426]: I1013 06:25:50.868270 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ed53da-4de6-49b2-bd2f-794794e98338-tigera-ca-bundle\") pod \"calico-typha-6fc8898dc4-zcvcv\" (UID: \"62ed53da-4de6-49b2-bd2f-794794e98338\") " pod="calico-system/calico-typha-6fc8898dc4-zcvcv" Oct 13 06:25:50.868499 kubelet[3426]: I1013 06:25:50.868281 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/62ed53da-4de6-49b2-bd2f-794794e98338-typha-certs\") pod \"calico-typha-6fc8898dc4-zcvcv\" (UID: \"62ed53da-4de6-49b2-bd2f-794794e98338\") " pod="calico-system/calico-typha-6fc8898dc4-zcvcv" Oct 13 06:25:51.162108 containerd[2022]: time="2025-10-13T06:25:51.161983726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6fc8898dc4-zcvcv,Uid:62ed53da-4de6-49b2-bd2f-794794e98338,Namespace:calico-system,Attempt:0,}" Oct 13 06:25:51.169710 containerd[2022]: time="2025-10-13T06:25:51.169639178Z" level=info msg="connecting to shim c9d2c6ca21dd39c454f4f16fd7a70121d5c9438e27d0d4bcd8b86c1ddcee611d" address="unix:///run/containerd/s/8ce3338e5d6c13a86cdbcac5bd17b7f82cf3195a27e33536155eec35d1e30382" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:25:51.190435 systemd[1]: Started cri-containerd-c9d2c6ca21dd39c454f4f16fd7a70121d5c9438e27d0d4bcd8b86c1ddcee611d.scope - libcontainer container c9d2c6ca21dd39c454f4f16fd7a70121d5c9438e27d0d4bcd8b86c1ddcee611d. Oct 13 06:25:51.197277 systemd[1]: Created slice kubepods-besteffort-pod139f11de_073e_411e_a9ef_dcf0a623ad41.slice - libcontainer container kubepods-besteffort-pod139f11de_073e_411e_a9ef_dcf0a623ad41.slice. 
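The VerifyControllerAttachedVolume entries above correspond to the three volumes in the calico-typha pod spec: the projected kube-api-access volume, the tigera-ca-bundle ConfigMap, and the typha-certs Secret. As a hedged illustration of how the latter two sources map onto core/v1 volume types (the real Deployment is rendered by the tigera operator; only the volume names are taken from the log, everything else here is assumption):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Illustrative sketch: just the ConfigMap and Secret volume sources.
	volumes := []corev1.Volume{
		{
			Name: "tigera-ca-bundle",
			VolumeSource: corev1.VolumeSource{
				ConfigMap: &corev1.ConfigMapVolumeSource{
					LocalObjectReference: corev1.LocalObjectReference{Name: "tigera-ca-bundle"},
				},
			},
		},
		{
			Name: "typha-certs",
			VolumeSource: corev1.VolumeSource{
				Secret: &corev1.SecretVolumeSource{SecretName: "typha-certs"},
			},
		},
	}
	for _, v := range volumes {
		fmt.Println(v.Name)
	}
}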
Oct 13 06:25:51.217996 containerd[2022]: time="2025-10-13T06:25:51.217945054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6fc8898dc4-zcvcv,Uid:62ed53da-4de6-49b2-bd2f-794794e98338,Namespace:calico-system,Attempt:0,} returns sandbox id \"c9d2c6ca21dd39c454f4f16fd7a70121d5c9438e27d0d4bcd8b86c1ddcee611d\"" Oct 13 06:25:51.218618 containerd[2022]: time="2025-10-13T06:25:51.218604732Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Oct 13 06:25:51.272374 kubelet[3426]: I1013 06:25:51.272315 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-cni-log-dir\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272374 kubelet[3426]: I1013 06:25:51.272340 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-lib-modules\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272374 kubelet[3426]: I1013 06:25:51.272349 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-policysync\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272374 kubelet[3426]: I1013 06:25:51.272358 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-var-lib-calico\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272374 kubelet[3426]: I1013 06:25:51.272368 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-flexvol-driver-host\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272551 kubelet[3426]: I1013 06:25:51.272378 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-xtables-lock\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272551 kubelet[3426]: I1013 06:25:51.272388 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sghwd\" (UniqueName: \"kubernetes.io/projected/139f11de-073e-411e-a9ef-dcf0a623ad41-kube-api-access-sghwd\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272551 kubelet[3426]: I1013 06:25:51.272398 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-cni-net-dir\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " 
pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272551 kubelet[3426]: I1013 06:25:51.272409 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/139f11de-073e-411e-a9ef-dcf0a623ad41-node-certs\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272551 kubelet[3426]: I1013 06:25:51.272420 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-cni-bin-dir\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272636 kubelet[3426]: I1013 06:25:51.272453 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/139f11de-073e-411e-a9ef-dcf0a623ad41-var-run-calico\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.272636 kubelet[3426]: I1013 06:25:51.272475 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/139f11de-073e-411e-a9ef-dcf0a623ad41-tigera-ca-bundle\") pod \"calico-node-j9px9\" (UID: \"139f11de-073e-411e-a9ef-dcf0a623ad41\") " pod="calico-system/calico-node-j9px9" Oct 13 06:25:51.375934 kubelet[3426]: E1013 06:25:51.375855 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.375934 kubelet[3426]: W1013 06:25:51.375901 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.375934 kubelet[3426]: E1013 06:25:51.375943 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.381213 kubelet[3426]: E1013 06:25:51.381152 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.381213 kubelet[3426]: W1013 06:25:51.381189 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.381626 kubelet[3426]: E1013 06:25:51.381226 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.393230 kubelet[3426]: E1013 06:25:51.393179 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.393230 kubelet[3426]: W1013 06:25:51.393220 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.393566 kubelet[3426]: E1013 06:25:51.393293 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.501499 containerd[2022]: time="2025-10-13T06:25:51.501390015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j9px9,Uid:139f11de-073e-411e-a9ef-dcf0a623ad41,Namespace:calico-system,Attempt:0,}" Oct 13 06:25:51.507278 kubelet[3426]: E1013 06:25:51.507229 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x788d" podUID="da2b5110-cb9c-4d9e-a520-e14947dc335c" Oct 13 06:25:51.510009 containerd[2022]: time="2025-10-13T06:25:51.509974967Z" level=info msg="connecting to shim a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a" address="unix:///run/containerd/s/e4584b9a126e89f3b0b7839934dd592e2b8acaeae7400dd8feccecef417dbd9f" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:25:51.532560 systemd[1]: Started cri-containerd-a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a.scope - libcontainer container a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a. Oct 13 06:25:51.543360 containerd[2022]: time="2025-10-13T06:25:51.543332257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j9px9,Uid:139f11de-073e-411e-a9ef-dcf0a623ad41,Namespace:calico-system,Attempt:0,} returns sandbox id \"a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a\"" Oct 13 06:25:51.562401 kubelet[3426]: E1013 06:25:51.562356 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.562401 kubelet[3426]: W1013 06:25:51.562369 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.562401 kubelet[3426]: E1013 06:25:51.562380 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.562570 kubelet[3426]: E1013 06:25:51.562528 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.562570 kubelet[3426]: W1013 06:25:51.562535 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.562570 kubelet[3426]: E1013 06:25:51.562542 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.562704 kubelet[3426]: E1013 06:25:51.562666 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.562704 kubelet[3426]: W1013 06:25:51.562674 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.562704 kubelet[3426]: E1013 06:25:51.562681 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.562838 kubelet[3426]: E1013 06:25:51.562792 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.562838 kubelet[3426]: W1013 06:25:51.562797 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.562838 kubelet[3426]: E1013 06:25:51.562802 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.562909 kubelet[3426]: E1013 06:25:51.562887 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.562909 kubelet[3426]: W1013 06:25:51.562892 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.562909 kubelet[3426]: E1013 06:25:51.562897 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.562969 kubelet[3426]: E1013 06:25:51.562962 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.562969 kubelet[3426]: W1013 06:25:51.562966 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563006 kubelet[3426]: E1013 06:25:51.562971 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563075 kubelet[3426]: E1013 06:25:51.563036 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563075 kubelet[3426]: W1013 06:25:51.563041 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563075 kubelet[3426]: E1013 06:25:51.563046 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.563149 kubelet[3426]: E1013 06:25:51.563110 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563149 kubelet[3426]: W1013 06:25:51.563115 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563149 kubelet[3426]: E1013 06:25:51.563120 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563229 kubelet[3426]: E1013 06:25:51.563187 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563229 kubelet[3426]: W1013 06:25:51.563193 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563229 kubelet[3426]: E1013 06:25:51.563198 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563300 kubelet[3426]: E1013 06:25:51.563270 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563300 kubelet[3426]: W1013 06:25:51.563274 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563300 kubelet[3426]: E1013 06:25:51.563279 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563364 kubelet[3426]: E1013 06:25:51.563353 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563364 kubelet[3426]: W1013 06:25:51.563358 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563364 kubelet[3426]: E1013 06:25:51.563363 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563432 kubelet[3426]: E1013 06:25:51.563427 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563432 kubelet[3426]: W1013 06:25:51.563432 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563471 kubelet[3426]: E1013 06:25:51.563436 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.563513 kubelet[3426]: E1013 06:25:51.563506 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563513 kubelet[3426]: W1013 06:25:51.563511 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563555 kubelet[3426]: E1013 06:25:51.563516 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563585 kubelet[3426]: E1013 06:25:51.563580 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563585 kubelet[3426]: W1013 06:25:51.563585 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563621 kubelet[3426]: E1013 06:25:51.563590 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563659 kubelet[3426]: E1013 06:25:51.563654 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563679 kubelet[3426]: W1013 06:25:51.563658 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563679 kubelet[3426]: E1013 06:25:51.563663 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563732 kubelet[3426]: E1013 06:25:51.563727 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563732 kubelet[3426]: W1013 06:25:51.563732 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563780 kubelet[3426]: E1013 06:25:51.563736 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.563824 kubelet[3426]: E1013 06:25:51.563816 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563824 kubelet[3426]: W1013 06:25:51.563822 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563892 kubelet[3426]: E1013 06:25:51.563828 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.563930 kubelet[3426]: E1013 06:25:51.563905 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.563930 kubelet[3426]: W1013 06:25:51.563912 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.563930 kubelet[3426]: E1013 06:25:51.563918 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.564024 kubelet[3426]: E1013 06:25:51.563989 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.564024 kubelet[3426]: W1013 06:25:51.563993 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.564024 kubelet[3426]: E1013 06:25:51.563999 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.564105 kubelet[3426]: E1013 06:25:51.564070 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.564105 kubelet[3426]: W1013 06:25:51.564075 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.564105 kubelet[3426]: E1013 06:25:51.564081 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.576146 kubelet[3426]: E1013 06:25:51.576073 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.576146 kubelet[3426]: W1013 06:25:51.576119 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.576441 kubelet[3426]: E1013 06:25:51.576159 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.576441 kubelet[3426]: I1013 06:25:51.576214 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da2b5110-cb9c-4d9e-a520-e14947dc335c-kubelet-dir\") pod \"csi-node-driver-x788d\" (UID: \"da2b5110-cb9c-4d9e-a520-e14947dc335c\") " pod="calico-system/csi-node-driver-x788d" Oct 13 06:25:51.576769 kubelet[3426]: E1013 06:25:51.576725 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.576769 kubelet[3426]: W1013 06:25:51.576759 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.577082 kubelet[3426]: E1013 06:25:51.576793 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.577082 kubelet[3426]: I1013 06:25:51.576848 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/da2b5110-cb9c-4d9e-a520-e14947dc335c-registration-dir\") pod \"csi-node-driver-x788d\" (UID: \"da2b5110-cb9c-4d9e-a520-e14947dc335c\") " pod="calico-system/csi-node-driver-x788d" Oct 13 06:25:51.577428 kubelet[3426]: E1013 06:25:51.577392 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.577611 kubelet[3426]: W1013 06:25:51.577434 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.577611 kubelet[3426]: E1013 06:25:51.577476 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.578020 kubelet[3426]: E1013 06:25:51.577971 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.578020 kubelet[3426]: W1013 06:25:51.578001 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.578356 kubelet[3426]: E1013 06:25:51.578032 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.578567 kubelet[3426]: E1013 06:25:51.578514 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.578567 kubelet[3426]: W1013 06:25:51.578550 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.578799 kubelet[3426]: E1013 06:25:51.578581 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.578799 kubelet[3426]: I1013 06:25:51.578645 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/da2b5110-cb9c-4d9e-a520-e14947dc335c-varrun\") pod \"csi-node-driver-x788d\" (UID: \"da2b5110-cb9c-4d9e-a520-e14947dc335c\") " pod="calico-system/csi-node-driver-x788d" Oct 13 06:25:51.579212 kubelet[3426]: E1013 06:25:51.579146 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.579212 kubelet[3426]: W1013 06:25:51.579183 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.579559 kubelet[3426]: E1013 06:25:51.579215 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.579732 kubelet[3426]: E1013 06:25:51.579630 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.579732 kubelet[3426]: W1013 06:25:51.579659 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.579732 kubelet[3426]: E1013 06:25:51.579686 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.580268 kubelet[3426]: E1013 06:25:51.580200 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.580268 kubelet[3426]: W1013 06:25:51.580228 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.580601 kubelet[3426]: E1013 06:25:51.580301 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.580601 kubelet[3426]: I1013 06:25:51.580367 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzvh5\" (UniqueName: \"kubernetes.io/projected/da2b5110-cb9c-4d9e-a520-e14947dc335c-kube-api-access-dzvh5\") pod \"csi-node-driver-x788d\" (UID: \"da2b5110-cb9c-4d9e-a520-e14947dc335c\") " pod="calico-system/csi-node-driver-x788d" Oct 13 06:25:51.580966 kubelet[3426]: E1013 06:25:51.580898 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.580966 kubelet[3426]: W1013 06:25:51.580946 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.581187 kubelet[3426]: E1013 06:25:51.580993 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.581521 kubelet[3426]: E1013 06:25:51.581488 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.581617 kubelet[3426]: W1013 06:25:51.581525 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.581617 kubelet[3426]: E1013 06:25:51.581563 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.582065 kubelet[3426]: E1013 06:25:51.582035 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.582156 kubelet[3426]: W1013 06:25:51.582069 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.582156 kubelet[3426]: E1013 06:25:51.582106 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.582405 kubelet[3426]: I1013 06:25:51.582182 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da2b5110-cb9c-4d9e-a520-e14947dc335c-socket-dir\") pod \"csi-node-driver-x788d\" (UID: \"da2b5110-cb9c-4d9e-a520-e14947dc335c\") " pod="calico-system/csi-node-driver-x788d" Oct 13 06:25:51.582707 kubelet[3426]: E1013 06:25:51.582652 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.582707 kubelet[3426]: W1013 06:25:51.582700 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.582912 kubelet[3426]: E1013 06:25:51.582746 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.583282 kubelet[3426]: E1013 06:25:51.583198 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.583421 kubelet[3426]: W1013 06:25:51.583283 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.583421 kubelet[3426]: E1013 06:25:51.583332 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.583886 kubelet[3426]: E1013 06:25:51.583849 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.583886 kubelet[3426]: W1013 06:25:51.583876 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.584093 kubelet[3426]: E1013 06:25:51.583916 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.584386 kubelet[3426]: E1013 06:25:51.584353 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.584386 kubelet[3426]: W1013 06:25:51.584380 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.584665 kubelet[3426]: E1013 06:25:51.584406 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.682940 kubelet[3426]: E1013 06:25:51.682913 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.682940 kubelet[3426]: W1013 06:25:51.682931 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.683112 kubelet[3426]: E1013 06:25:51.682952 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.683186 kubelet[3426]: E1013 06:25:51.683131 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.683186 kubelet[3426]: W1013 06:25:51.683144 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.683186 kubelet[3426]: E1013 06:25:51.683158 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.683366 kubelet[3426]: E1013 06:25:51.683350 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.683366 kubelet[3426]: W1013 06:25:51.683363 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.683469 kubelet[3426]: E1013 06:25:51.683377 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.683531 kubelet[3426]: E1013 06:25:51.683521 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.683531 kubelet[3426]: W1013 06:25:51.683529 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.683638 kubelet[3426]: E1013 06:25:51.683538 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.683688 kubelet[3426]: E1013 06:25:51.683667 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.683688 kubelet[3426]: W1013 06:25:51.683675 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.683688 kubelet[3426]: E1013 06:25:51.683683 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.683850 kubelet[3426]: E1013 06:25:51.683838 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.683850 kubelet[3426]: W1013 06:25:51.683847 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.683923 kubelet[3426]: E1013 06:25:51.683855 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.684012 kubelet[3426]: E1013 06:25:51.684003 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.684012 kubelet[3426]: W1013 06:25:51.684011 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.684071 kubelet[3426]: E1013 06:25:51.684020 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.684269 kubelet[3426]: E1013 06:25:51.684248 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.684330 kubelet[3426]: W1013 06:25:51.684268 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.684330 kubelet[3426]: E1013 06:25:51.684286 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.684487 kubelet[3426]: E1013 06:25:51.684476 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.684487 kubelet[3426]: W1013 06:25:51.684486 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.684545 kubelet[3426]: E1013 06:25:51.684497 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.684650 kubelet[3426]: E1013 06:25:51.684641 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.684683 kubelet[3426]: W1013 06:25:51.684650 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.684683 kubelet[3426]: E1013 06:25:51.684659 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.684833 kubelet[3426]: E1013 06:25:51.684823 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.684833 kubelet[3426]: W1013 06:25:51.684832 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.684902 kubelet[3426]: E1013 06:25:51.684840 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.685081 kubelet[3426]: E1013 06:25:51.685070 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.685115 kubelet[3426]: W1013 06:25:51.685081 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.685115 kubelet[3426]: E1013 06:25:51.685090 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.685300 kubelet[3426]: E1013 06:25:51.685287 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.685342 kubelet[3426]: W1013 06:25:51.685302 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.685342 kubelet[3426]: E1013 06:25:51.685317 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.685492 kubelet[3426]: E1013 06:25:51.685481 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.685524 kubelet[3426]: W1013 06:25:51.685494 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.685524 kubelet[3426]: E1013 06:25:51.685509 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.685689 kubelet[3426]: E1013 06:25:51.685675 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.685728 kubelet[3426]: W1013 06:25:51.685688 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.685728 kubelet[3426]: E1013 06:25:51.685702 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.685916 kubelet[3426]: E1013 06:25:51.685904 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.685960 kubelet[3426]: W1013 06:25:51.685916 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.685960 kubelet[3426]: E1013 06:25:51.685931 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.686137 kubelet[3426]: E1013 06:25:51.686124 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.686174 kubelet[3426]: W1013 06:25:51.686138 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.686174 kubelet[3426]: E1013 06:25:51.686153 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.686334 kubelet[3426]: E1013 06:25:51.686324 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.686382 kubelet[3426]: W1013 06:25:51.686335 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.686382 kubelet[3426]: E1013 06:25:51.686345 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.686522 kubelet[3426]: E1013 06:25:51.686510 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.686554 kubelet[3426]: W1013 06:25:51.686523 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.686554 kubelet[3426]: E1013 06:25:51.686537 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.686773 kubelet[3426]: E1013 06:25:51.686760 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.686806 kubelet[3426]: W1013 06:25:51.686776 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.686806 kubelet[3426]: E1013 06:25:51.686791 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.687009 kubelet[3426]: E1013 06:25:51.686980 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.687059 kubelet[3426]: W1013 06:25:51.687011 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.687059 kubelet[3426]: E1013 06:25:51.687029 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.687232 kubelet[3426]: E1013 06:25:51.687219 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.687277 kubelet[3426]: W1013 06:25:51.687233 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.687277 kubelet[3426]: E1013 06:25:51.687256 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.687449 kubelet[3426]: E1013 06:25:51.687436 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.687489 kubelet[3426]: W1013 06:25:51.687450 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.687489 kubelet[3426]: E1013 06:25:51.687465 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:51.687674 kubelet[3426]: E1013 06:25:51.687662 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.687711 kubelet[3426]: W1013 06:25:51.687675 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.687711 kubelet[3426]: E1013 06:25:51.687687 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.687853 kubelet[3426]: E1013 06:25:51.687842 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.687889 kubelet[3426]: W1013 06:25:51.687853 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.687889 kubelet[3426]: E1013 06:25:51.687862 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:51.692631 kubelet[3426]: E1013 06:25:51.692582 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:51.692631 kubelet[3426]: W1013 06:25:51.692597 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:51.692631 kubelet[3426]: E1013 06:25:51.692612 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:52.801915 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount799748288.mount: Deactivated successfully. 
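The burst of kubelet errors above repeats one failure: the FlexVolume prober finds the plugin directory nodeagent~uds under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, but the uds driver binary it should execute is missing, so the [init] call returns empty output and decoding that empty output as JSON fails with "unexpected end of JSON input". Below is a minimal Go sketch of that failure mode; the DriverStatus shape and the sample success reply are illustrative, not the kubelet's exact types.

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // DriverStatus approximates the JSON reply a FlexVolume driver prints for
    // the "init" call; the field names here are illustrative.
    type DriverStatus struct {
        Status       string          `json:"status"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
        // The missing uds binary produces no output, so the kubelet ends up
        // decoding an empty payload, which is exactly the logged error.
        var st DriverStatus
        fmt.Println(json.Unmarshal([]byte(""), &st)) // unexpected end of JSON input

        // A working driver would instead print something like:
        //   {"status":"Success","capabilities":{"attach":false}}
        ok := []byte(`{"status":"Success","capabilities":{"attach":false}}`)
        if err := json.Unmarshal(ok, &st); err == nil {
            fmt.Println(st.Status, st.Capabilities["attach"]) // Success false
        }
    }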
Oct 13 06:25:53.551140 containerd[2022]: time="2025-10-13T06:25:53.551115882Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:53.551399 containerd[2022]: time="2025-10-13T06:25:53.551309986Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Oct 13 06:25:53.551726 containerd[2022]: time="2025-10-13T06:25:53.551716470Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:53.552526 containerd[2022]: time="2025-10-13T06:25:53.552515430Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:53.552947 containerd[2022]: time="2025-10-13T06:25:53.552906101Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.334282434s" Oct 13 06:25:53.552947 containerd[2022]: time="2025-10-13T06:25:53.552927505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Oct 13 06:25:53.553374 containerd[2022]: time="2025-10-13T06:25:53.553362954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Oct 13 06:25:53.557224 containerd[2022]: time="2025-10-13T06:25:53.557208503Z" level=info msg="CreateContainer within sandbox \"c9d2c6ca21dd39c454f4f16fd7a70121d5c9438e27d0d4bcd8b86c1ddcee611d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Oct 13 06:25:53.559823 containerd[2022]: time="2025-10-13T06:25:53.559812224Z" level=info msg="Container c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:25:53.562364 containerd[2022]: time="2025-10-13T06:25:53.562351306Z" level=info msg="CreateContainer within sandbox \"c9d2c6ca21dd39c454f4f16fd7a70121d5c9438e27d0d4bcd8b86c1ddcee611d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac\"" Oct 13 06:25:53.562521 containerd[2022]: time="2025-10-13T06:25:53.562511719Z" level=info msg="StartContainer for \"c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac\"" Oct 13 06:25:53.563030 containerd[2022]: time="2025-10-13T06:25:53.563019604Z" level=info msg="connecting to shim c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac" address="unix:///run/containerd/s/8ce3338e5d6c13a86cdbcac5bd17b7f82cf3195a27e33536155eec35d1e30382" protocol=ttrpc version=3 Oct 13 06:25:53.570988 kubelet[3426]: E1013 06:25:53.570969 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x788d" podUID="da2b5110-cb9c-4d9e-a520-e14947dc335c" Oct 13 06:25:53.579516 systemd[1]: Started 
cri-containerd-c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac.scope - libcontainer container c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac. Oct 13 06:25:53.606856 containerd[2022]: time="2025-10-13T06:25:53.606831733Z" level=info msg="StartContainer for \"c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac\" returns successfully" Oct 13 06:25:53.632619 kubelet[3426]: I1013 06:25:53.632551 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6fc8898dc4-zcvcv" podStartSLOduration=1.297710585 podStartE2EDuration="3.632535217s" podCreationTimestamp="2025-10-13 06:25:50 +0000 UTC" firstStartedPulling="2025-10-13 06:25:51.218466132 +0000 UTC m=+16.712819271" lastFinishedPulling="2025-10-13 06:25:53.553290762 +0000 UTC m=+19.047643903" observedRunningTime="2025-10-13 06:25:53.630904515 +0000 UTC m=+19.125257657" watchObservedRunningTime="2025-10-13 06:25:53.632535217 +0000 UTC m=+19.126888355" Oct 13 06:25:53.676039 kubelet[3426]: E1013 06:25:53.675947 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.676039 kubelet[3426]: W1013 06:25:53.675995 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.676039 kubelet[3426]: E1013 06:25:53.676039 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.676551 kubelet[3426]: E1013 06:25:53.676502 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.676551 kubelet[3426]: W1013 06:25:53.676534 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.676765 kubelet[3426]: E1013 06:25:53.676567 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.677090 kubelet[3426]: E1013 06:25:53.677043 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.677090 kubelet[3426]: W1013 06:25:53.677077 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.677430 kubelet[3426]: E1013 06:25:53.677108 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:53.677707 kubelet[3426]: E1013 06:25:53.677659 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.677707 kubelet[3426]: W1013 06:25:53.677692 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.677997 kubelet[3426]: E1013 06:25:53.677723 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.678175 kubelet[3426]: E1013 06:25:53.678142 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.678175 kubelet[3426]: W1013 06:25:53.678169 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.678468 kubelet[3426]: E1013 06:25:53.678196 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.678731 kubelet[3426]: E1013 06:25:53.678684 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.678731 kubelet[3426]: W1013 06:25:53.678717 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.679033 kubelet[3426]: E1013 06:25:53.678749 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.679232 kubelet[3426]: E1013 06:25:53.679201 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.679232 kubelet[3426]: W1013 06:25:53.679249 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.679498 kubelet[3426]: E1013 06:25:53.679293 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.679734 kubelet[3426]: E1013 06:25:53.679692 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.679734 kubelet[3426]: W1013 06:25:53.679715 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.679734 kubelet[3426]: E1013 06:25:53.679739 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:53.680222 kubelet[3426]: E1013 06:25:53.680193 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.680222 kubelet[3426]: W1013 06:25:53.680219 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.680222 kubelet[3426]: E1013 06:25:53.680279 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.680771 kubelet[3426]: E1013 06:25:53.680639 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.680771 kubelet[3426]: W1013 06:25:53.680659 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.680771 kubelet[3426]: E1013 06:25:53.680682 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.681063 kubelet[3426]: E1013 06:25:53.681029 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.681063 kubelet[3426]: W1013 06:25:53.681059 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.681306 kubelet[3426]: E1013 06:25:53.681080 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.681447 kubelet[3426]: E1013 06:25:53.681410 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.681447 kubelet[3426]: W1013 06:25:53.681430 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.681644 kubelet[3426]: E1013 06:25:53.681451 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.681852 kubelet[3426]: E1013 06:25:53.681825 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.681852 kubelet[3426]: W1013 06:25:53.681848 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.682065 kubelet[3426]: E1013 06:25:53.681874 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:53.682316 kubelet[3426]: E1013 06:25:53.682289 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.682316 kubelet[3426]: W1013 06:25:53.682312 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.682547 kubelet[3426]: E1013 06:25:53.682335 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.682759 kubelet[3426]: E1013 06:25:53.682732 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.682759 kubelet[3426]: W1013 06:25:53.682756 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.682948 kubelet[3426]: E1013 06:25:53.682779 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.700373 kubelet[3426]: E1013 06:25:53.700319 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.700373 kubelet[3426]: W1013 06:25:53.700357 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.700757 kubelet[3426]: E1013 06:25:53.700404 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.701019 kubelet[3426]: E1013 06:25:53.700971 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.701019 kubelet[3426]: W1013 06:25:53.701007 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.701353 kubelet[3426]: E1013 06:25:53.701048 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.701645 kubelet[3426]: E1013 06:25:53.701600 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.701645 kubelet[3426]: W1013 06:25:53.701630 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.701921 kubelet[3426]: E1013 06:25:53.701670 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:53.702413 kubelet[3426]: E1013 06:25:53.702315 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.702413 kubelet[3426]: W1013 06:25:53.702369 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.702712 kubelet[3426]: E1013 06:25:53.702446 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.702984 kubelet[3426]: E1013 06:25:53.702935 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.702984 kubelet[3426]: W1013 06:25:53.702971 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.703294 kubelet[3426]: E1013 06:25:53.703008 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.703575 kubelet[3426]: E1013 06:25:53.703540 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.703575 kubelet[3426]: W1013 06:25:53.703567 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.703788 kubelet[3426]: E1013 06:25:53.703596 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.704116 kubelet[3426]: E1013 06:25:53.704081 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.704231 kubelet[3426]: W1013 06:25:53.704116 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.704231 kubelet[3426]: E1013 06:25:53.704149 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.704715 kubelet[3426]: E1013 06:25:53.704682 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.704833 kubelet[3426]: W1013 06:25:53.704718 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.704833 kubelet[3426]: E1013 06:25:53.704750 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:53.705262 kubelet[3426]: E1013 06:25:53.705210 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.705398 kubelet[3426]: W1013 06:25:53.705280 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.705398 kubelet[3426]: E1013 06:25:53.705316 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.705825 kubelet[3426]: E1013 06:25:53.705779 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.705825 kubelet[3426]: W1013 06:25:53.705814 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.706053 kubelet[3426]: E1013 06:25:53.705845 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.706344 kubelet[3426]: E1013 06:25:53.706308 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.706344 kubelet[3426]: W1013 06:25:53.706333 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.706595 kubelet[3426]: E1013 06:25:53.706359 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.706832 kubelet[3426]: E1013 06:25:53.706786 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.706832 kubelet[3426]: W1013 06:25:53.706821 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.707058 kubelet[3426]: E1013 06:25:53.706852 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.707323 kubelet[3426]: E1013 06:25:53.707261 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.707323 kubelet[3426]: W1013 06:25:53.707303 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.707564 kubelet[3426]: E1013 06:25:53.707333 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:53.707896 kubelet[3426]: E1013 06:25:53.707830 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.707896 kubelet[3426]: W1013 06:25:53.707876 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.708171 kubelet[3426]: E1013 06:25:53.707923 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.708488 kubelet[3426]: E1013 06:25:53.708444 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.708488 kubelet[3426]: W1013 06:25:53.708470 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.708488 kubelet[3426]: E1013 06:25:53.708498 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.709012 kubelet[3426]: E1013 06:25:53.708955 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.709012 kubelet[3426]: W1013 06:25:53.708979 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.709012 kubelet[3426]: E1013 06:25:53.709004 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.709683 kubelet[3426]: E1013 06:25:53.709642 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.709862 kubelet[3426]: W1013 06:25:53.709681 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.709862 kubelet[3426]: E1013 06:25:53.709715 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:53.710193 kubelet[3426]: E1013 06:25:53.710156 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:53.710193 kubelet[3426]: W1013 06:25:53.710185 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:53.710544 kubelet[3426]: E1013 06:25:53.710213 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:54.626648 kubelet[3426]: I1013 06:25:54.626604 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:25:54.689304 kubelet[3426]: E1013 06:25:54.689201 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.689304 kubelet[3426]: W1013 06:25:54.689278 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.689775 kubelet[3426]: E1013 06:25:54.689344 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.689977 kubelet[3426]: E1013 06:25:54.689894 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.689977 kubelet[3426]: W1013 06:25:54.689930 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.689977 kubelet[3426]: E1013 06:25:54.689966 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.690540 kubelet[3426]: E1013 06:25:54.690490 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.690540 kubelet[3426]: W1013 06:25:54.690517 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.690751 kubelet[3426]: E1013 06:25:54.690545 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.691083 kubelet[3426]: E1013 06:25:54.691002 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.691083 kubelet[3426]: W1013 06:25:54.691030 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.691083 kubelet[3426]: E1013 06:25:54.691057 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.691548 kubelet[3426]: E1013 06:25:54.691503 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.691548 kubelet[3426]: W1013 06:25:54.691528 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.691744 kubelet[3426]: E1013 06:25:54.691551 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:54.692050 kubelet[3426]: E1013 06:25:54.691984 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.692050 kubelet[3426]: W1013 06:25:54.692011 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.692050 kubelet[3426]: E1013 06:25:54.692034 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.692541 kubelet[3426]: E1013 06:25:54.692442 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.692541 kubelet[3426]: W1013 06:25:54.692466 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.692541 kubelet[3426]: E1013 06:25:54.692490 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.692995 kubelet[3426]: E1013 06:25:54.692935 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.692995 kubelet[3426]: W1013 06:25:54.692961 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.692995 kubelet[3426]: E1013 06:25:54.692986 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.693509 kubelet[3426]: E1013 06:25:54.693457 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.693509 kubelet[3426]: W1013 06:25:54.693485 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.693509 kubelet[3426]: E1013 06:25:54.693511 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.693968 kubelet[3426]: E1013 06:25:54.693907 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.693968 kubelet[3426]: W1013 06:25:54.693935 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.693968 kubelet[3426]: E1013 06:25:54.693963 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:54.694442 kubelet[3426]: E1013 06:25:54.694413 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.694442 kubelet[3426]: W1013 06:25:54.694438 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.694657 kubelet[3426]: E1013 06:25:54.694463 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.694872 kubelet[3426]: E1013 06:25:54.694843 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.694971 kubelet[3426]: W1013 06:25:54.694875 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.694971 kubelet[3426]: E1013 06:25:54.694899 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.695387 kubelet[3426]: E1013 06:25:54.695359 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.695387 kubelet[3426]: W1013 06:25:54.695386 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.695617 kubelet[3426]: E1013 06:25:54.695409 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.695880 kubelet[3426]: E1013 06:25:54.695851 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.695880 kubelet[3426]: W1013 06:25:54.695878 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.696098 kubelet[3426]: E1013 06:25:54.695903 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.696364 kubelet[3426]: E1013 06:25:54.696316 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.696364 kubelet[3426]: W1013 06:25:54.696342 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.696364 kubelet[3426]: E1013 06:25:54.696366 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:54.710070 kubelet[3426]: E1013 06:25:54.709976 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.710070 kubelet[3426]: W1013 06:25:54.710013 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.710070 kubelet[3426]: E1013 06:25:54.710047 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.710730 kubelet[3426]: E1013 06:25:54.710652 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.710730 kubelet[3426]: W1013 06:25:54.710687 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.710730 kubelet[3426]: E1013 06:25:54.710722 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.711362 kubelet[3426]: E1013 06:25:54.711312 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.711362 kubelet[3426]: W1013 06:25:54.711343 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.711362 kubelet[3426]: E1013 06:25:54.711373 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.711999 kubelet[3426]: E1013 06:25:54.711955 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.711999 kubelet[3426]: W1013 06:25:54.711993 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.712219 kubelet[3426]: E1013 06:25:54.712026 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.712686 kubelet[3426]: E1013 06:25:54.712646 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.712824 kubelet[3426]: W1013 06:25:54.712683 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.712824 kubelet[3426]: E1013 06:25:54.712718 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:54.713268 kubelet[3426]: E1013 06:25:54.713210 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.713402 kubelet[3426]: W1013 06:25:54.713272 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.713402 kubelet[3426]: E1013 06:25:54.713311 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.713793 kubelet[3426]: E1013 06:25:54.713759 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.713793 kubelet[3426]: W1013 06:25:54.713785 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.714011 kubelet[3426]: E1013 06:25:54.713810 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.714221 kubelet[3426]: E1013 06:25:54.714192 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.714350 kubelet[3426]: W1013 06:25:54.714223 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.714350 kubelet[3426]: E1013 06:25:54.714275 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.714781 kubelet[3426]: E1013 06:25:54.714711 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.714781 kubelet[3426]: W1013 06:25:54.714734 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.714781 kubelet[3426]: E1013 06:25:54.714758 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.715191 kubelet[3426]: E1013 06:25:54.715161 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.715191 kubelet[3426]: W1013 06:25:54.715186 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.715456 kubelet[3426]: E1013 06:25:54.715209 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:54.715719 kubelet[3426]: E1013 06:25:54.715670 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.715719 kubelet[3426]: W1013 06:25:54.715694 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.715940 kubelet[3426]: E1013 06:25:54.715717 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.716190 kubelet[3426]: E1013 06:25:54.716156 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.716344 kubelet[3426]: W1013 06:25:54.716195 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.716344 kubelet[3426]: E1013 06:25:54.716263 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.716685 kubelet[3426]: E1013 06:25:54.716657 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.716685 kubelet[3426]: W1013 06:25:54.716683 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.716959 kubelet[3426]: E1013 06:25:54.716707 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.717232 kubelet[3426]: E1013 06:25:54.717197 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.717398 kubelet[3426]: W1013 06:25:54.717229 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.717398 kubelet[3426]: E1013 06:25:54.717296 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.717793 kubelet[3426]: E1013 06:25:54.717765 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.717793 kubelet[3426]: W1013 06:25:54.717790 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.718104 kubelet[3426]: E1013 06:25:54.717815 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:54.718630 kubelet[3426]: E1013 06:25:54.718585 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.718630 kubelet[3426]: W1013 06:25:54.718629 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.718916 kubelet[3426]: E1013 06:25:54.718666 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.719280 kubelet[3426]: E1013 06:25:54.719218 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.719443 kubelet[3426]: W1013 06:25:54.719279 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.719443 kubelet[3426]: E1013 06:25:54.719314 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:25:54.719818 kubelet[3426]: E1013 06:25:54.719785 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:25:54.719818 kubelet[3426]: W1013 06:25:54.719815 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:25:54.720013 kubelet[3426]: E1013 06:25:54.719846 3426 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:25:55.183521 containerd[2022]: time="2025-10-13T06:25:55.183465798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:55.183714 containerd[2022]: time="2025-10-13T06:25:55.183677660Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Oct 13 06:25:55.184078 containerd[2022]: time="2025-10-13T06:25:55.184042184Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:55.184827 containerd[2022]: time="2025-10-13T06:25:55.184813284Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:55.185194 containerd[2022]: time="2025-10-13T06:25:55.185159605Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.631782757s" Oct 13 06:25:55.185194 containerd[2022]: time="2025-10-13T06:25:55.185172697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Oct 13 06:25:55.186609 containerd[2022]: time="2025-10-13T06:25:55.186597115Z" level=info msg="CreateContainer within sandbox \"a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Oct 13 06:25:55.189338 containerd[2022]: time="2025-10-13T06:25:55.189323299Z" level=info msg="Container 119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:25:55.192729 containerd[2022]: time="2025-10-13T06:25:55.192688349Z" level=info msg="CreateContainer within sandbox \"a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4\"" Oct 13 06:25:55.192966 containerd[2022]: time="2025-10-13T06:25:55.192930391Z" level=info msg="StartContainer for \"119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4\"" Oct 13 06:25:55.193631 containerd[2022]: time="2025-10-13T06:25:55.193619518Z" level=info msg="connecting to shim 119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4" address="unix:///run/containerd/s/e4584b9a126e89f3b0b7839934dd592e2b8acaeae7400dd8feccecef417dbd9f" protocol=ttrpc version=3 Oct 13 06:25:55.216471 systemd[1]: Started cri-containerd-119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4.scope - libcontainer container 119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4. 
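The sequence above (ImageCreate events, "Pulled image ... in 1.631782757s", CreateContainer within the existing sandbox, then "connecting to shim ... protocol=ttrpc version=3") is containerd's CRI plugin fetching and launching the flexvol-driver container. The same pull can be reproduced against the daemon with the containerd Go client; this is a minimal sketch, assuming the default /run/containerd/containerd.sock socket and the k8s.io namespace used for Kubernetes-managed images (neither the socket path nor the namespace is printed in the log itself).

    package main

    import (
        "context"
        "fmt"
        "log"

        containerd "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Connect to containerd (default socket path assumed).
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // Kubernetes-managed images live in the "k8s.io" namespace.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        // Pull and unpack the image the log shows being fetched.
        img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3", containerd.WithPullUnpack)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("pulled:", img.Name())
    }

After a pull like this, ctr -n k8s.io images ls would list the same repo tag and digest that the ImageCreate events above record.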
Oct 13 06:25:55.236941 containerd[2022]: time="2025-10-13T06:25:55.236913250Z" level=info msg="StartContainer for \"119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4\" returns successfully" Oct 13 06:25:55.241153 systemd[1]: cri-containerd-119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4.scope: Deactivated successfully. Oct 13 06:25:55.242180 containerd[2022]: time="2025-10-13T06:25:55.242136029Z" level=info msg="received exit event container_id:\"119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4\" id:\"119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4\" pid:4299 exited_at:{seconds:1760336755 nanos:241908945}" Oct 13 06:25:55.242230 containerd[2022]: time="2025-10-13T06:25:55.242203061Z" level=info msg="TaskExit event in podsandbox handler container_id:\"119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4\" id:\"119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4\" pid:4299 exited_at:{seconds:1760336755 nanos:241908945}" Oct 13 06:25:55.254392 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4-rootfs.mount: Deactivated successfully. Oct 13 06:25:55.571379 kubelet[3426]: E1013 06:25:55.571094 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x788d" podUID="da2b5110-cb9c-4d9e-a520-e14947dc335c" Oct 13 06:25:56.642894 containerd[2022]: time="2025-10-13T06:25:56.642835675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Oct 13 06:25:57.571552 kubelet[3426]: E1013 06:25:57.571416 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x788d" podUID="da2b5110-cb9c-4d9e-a520-e14947dc335c" Oct 13 06:25:59.571157 kubelet[3426]: E1013 06:25:59.571128 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x788d" podUID="da2b5110-cb9c-4d9e-a520-e14947dc335c" Oct 13 06:25:59.916042 containerd[2022]: time="2025-10-13T06:25:59.915990196Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:59.916254 containerd[2022]: time="2025-10-13T06:25:59.916180196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Oct 13 06:25:59.916591 containerd[2022]: time="2025-10-13T06:25:59.916550412Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:59.919172 containerd[2022]: time="2025-10-13T06:25:59.919147861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:25:59.919579 containerd[2022]: time="2025-10-13T06:25:59.919565980Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.276677815s" Oct 13 06:25:59.919608 containerd[2022]: time="2025-10-13T06:25:59.919584120Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Oct 13 06:25:59.921128 containerd[2022]: time="2025-10-13T06:25:59.921087420Z" level=info msg="CreateContainer within sandbox \"a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Oct 13 06:25:59.924215 containerd[2022]: time="2025-10-13T06:25:59.924199518Z" level=info msg="Container 2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:25:59.928250 containerd[2022]: time="2025-10-13T06:25:59.928204076Z" level=info msg="CreateContainer within sandbox \"a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5\"" Oct 13 06:25:59.928490 containerd[2022]: time="2025-10-13T06:25:59.928449834Z" level=info msg="StartContainer for \"2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5\"" Oct 13 06:25:59.929246 containerd[2022]: time="2025-10-13T06:25:59.929199974Z" level=info msg="connecting to shim 2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5" address="unix:///run/containerd/s/e4584b9a126e89f3b0b7839934dd592e2b8acaeae7400dd8feccecef417dbd9f" protocol=ttrpc version=3 Oct 13 06:25:59.958698 systemd[1]: Started cri-containerd-2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5.scope - libcontainer container 2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5. Oct 13 06:26:00.018777 containerd[2022]: time="2025-10-13T06:26:00.018716773Z" level=info msg="StartContainer for \"2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5\" returns successfully" Oct 13 06:26:00.566535 containerd[2022]: time="2025-10-13T06:26:00.566502528Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Oct 13 06:26:00.567673 systemd[1]: cri-containerd-2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5.scope: Deactivated successfully. Oct 13 06:26:00.567859 systemd[1]: cri-containerd-2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5.scope: Consumed 421ms CPU time, 195.6M memory peak, 171.3M written to disk. 
Oct 13 06:26:00.568228 containerd[2022]: time="2025-10-13T06:26:00.568211949Z" level=info msg="received exit event container_id:\"2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5\" id:\"2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5\" pid:4357 exited_at:{seconds:1760336760 nanos:568098431}" Oct 13 06:26:00.568331 containerd[2022]: time="2025-10-13T06:26:00.568290523Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5\" id:\"2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5\" pid:4357 exited_at:{seconds:1760336760 nanos:568098431}" Oct 13 06:26:00.580472 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5-rootfs.mount: Deactivated successfully. Oct 13 06:26:00.631376 kubelet[3426]: I1013 06:26:00.631305 3426 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Oct 13 06:26:00.737836 systemd[1]: Created slice kubepods-burstable-poddb0fa69d_27bf_470f_96a6_ab3a53daf187.slice - libcontainer container kubepods-burstable-poddb0fa69d_27bf_470f_96a6_ab3a53daf187.slice. Oct 13 06:26:00.758223 kubelet[3426]: I1013 06:26:00.757477 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjpt\" (UniqueName: \"kubernetes.io/projected/db0fa69d-27bf-470f-96a6-ab3a53daf187-kube-api-access-tsjpt\") pod \"coredns-66bc5c9577-6x5pz\" (UID: \"db0fa69d-27bf-470f-96a6-ab3a53daf187\") " pod="kube-system/coredns-66bc5c9577-6x5pz" Oct 13 06:26:00.758223 kubelet[3426]: I1013 06:26:00.757547 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db0fa69d-27bf-470f-96a6-ab3a53daf187-config-volume\") pod \"coredns-66bc5c9577-6x5pz\" (UID: \"db0fa69d-27bf-470f-96a6-ab3a53daf187\") " pod="kube-system/coredns-66bc5c9577-6x5pz" Oct 13 06:26:00.802875 systemd[1]: Created slice kubepods-burstable-pod4c26904b_e9da_403c_b682_33bb28f5522d.slice - libcontainer container kubepods-burstable-pod4c26904b_e9da_403c_b682_33bb28f5522d.slice. Oct 13 06:26:00.821172 systemd[1]: Created slice kubepods-besteffort-pod78446806_9e35_4cb3_9095_96c30617aa27.slice - libcontainer container kubepods-besteffort-pod78446806_9e35_4cb3_9095_96c30617aa27.slice. 
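A note on the kubepods-*.slice names systemd reports creating above: kubelet derives them from the pod's QoS class and UID, replacing the dashes in the UID with underscores because "-" is the level separator in systemd slice names. A small illustrative sketch; the helper name is invented and it only covers the burstable/besteffort shapes seen in this log:

    def kubepods_slice_name(pod_uid: str, qos_class: str) -> str:
        # kubelet's systemd cgroup driver escapes the UID's dashes to underscores,
        # since '-' denotes slice hierarchy in systemd unit names.
        return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"

    print(kubepods_slice_name("db0fa69d-27bf-470f-96a6-ab3a53daf187", "burstable"))
    # -> kubepods-burstable-poddb0fa69d_27bf_470f_96a6_ab3a53daf187.slice (as logged above)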
Oct 13 06:26:00.858869 kubelet[3426]: I1013 06:26:00.858793 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltpnh\" (UniqueName: \"kubernetes.io/projected/4c26904b-e9da-403c-b682-33bb28f5522d-kube-api-access-ltpnh\") pod \"coredns-66bc5c9577-f547n\" (UID: \"4c26904b-e9da-403c-b682-33bb28f5522d\") " pod="kube-system/coredns-66bc5c9577-f547n" Oct 13 06:26:00.858869 kubelet[3426]: I1013 06:26:00.858848 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqq96\" (UniqueName: \"kubernetes.io/projected/78446806-9e35-4cb3-9095-96c30617aa27-kube-api-access-rqq96\") pod \"calico-kube-controllers-5dd984db4b-4ks8k\" (UID: \"78446806-9e35-4cb3-9095-96c30617aa27\") " pod="calico-system/calico-kube-controllers-5dd984db4b-4ks8k" Oct 13 06:26:00.859062 kubelet[3426]: I1013 06:26:00.858932 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c26904b-e9da-403c-b682-33bb28f5522d-config-volume\") pod \"coredns-66bc5c9577-f547n\" (UID: \"4c26904b-e9da-403c-b682-33bb28f5522d\") " pod="kube-system/coredns-66bc5c9577-f547n" Oct 13 06:26:00.859062 kubelet[3426]: I1013 06:26:00.858967 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78446806-9e35-4cb3-9095-96c30617aa27-tigera-ca-bundle\") pod \"calico-kube-controllers-5dd984db4b-4ks8k\" (UID: \"78446806-9e35-4cb3-9095-96c30617aa27\") " pod="calico-system/calico-kube-controllers-5dd984db4b-4ks8k" Oct 13 06:26:00.923811 systemd[1]: Created slice kubepods-besteffort-podca755307_5760_491a_b1b1_672d441db091.slice - libcontainer container kubepods-besteffort-podca755307_5760_491a_b1b1_672d441db091.slice. Oct 13 06:26:00.959424 kubelet[3426]: I1013 06:26:00.959353 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2298\" (UniqueName: \"kubernetes.io/projected/ca755307-5760-491a-b1b1-672d441db091-kube-api-access-s2298\") pod \"calico-apiserver-7db8fc6667-dkdcr\" (UID: \"ca755307-5760-491a-b1b1-672d441db091\") " pod="calico-apiserver/calico-apiserver-7db8fc6667-dkdcr" Oct 13 06:26:00.959424 kubelet[3426]: I1013 06:26:00.959418 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ca755307-5760-491a-b1b1-672d441db091-calico-apiserver-certs\") pod \"calico-apiserver-7db8fc6667-dkdcr\" (UID: \"ca755307-5760-491a-b1b1-672d441db091\") " pod="calico-apiserver/calico-apiserver-7db8fc6667-dkdcr" Oct 13 06:26:00.970177 systemd[1]: Created slice kubepods-besteffort-podaa7806ae_9b11_43b8_9854_d8e21bacfc42.slice - libcontainer container kubepods-besteffort-podaa7806ae_9b11_43b8_9854_d8e21bacfc42.slice. Oct 13 06:26:00.973398 systemd[1]: Created slice kubepods-besteffort-pod8e0f36a1_1ce5_4d0e_992c_16a055d75e4c.slice - libcontainer container kubepods-besteffort-pod8e0f36a1_1ce5_4d0e_992c_16a055d75e4c.slice. Oct 13 06:26:00.976587 systemd[1]: Created slice kubepods-besteffort-pod405a127c_4dd1_41a6_bf45_575476aa1beb.slice - libcontainer container kubepods-besteffort-pod405a127c_4dd1_41a6_bf45_575476aa1beb.slice. 
Oct 13 06:26:01.042880 containerd[2022]: time="2025-10-13T06:26:01.042838739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6x5pz,Uid:db0fa69d-27bf-470f-96a6-ab3a53daf187,Namespace:kube-system,Attempt:0,}" Oct 13 06:26:01.060473 kubelet[3426]: I1013 06:26:01.060401 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e0f36a1-1ce5-4d0e-992c-16a055d75e4c-goldmane-ca-bundle\") pod \"goldmane-854f97d977-p9fs4\" (UID: \"8e0f36a1-1ce5-4d0e-992c-16a055d75e4c\") " pod="calico-system/goldmane-854f97d977-p9fs4" Oct 13 06:26:01.060473 kubelet[3426]: I1013 06:26:01.060427 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8e0f36a1-1ce5-4d0e-992c-16a055d75e4c-goldmane-key-pair\") pod \"goldmane-854f97d977-p9fs4\" (UID: \"8e0f36a1-1ce5-4d0e-992c-16a055d75e4c\") " pod="calico-system/goldmane-854f97d977-p9fs4" Oct 13 06:26:01.060473 kubelet[3426]: I1013 06:26:01.060438 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvsf\" (UniqueName: \"kubernetes.io/projected/405a127c-4dd1-41a6-bf45-575476aa1beb-kube-api-access-nmvsf\") pod \"whisker-d74698464-85f2b\" (UID: \"405a127c-4dd1-41a6-bf45-575476aa1beb\") " pod="calico-system/whisker-d74698464-85f2b" Oct 13 06:26:01.060637 kubelet[3426]: I1013 06:26:01.060497 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4jt9\" (UniqueName: \"kubernetes.io/projected/8e0f36a1-1ce5-4d0e-992c-16a055d75e4c-kube-api-access-q4jt9\") pod \"goldmane-854f97d977-p9fs4\" (UID: \"8e0f36a1-1ce5-4d0e-992c-16a055d75e4c\") " pod="calico-system/goldmane-854f97d977-p9fs4" Oct 13 06:26:01.060637 kubelet[3426]: I1013 06:26:01.060526 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/aa7806ae-9b11-43b8-9854-d8e21bacfc42-calico-apiserver-certs\") pod \"calico-apiserver-7db8fc6667-8x69c\" (UID: \"aa7806ae-9b11-43b8-9854-d8e21bacfc42\") " pod="calico-apiserver/calico-apiserver-7db8fc6667-8x69c" Oct 13 06:26:01.060637 kubelet[3426]: I1013 06:26:01.060535 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dhb\" (UniqueName: \"kubernetes.io/projected/aa7806ae-9b11-43b8-9854-d8e21bacfc42-kube-api-access-z8dhb\") pod \"calico-apiserver-7db8fc6667-8x69c\" (UID: \"aa7806ae-9b11-43b8-9854-d8e21bacfc42\") " pod="calico-apiserver/calico-apiserver-7db8fc6667-8x69c" Oct 13 06:26:01.060637 kubelet[3426]: I1013 06:26:01.060552 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e0f36a1-1ce5-4d0e-992c-16a055d75e4c-config\") pod \"goldmane-854f97d977-p9fs4\" (UID: \"8e0f36a1-1ce5-4d0e-992c-16a055d75e4c\") " pod="calico-system/goldmane-854f97d977-p9fs4" Oct 13 06:26:01.060637 kubelet[3426]: I1013 06:26:01.060581 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/405a127c-4dd1-41a6-bf45-575476aa1beb-whisker-ca-bundle\") pod \"whisker-d74698464-85f2b\" (UID: \"405a127c-4dd1-41a6-bf45-575476aa1beb\") " pod="calico-system/whisker-d74698464-85f2b" Oct 
13 06:26:01.060728 kubelet[3426]: I1013 06:26:01.060629 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/405a127c-4dd1-41a6-bf45-575476aa1beb-whisker-backend-key-pair\") pod \"whisker-d74698464-85f2b\" (UID: \"405a127c-4dd1-41a6-bf45-575476aa1beb\") " pod="calico-system/whisker-d74698464-85f2b" Oct 13 06:26:01.070050 containerd[2022]: time="2025-10-13T06:26:01.070024486Z" level=error msg="Failed to destroy network for sandbox \"886953519c66b7743ad1f4df5ee63c339eb47e76ea6764fe63c27e2cbd2c7d60\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.070699 containerd[2022]: time="2025-10-13T06:26:01.070607128Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6x5pz,Uid:db0fa69d-27bf-470f-96a6-ab3a53daf187,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"886953519c66b7743ad1f4df5ee63c339eb47e76ea6764fe63c27e2cbd2c7d60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.070891 kubelet[3426]: E1013 06:26:01.070838 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"886953519c66b7743ad1f4df5ee63c339eb47e76ea6764fe63c27e2cbd2c7d60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.070891 kubelet[3426]: E1013 06:26:01.070877 3426 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"886953519c66b7743ad1f4df5ee63c339eb47e76ea6764fe63c27e2cbd2c7d60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-6x5pz" Oct 13 06:26:01.070891 kubelet[3426]: E1013 06:26:01.070889 3426 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"886953519c66b7743ad1f4df5ee63c339eb47e76ea6764fe63c27e2cbd2c7d60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-6x5pz" Oct 13 06:26:01.070988 kubelet[3426]: E1013 06:26:01.070939 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-6x5pz_kube-system(db0fa69d-27bf-470f-96a6-ab3a53daf187)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-6x5pz_kube-system(db0fa69d-27bf-470f-96a6-ab3a53daf187)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"886953519c66b7743ad1f4df5ee63c339eb47e76ea6764fe63c27e2cbd2c7d60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-6x5pz" 
podUID="db0fa69d-27bf-470f-96a6-ab3a53daf187" Oct 13 06:26:01.071408 systemd[1]: run-netns-cni\x2dbc36294e\x2dcdc3\x2d510f\x2de3e1\x2dd8cd50d1e2d4.mount: Deactivated successfully. Oct 13 06:26:01.117953 containerd[2022]: time="2025-10-13T06:26:01.117899850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-f547n,Uid:4c26904b-e9da-403c-b682-33bb28f5522d,Namespace:kube-system,Attempt:0,}" Oct 13 06:26:01.124170 containerd[2022]: time="2025-10-13T06:26:01.124140877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5dd984db4b-4ks8k,Uid:78446806-9e35-4cb3-9095-96c30617aa27,Namespace:calico-system,Attempt:0,}" Oct 13 06:26:01.143508 containerd[2022]: time="2025-10-13T06:26:01.143453079Z" level=error msg="Failed to destroy network for sandbox \"98abef362cbca4176ba4f3407389000eddecafe9292f535dbeea245cab7ac4dc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.143955 containerd[2022]: time="2025-10-13T06:26:01.143936359Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-f547n,Uid:4c26904b-e9da-403c-b682-33bb28f5522d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"98abef362cbca4176ba4f3407389000eddecafe9292f535dbeea245cab7ac4dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.144121 kubelet[3426]: E1013 06:26:01.144097 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98abef362cbca4176ba4f3407389000eddecafe9292f535dbeea245cab7ac4dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.144155 kubelet[3426]: E1013 06:26:01.144136 3426 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98abef362cbca4176ba4f3407389000eddecafe9292f535dbeea245cab7ac4dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-f547n" Oct 13 06:26:01.144155 kubelet[3426]: E1013 06:26:01.144148 3426 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98abef362cbca4176ba4f3407389000eddecafe9292f535dbeea245cab7ac4dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-f547n" Oct 13 06:26:01.144200 kubelet[3426]: E1013 06:26:01.144183 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-f547n_kube-system(4c26904b-e9da-403c-b682-33bb28f5522d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-f547n_kube-system(4c26904b-e9da-403c-b682-33bb28f5522d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"98abef362cbca4176ba4f3407389000eddecafe9292f535dbeea245cab7ac4dc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-f547n" podUID="4c26904b-e9da-403c-b682-33bb28f5522d" Oct 13 06:26:01.148967 containerd[2022]: time="2025-10-13T06:26:01.148920348Z" level=error msg="Failed to destroy network for sandbox \"5d790d501f08e7f1d5313ec8820a24d4cfabd334268c28b4fd16a46743673a1a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.149287 containerd[2022]: time="2025-10-13T06:26:01.149269322Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5dd984db4b-4ks8k,Uid:78446806-9e35-4cb3-9095-96c30617aa27,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d790d501f08e7f1d5313ec8820a24d4cfabd334268c28b4fd16a46743673a1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.149439 kubelet[3426]: E1013 06:26:01.149392 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d790d501f08e7f1d5313ec8820a24d4cfabd334268c28b4fd16a46743673a1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.149439 kubelet[3426]: E1013 06:26:01.149421 3426 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d790d501f08e7f1d5313ec8820a24d4cfabd334268c28b4fd16a46743673a1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5dd984db4b-4ks8k" Oct 13 06:26:01.149439 kubelet[3426]: E1013 06:26:01.149432 3426 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d790d501f08e7f1d5313ec8820a24d4cfabd334268c28b4fd16a46743673a1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5dd984db4b-4ks8k" Oct 13 06:26:01.149513 kubelet[3426]: E1013 06:26:01.149460 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5dd984db4b-4ks8k_calico-system(78446806-9e35-4cb3-9095-96c30617aa27)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5dd984db4b-4ks8k_calico-system(78446806-9e35-4cb3-9095-96c30617aa27)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d790d501f08e7f1d5313ec8820a24d4cfabd334268c28b4fd16a46743673a1a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-5dd984db4b-4ks8k" podUID="78446806-9e35-4cb3-9095-96c30617aa27" Oct 13 06:26:01.228345 containerd[2022]: time="2025-10-13T06:26:01.228278567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db8fc6667-dkdcr,Uid:ca755307-5760-491a-b1b1-672d441db091,Namespace:calico-apiserver,Attempt:0,}" Oct 13 06:26:01.253395 containerd[2022]: time="2025-10-13T06:26:01.253357294Z" level=error msg="Failed to destroy network for sandbox \"56146be00746d617e69b8b86772311d7e774b7c4fe5866e183292376e3cd1451\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.253978 containerd[2022]: time="2025-10-13T06:26:01.253919976Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db8fc6667-dkdcr,Uid:ca755307-5760-491a-b1b1-672d441db091,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"56146be00746d617e69b8b86772311d7e774b7c4fe5866e183292376e3cd1451\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.254121 kubelet[3426]: E1013 06:26:01.254099 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56146be00746d617e69b8b86772311d7e774b7c4fe5866e183292376e3cd1451\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.254157 kubelet[3426]: E1013 06:26:01.254136 3426 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56146be00746d617e69b8b86772311d7e774b7c4fe5866e183292376e3cd1451\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db8fc6667-dkdcr" Oct 13 06:26:01.254157 kubelet[3426]: E1013 06:26:01.254151 3426 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56146be00746d617e69b8b86772311d7e774b7c4fe5866e183292376e3cd1451\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db8fc6667-dkdcr" Oct 13 06:26:01.254201 kubelet[3426]: E1013 06:26:01.254186 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db8fc6667-dkdcr_calico-apiserver(ca755307-5760-491a-b1b1-672d441db091)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db8fc6667-dkdcr_calico-apiserver(ca755307-5760-491a-b1b1-672d441db091)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"56146be00746d617e69b8b86772311d7e774b7c4fe5866e183292376e3cd1451\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-7db8fc6667-dkdcr" podUID="ca755307-5760-491a-b1b1-672d441db091" Oct 13 06:26:01.274247 containerd[2022]: time="2025-10-13T06:26:01.274213431Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db8fc6667-8x69c,Uid:aa7806ae-9b11-43b8-9854-d8e21bacfc42,Namespace:calico-apiserver,Attempt:0,}" Oct 13 06:26:01.275887 containerd[2022]: time="2025-10-13T06:26:01.275840650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-854f97d977-p9fs4,Uid:8e0f36a1-1ce5-4d0e-992c-16a055d75e4c,Namespace:calico-system,Attempt:0,}" Oct 13 06:26:01.278863 containerd[2022]: time="2025-10-13T06:26:01.278842940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d74698464-85f2b,Uid:405a127c-4dd1-41a6-bf45-575476aa1beb,Namespace:calico-system,Attempt:0,}" Oct 13 06:26:01.301667 containerd[2022]: time="2025-10-13T06:26:01.301629210Z" level=error msg="Failed to destroy network for sandbox \"84f54961cea75fb43381dd65cfac7e427a0b9279ceb9f03c529108420407ed02\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.302001 containerd[2022]: time="2025-10-13T06:26:01.301989195Z" level=error msg="Failed to destroy network for sandbox \"5e088757ab328dd50229b97138d4bc14eacc0aba179138abf9d1c30254d51208\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.305198 containerd[2022]: time="2025-10-13T06:26:01.305141891Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-854f97d977-p9fs4,Uid:8e0f36a1-1ce5-4d0e-992c-16a055d75e4c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"84f54961cea75fb43381dd65cfac7e427a0b9279ceb9f03c529108420407ed02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.305377 kubelet[3426]: E1013 06:26:01.305331 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84f54961cea75fb43381dd65cfac7e427a0b9279ceb9f03c529108420407ed02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.305377 kubelet[3426]: E1013 06:26:01.305369 3426 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84f54961cea75fb43381dd65cfac7e427a0b9279ceb9f03c529108420407ed02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-854f97d977-p9fs4" Oct 13 06:26:01.305436 kubelet[3426]: E1013 06:26:01.305381 3426 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84f54961cea75fb43381dd65cfac7e427a0b9279ceb9f03c529108420407ed02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-system/goldmane-854f97d977-p9fs4" Oct 13 06:26:01.305436 kubelet[3426]: E1013 06:26:01.305419 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-854f97d977-p9fs4_calico-system(8e0f36a1-1ce5-4d0e-992c-16a055d75e4c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-854f97d977-p9fs4_calico-system(8e0f36a1-1ce5-4d0e-992c-16a055d75e4c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"84f54961cea75fb43381dd65cfac7e427a0b9279ceb9f03c529108420407ed02\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-854f97d977-p9fs4" podUID="8e0f36a1-1ce5-4d0e-992c-16a055d75e4c" Oct 13 06:26:01.305507 containerd[2022]: time="2025-10-13T06:26:01.305394873Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db8fc6667-8x69c,Uid:aa7806ae-9b11-43b8-9854-d8e21bacfc42,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e088757ab328dd50229b97138d4bc14eacc0aba179138abf9d1c30254d51208\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.305554 kubelet[3426]: E1013 06:26:01.305514 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e088757ab328dd50229b97138d4bc14eacc0aba179138abf9d1c30254d51208\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.305588 kubelet[3426]: E1013 06:26:01.305548 3426 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e088757ab328dd50229b97138d4bc14eacc0aba179138abf9d1c30254d51208\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db8fc6667-8x69c" Oct 13 06:26:01.305588 kubelet[3426]: E1013 06:26:01.305563 3426 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e088757ab328dd50229b97138d4bc14eacc0aba179138abf9d1c30254d51208\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db8fc6667-8x69c" Oct 13 06:26:01.305645 kubelet[3426]: E1013 06:26:01.305598 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db8fc6667-8x69c_calico-apiserver(aa7806ae-9b11-43b8-9854-d8e21bacfc42)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db8fc6667-8x69c_calico-apiserver(aa7806ae-9b11-43b8-9854-d8e21bacfc42)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5e088757ab328dd50229b97138d4bc14eacc0aba179138abf9d1c30254d51208\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db8fc6667-8x69c" podUID="aa7806ae-9b11-43b8-9854-d8e21bacfc42" Oct 13 06:26:01.306015 containerd[2022]: time="2025-10-13T06:26:01.305998719Z" level=error msg="Failed to destroy network for sandbox \"a3c958fd6b37d303b95549f24318d7df89556e221085795f0ddd1e75198a8045\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.306363 containerd[2022]: time="2025-10-13T06:26:01.306322992Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d74698464-85f2b,Uid:405a127c-4dd1-41a6-bf45-575476aa1beb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c958fd6b37d303b95549f24318d7df89556e221085795f0ddd1e75198a8045\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.306404 kubelet[3426]: E1013 06:26:01.306385 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c958fd6b37d303b95549f24318d7df89556e221085795f0ddd1e75198a8045\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.306426 kubelet[3426]: E1013 06:26:01.306403 3426 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c958fd6b37d303b95549f24318d7df89556e221085795f0ddd1e75198a8045\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d74698464-85f2b" Oct 13 06:26:01.306426 kubelet[3426]: E1013 06:26:01.306413 3426 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c958fd6b37d303b95549f24318d7df89556e221085795f0ddd1e75198a8045\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d74698464-85f2b" Oct 13 06:26:01.306461 kubelet[3426]: E1013 06:26:01.306432 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-d74698464-85f2b_calico-system(405a127c-4dd1-41a6-bf45-575476aa1beb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-d74698464-85f2b_calico-system(405a127c-4dd1-41a6-bf45-575476aa1beb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3c958fd6b37d303b95549f24318d7df89556e221085795f0ddd1e75198a8045\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-d74698464-85f2b" podUID="405a127c-4dd1-41a6-bf45-575476aa1beb" Oct 13 06:26:01.586863 systemd[1]: Created slice kubepods-besteffort-podda2b5110_cb9c_4d9e_a520_e14947dc335c.slice - libcontainer container 
kubepods-besteffort-podda2b5110_cb9c_4d9e_a520_e14947dc335c.slice. Oct 13 06:26:01.593905 containerd[2022]: time="2025-10-13T06:26:01.593844023Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x788d,Uid:da2b5110-cb9c-4d9e-a520-e14947dc335c,Namespace:calico-system,Attempt:0,}" Oct 13 06:26:01.619617 containerd[2022]: time="2025-10-13T06:26:01.619562384Z" level=error msg="Failed to destroy network for sandbox \"4547f9c5e0d6cb5968d9cbaaa2c78ea29cc5cf552c4f9c991c751d654c125a66\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.620142 containerd[2022]: time="2025-10-13T06:26:01.620102383Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x788d,Uid:da2b5110-cb9c-4d9e-a520-e14947dc335c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4547f9c5e0d6cb5968d9cbaaa2c78ea29cc5cf552c4f9c991c751d654c125a66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.620380 kubelet[3426]: E1013 06:26:01.620315 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4547f9c5e0d6cb5968d9cbaaa2c78ea29cc5cf552c4f9c991c751d654c125a66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:26:01.620380 kubelet[3426]: E1013 06:26:01.620352 3426 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4547f9c5e0d6cb5968d9cbaaa2c78ea29cc5cf552c4f9c991c751d654c125a66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x788d" Oct 13 06:26:01.620380 kubelet[3426]: E1013 06:26:01.620364 3426 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4547f9c5e0d6cb5968d9cbaaa2c78ea29cc5cf552c4f9c991c751d654c125a66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x788d" Oct 13 06:26:01.620508 kubelet[3426]: E1013 06:26:01.620396 3426 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x788d_calico-system(da2b5110-cb9c-4d9e-a520-e14947dc335c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x788d_calico-system(da2b5110-cb9c-4d9e-a520-e14947dc335c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4547f9c5e0d6cb5968d9cbaaa2c78ea29cc5cf552c4f9c991c751d654c125a66\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x788d" podUID="da2b5110-cb9c-4d9e-a520-e14947dc335c" Oct 13 06:26:01.665022 containerd[2022]: 
time="2025-10-13T06:26:01.664988178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Oct 13 06:26:06.837296 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1789191941.mount: Deactivated successfully. Oct 13 06:26:06.854256 containerd[2022]: time="2025-10-13T06:26:06.854199159Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:06.854411 containerd[2022]: time="2025-10-13T06:26:06.854383612Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Oct 13 06:26:06.854737 containerd[2022]: time="2025-10-13T06:26:06.854698238Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:06.855419 containerd[2022]: time="2025-10-13T06:26:06.855379112Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:06.855753 containerd[2022]: time="2025-10-13T06:26:06.855712963Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.190689436s" Oct 13 06:26:06.855753 containerd[2022]: time="2025-10-13T06:26:06.855727386Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Oct 13 06:26:06.859871 containerd[2022]: time="2025-10-13T06:26:06.859852647Z" level=info msg="CreateContainer within sandbox \"a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Oct 13 06:26:06.863382 containerd[2022]: time="2025-10-13T06:26:06.863342623Z" level=info msg="Container f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:06.867929 containerd[2022]: time="2025-10-13T06:26:06.867910782Z" level=info msg="CreateContainer within sandbox \"a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\"" Oct 13 06:26:06.868229 containerd[2022]: time="2025-10-13T06:26:06.868218425Z" level=info msg="StartContainer for \"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\"" Oct 13 06:26:06.868988 containerd[2022]: time="2025-10-13T06:26:06.868975618Z" level=info msg="connecting to shim f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139" address="unix:///run/containerd/s/e4584b9a126e89f3b0b7839934dd592e2b8acaeae7400dd8feccecef417dbd9f" protocol=ttrpc version=3 Oct 13 06:26:06.883383 systemd[1]: Started cri-containerd-f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139.scope - libcontainer container f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139. 
Oct 13 06:26:06.903566 containerd[2022]: time="2025-10-13T06:26:06.903543763Z" level=info msg="StartContainer for \"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" returns successfully" Oct 13 06:26:06.976918 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Oct 13 06:26:06.976990 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Oct 13 06:26:07.103517 kubelet[3426]: I1013 06:26:07.103424 3426 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmvsf\" (UniqueName: \"kubernetes.io/projected/405a127c-4dd1-41a6-bf45-575476aa1beb-kube-api-access-nmvsf\") pod \"405a127c-4dd1-41a6-bf45-575476aa1beb\" (UID: \"405a127c-4dd1-41a6-bf45-575476aa1beb\") " Oct 13 06:26:07.103517 kubelet[3426]: I1013 06:26:07.103468 3426 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/405a127c-4dd1-41a6-bf45-575476aa1beb-whisker-ca-bundle\") pod \"405a127c-4dd1-41a6-bf45-575476aa1beb\" (UID: \"405a127c-4dd1-41a6-bf45-575476aa1beb\") " Oct 13 06:26:07.103517 kubelet[3426]: I1013 06:26:07.103498 3426 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/405a127c-4dd1-41a6-bf45-575476aa1beb-whisker-backend-key-pair\") pod \"405a127c-4dd1-41a6-bf45-575476aa1beb\" (UID: \"405a127c-4dd1-41a6-bf45-575476aa1beb\") " Oct 13 06:26:07.103780 kubelet[3426]: I1013 06:26:07.103704 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/405a127c-4dd1-41a6-bf45-575476aa1beb-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "405a127c-4dd1-41a6-bf45-575476aa1beb" (UID: "405a127c-4dd1-41a6-bf45-575476aa1beb"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Oct 13 06:26:07.104909 kubelet[3426]: I1013 06:26:07.104864 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405a127c-4dd1-41a6-bf45-575476aa1beb-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "405a127c-4dd1-41a6-bf45-575476aa1beb" (UID: "405a127c-4dd1-41a6-bf45-575476aa1beb"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Oct 13 06:26:07.104998 kubelet[3426]: I1013 06:26:07.104962 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405a127c-4dd1-41a6-bf45-575476aa1beb-kube-api-access-nmvsf" (OuterVolumeSpecName: "kube-api-access-nmvsf") pod "405a127c-4dd1-41a6-bf45-575476aa1beb" (UID: "405a127c-4dd1-41a6-bf45-575476aa1beb"). InnerVolumeSpecName "kube-api-access-nmvsf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Oct 13 06:26:07.204654 kubelet[3426]: I1013 06:26:07.204549 3426 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/405a127c-4dd1-41a6-bf45-575476aa1beb-whisker-backend-key-pair\") on node \"ci-4487.0.0-a-becc29ce89\" DevicePath \"\"" Oct 13 06:26:07.204654 kubelet[3426]: I1013 06:26:07.204613 3426 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmvsf\" (UniqueName: \"kubernetes.io/projected/405a127c-4dd1-41a6-bf45-575476aa1beb-kube-api-access-nmvsf\") on node \"ci-4487.0.0-a-becc29ce89\" DevicePath \"\"" Oct 13 06:26:07.204654 kubelet[3426]: I1013 06:26:07.204641 3426 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/405a127c-4dd1-41a6-bf45-575476aa1beb-whisker-ca-bundle\") on node \"ci-4487.0.0-a-becc29ce89\" DevicePath \"\"" Oct 13 06:26:07.691355 systemd[1]: Removed slice kubepods-besteffort-pod405a127c_4dd1_41a6_bf45_575476aa1beb.slice - libcontainer container kubepods-besteffort-pod405a127c_4dd1_41a6_bf45_575476aa1beb.slice. Oct 13 06:26:07.698559 kubelet[3426]: I1013 06:26:07.698503 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-j9px9" podStartSLOduration=1.386325565 podStartE2EDuration="16.698485866s" podCreationTimestamp="2025-10-13 06:25:51 +0000 UTC" firstStartedPulling="2025-10-13 06:25:51.543918769 +0000 UTC m=+17.038271908" lastFinishedPulling="2025-10-13 06:26:06.856079071 +0000 UTC m=+32.350432209" observedRunningTime="2025-10-13 06:26:07.69808857 +0000 UTC m=+33.192441712" watchObservedRunningTime="2025-10-13 06:26:07.698485866 +0000 UTC m=+33.192839006" Oct 13 06:26:07.721726 systemd[1]: Created slice kubepods-besteffort-pod65e0038f_c461_4dfd_b27b_99560fb61e3d.slice - libcontainer container kubepods-besteffort-pod65e0038f_c461_4dfd_b27b_99560fb61e3d.slice. Oct 13 06:26:07.808881 kubelet[3426]: I1013 06:26:07.808771 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65e0038f-c461-4dfd-b27b-99560fb61e3d-whisker-ca-bundle\") pod \"whisker-7749c84547-xjvm7\" (UID: \"65e0038f-c461-4dfd-b27b-99560fb61e3d\") " pod="calico-system/whisker-7749c84547-xjvm7" Oct 13 06:26:07.808881 kubelet[3426]: I1013 06:26:07.808896 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gc9r\" (UniqueName: \"kubernetes.io/projected/65e0038f-c461-4dfd-b27b-99560fb61e3d-kube-api-access-6gc9r\") pod \"whisker-7749c84547-xjvm7\" (UID: \"65e0038f-c461-4dfd-b27b-99560fb61e3d\") " pod="calico-system/whisker-7749c84547-xjvm7" Oct 13 06:26:07.809300 kubelet[3426]: I1013 06:26:07.808955 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/65e0038f-c461-4dfd-b27b-99560fb61e3d-whisker-backend-key-pair\") pod \"whisker-7749c84547-xjvm7\" (UID: \"65e0038f-c461-4dfd-b27b-99560fb61e3d\") " pod="calico-system/whisker-7749c84547-xjvm7" Oct 13 06:26:07.844425 systemd[1]: var-lib-kubelet-pods-405a127c\x2d4dd1\x2d41a6\x2dbf45\x2d575476aa1beb-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dnmvsf.mount: Deactivated successfully. 
Oct 13 06:26:07.844683 systemd[1]: var-lib-kubelet-pods-405a127c\x2d4dd1\x2d41a6\x2dbf45\x2d575476aa1beb-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Oct 13 06:26:08.026442 containerd[2022]: time="2025-10-13T06:26:08.026320530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7749c84547-xjvm7,Uid:65e0038f-c461-4dfd-b27b-99560fb61e3d,Namespace:calico-system,Attempt:0,}" Oct 13 06:26:08.087007 systemd-networkd[1778]: cali69fd6556ff3: Link UP Oct 13 06:26:08.087228 systemd-networkd[1778]: cali69fd6556ff3: Gained carrier Oct 13 06:26:08.095186 containerd[2022]: 2025-10-13 06:26:08.037 [INFO][4859] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:26:08.095186 containerd[2022]: 2025-10-13 06:26:08.044 [INFO][4859] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0 whisker-7749c84547- calico-system 65e0038f-c461-4dfd-b27b-99560fb61e3d 868 0 2025-10-13 06:26:07 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7749c84547 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4487.0.0-a-becc29ce89 whisker-7749c84547-xjvm7 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali69fd6556ff3 [] [] }} ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Namespace="calico-system" Pod="whisker-7749c84547-xjvm7" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-" Oct 13 06:26:08.095186 containerd[2022]: 2025-10-13 06:26:08.044 [INFO][4859] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Namespace="calico-system" Pod="whisker-7749c84547-xjvm7" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" Oct 13 06:26:08.095186 containerd[2022]: 2025-10-13 06:26:08.057 [INFO][4880] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" HandleID="k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Workload="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.057 [INFO][4880] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" HandleID="k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Workload="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e2310), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4487.0.0-a-becc29ce89", "pod":"whisker-7749c84547-xjvm7", "timestamp":"2025-10-13 06:26:08.057633698 +0000 UTC"}, Hostname:"ci-4487.0.0-a-becc29ce89", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.057 [INFO][4880] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.057 [INFO][4880] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.057 [INFO][4880] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4487.0.0-a-becc29ce89' Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.063 [INFO][4880] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.066 [INFO][4880] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.069 [INFO][4880] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.070 [INFO][4880] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095445 containerd[2022]: 2025-10-13 06:26:08.072 [INFO][4880] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095728 containerd[2022]: 2025-10-13 06:26:08.072 [INFO][4880] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095728 containerd[2022]: 2025-10-13 06:26:08.073 [INFO][4880] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291 Oct 13 06:26:08.095728 containerd[2022]: 2025-10-13 06:26:08.075 [INFO][4880] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095728 containerd[2022]: 2025-10-13 06:26:08.078 [INFO][4880] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.193/26] block=192.168.61.192/26 handle="k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095728 containerd[2022]: 2025-10-13 06:26:08.079 [INFO][4880] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.193/26] handle="k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:08.095728 containerd[2022]: 2025-10-13 06:26:08.079 [INFO][4880] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Oct 13 06:26:08.095728 containerd[2022]: 2025-10-13 06:26:08.079 [INFO][4880] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.193/26] IPv6=[] ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" HandleID="k8s-pod-network.5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Workload="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" Oct 13 06:26:08.095937 containerd[2022]: 2025-10-13 06:26:08.081 [INFO][4859] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Namespace="calico-system" Pod="whisker-7749c84547-xjvm7" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0", GenerateName:"whisker-7749c84547-", Namespace:"calico-system", SelfLink:"", UID:"65e0038f-c461-4dfd-b27b-99560fb61e3d", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 26, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7749c84547", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"", Pod:"whisker-7749c84547-xjvm7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali69fd6556ff3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:08.095937 containerd[2022]: 2025-10-13 06:26:08.081 [INFO][4859] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.193/32] ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Namespace="calico-system" Pod="whisker-7749c84547-xjvm7" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" Oct 13 06:26:08.096062 containerd[2022]: 2025-10-13 06:26:08.081 [INFO][4859] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali69fd6556ff3 ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Namespace="calico-system" Pod="whisker-7749c84547-xjvm7" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" Oct 13 06:26:08.096062 containerd[2022]: 2025-10-13 06:26:08.087 [INFO][4859] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Namespace="calico-system" Pod="whisker-7749c84547-xjvm7" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" Oct 13 06:26:08.096131 containerd[2022]: 2025-10-13 06:26:08.087 [INFO][4859] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Namespace="calico-system" 
Pod="whisker-7749c84547-xjvm7" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0", GenerateName:"whisker-7749c84547-", Namespace:"calico-system", SelfLink:"", UID:"65e0038f-c461-4dfd-b27b-99560fb61e3d", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 26, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7749c84547", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291", Pod:"whisker-7749c84547-xjvm7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali69fd6556ff3", MAC:"be:8f:73:37:ef:a8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:08.096207 containerd[2022]: 2025-10-13 06:26:08.093 [INFO][4859] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" Namespace="calico-system" Pod="whisker-7749c84547-xjvm7" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-whisker--7749c84547--xjvm7-eth0" Oct 13 06:26:08.104537 containerd[2022]: time="2025-10-13T06:26:08.104494472Z" level=info msg="connecting to shim 5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291" address="unix:///run/containerd/s/e4253377ce63e7730219241ebe42994211f2d3f922483581c5764f0155227b8a" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:26:08.134488 systemd[1]: Started cri-containerd-5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291.scope - libcontainer container 5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291. 
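The endpoint write above pins the whisker pod to veth cali69fd6556ff3 with MAC be:8f:73:37:ef:a8, and containerd then connects to the sandbox shim over ttrpc in its k8s.io namespace while systemd tracks the shim as a cri-containerd-….scope unit. A hedged sketch of inspecting that same sandbox from Go follows; it assumes the containerd 1.x Go client module path (containerd 2.x moved the client to github.com/containerd/containerd/v2/client) and uses the socket path and sandbox ID exactly as they appear in this log.

package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// CRI-managed containers live in containerd's "k8s.io" namespace.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Load the whisker pod sandbox container by the ID from the log.
	c, err := client.LoadContainer(ctx, "5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291")
	if err != nil {
		log.Fatal(err)
	}
	task, err := c.Task(ctx, nil) // attach to the shim-backed task without wiring up IO
	if err != nil {
		log.Fatal(err)
	}
	status, err := task.Status(ctx)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(c.ID(), status.Status) // "running" once the scope above is started
}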
Oct 13 06:26:08.170114 containerd[2022]: time="2025-10-13T06:26:08.170036286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7749c84547-xjvm7,Uid:65e0038f-c461-4dfd-b27b-99560fb61e3d,Namespace:calico-system,Attempt:0,} returns sandbox id \"5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291\"" Oct 13 06:26:08.171123 containerd[2022]: time="2025-10-13T06:26:08.171105171Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Oct 13 06:26:08.172348 containerd[2022]: time="2025-10-13T06:26:08.172326950Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"a368c0628d6b5734faa43566c12649bdd797e81d86e0512b7ab64a1e9e70d225\" pid:5045 exit_status:1 exited_at:{seconds:1760336768 nanos:172111675}" Oct 13 06:26:08.227760 containerd[2022]: time="2025-10-13T06:26:08.227708282Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"9e88a8d42546b2222c5aebd4223d3c393af625d4994ca308ceaf8cd90a51ba9f\" pid:5142 exit_status:1 exited_at:{seconds:1760336768 nanos:227527196}" Oct 13 06:26:08.376470 systemd-networkd[1778]: vxlan.calico: Link UP Oct 13 06:26:08.376475 systemd-networkd[1778]: vxlan.calico: Gained carrier Oct 13 06:26:08.577293 kubelet[3426]: I1013 06:26:08.577182 3426 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405a127c-4dd1-41a6-bf45-575476aa1beb" path="/var/lib/kubelet/pods/405a127c-4dd1-41a6-bf45-575476aa1beb/volumes" Oct 13 06:26:08.787003 containerd[2022]: time="2025-10-13T06:26:08.786917934Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"9de24b63b8d85a119caae7569f8178d93dcd93e33146ec106c456850d251c173\" pid:5298 exit_status:1 exited_at:{seconds:1760336768 nanos:786682487}" Oct 13 06:26:09.752896 containerd[2022]: time="2025-10-13T06:26:09.752866220Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"655100f765024e8bd8668410770134d6401b41b5fccc7120ae2846b832e9f266\" pid:5337 exit_status:1 exited_at:{seconds:1760336769 nanos:752642093}" Oct 13 06:26:09.878101 containerd[2022]: time="2025-10-13T06:26:09.878048870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:09.878319 containerd[2022]: time="2025-10-13T06:26:09.878260003Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Oct 13 06:26:09.878679 containerd[2022]: time="2025-10-13T06:26:09.878639760Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:09.879576 containerd[2022]: time="2025-10-13T06:26:09.879536422Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:09.879921 containerd[2022]: time="2025-10-13T06:26:09.879874337Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo 
digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.708747638s" Oct 13 06:26:09.879921 containerd[2022]: time="2025-10-13T06:26:09.879890558Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Oct 13 06:26:09.881705 containerd[2022]: time="2025-10-13T06:26:09.881665609Z" level=info msg="CreateContainer within sandbox \"5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Oct 13 06:26:09.884298 containerd[2022]: time="2025-10-13T06:26:09.884244241Z" level=info msg="Container 35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:09.886912 containerd[2022]: time="2025-10-13T06:26:09.886869645Z" level=info msg="CreateContainer within sandbox \"5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f\"" Oct 13 06:26:09.887121 containerd[2022]: time="2025-10-13T06:26:09.887079249Z" level=info msg="StartContainer for \"35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f\"" Oct 13 06:26:09.887659 containerd[2022]: time="2025-10-13T06:26:09.887632391Z" level=info msg="connecting to shim 35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f" address="unix:///run/containerd/s/e4253377ce63e7730219241ebe42994211f2d3f922483581c5764f0155227b8a" protocol=ttrpc version=3 Oct 13 06:26:09.907547 systemd[1]: Started cri-containerd-35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f.scope - libcontainer container 35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f. 
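Between the PullImage request at 06:26:08.171 and the Pulled result, containerd resolves ghcr.io/flatcar/calico/whisker:v3.30.3, reads roughly 4.7 MB, and reports the pull as taking 1.708747638s before the whisker container is created and started in the same sandbox. A sketch of driving an equivalent pull through the Go client (same client module-path caveat as above; the image reference is the one from the log):

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	start := time.Now()
	// Pull and unpack the whisker image referenced in the log.
	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/whisker:v3.30.3", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	size, err := img.Size(ctx) // bytes of image content in the content store
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("pulled %s (%d bytes) in %s\n", img.Name(), size, time.Since(start))
}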
Oct 13 06:26:09.918367 systemd-networkd[1778]: cali69fd6556ff3: Gained IPv6LL Oct 13 06:26:09.936244 containerd[2022]: time="2025-10-13T06:26:09.936210500Z" level=info msg="StartContainer for \"35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f\" returns successfully" Oct 13 06:26:09.936848 containerd[2022]: time="2025-10-13T06:26:09.936833277Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Oct 13 06:26:10.367560 systemd-networkd[1778]: vxlan.calico: Gained IPv6LL Oct 13 06:26:11.574210 containerd[2022]: time="2025-10-13T06:26:11.574185998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-f547n,Uid:4c26904b-e9da-403c-b682-33bb28f5522d,Namespace:kube-system,Attempt:0,}" Oct 13 06:26:11.634552 systemd-networkd[1778]: cali1834ff16180: Link UP Oct 13 06:26:11.634748 systemd-networkd[1778]: cali1834ff16180: Gained carrier Oct 13 06:26:11.644595 containerd[2022]: 2025-10-13 06:26:11.595 [INFO][5414] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0 coredns-66bc5c9577- kube-system 4c26904b-e9da-403c-b682-33bb28f5522d 800 0 2025-10-13 06:25:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4487.0.0-a-becc29ce89 coredns-66bc5c9577-f547n eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1834ff16180 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Namespace="kube-system" Pod="coredns-66bc5c9577-f547n" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-" Oct 13 06:26:11.644595 containerd[2022]: 2025-10-13 06:26:11.595 [INFO][5414] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Namespace="kube-system" Pod="coredns-66bc5c9577-f547n" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" Oct 13 06:26:11.644595 containerd[2022]: 2025-10-13 06:26:11.609 [INFO][5436] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" HandleID="k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Workload="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" Oct 13 06:26:11.644799 containerd[2022]: 2025-10-13 06:26:11.609 [INFO][5436] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" HandleID="k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Workload="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f6a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4487.0.0-a-becc29ce89", "pod":"coredns-66bc5c9577-f547n", "timestamp":"2025-10-13 06:26:11.60987878 +0000 UTC"}, Hostname:"ci-4487.0.0-a-becc29ce89", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:26:11.644799 
containerd[2022]: 2025-10-13 06:26:11.609 [INFO][5436] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:26:11.644799 containerd[2022]: 2025-10-13 06:26:11.610 [INFO][5436] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:26:11.644799 containerd[2022]: 2025-10-13 06:26:11.610 [INFO][5436] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4487.0.0-a-becc29ce89' Oct 13 06:26:11.644799 containerd[2022]: 2025-10-13 06:26:11.615 [INFO][5436] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.644799 containerd[2022]: 2025-10-13 06:26:11.618 [INFO][5436] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.644799 containerd[2022]: 2025-10-13 06:26:11.620 [INFO][5436] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.644799 containerd[2022]: 2025-10-13 06:26:11.622 [INFO][5436] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.644799 containerd[2022]: 2025-10-13 06:26:11.624 [INFO][5436] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.645043 containerd[2022]: 2025-10-13 06:26:11.624 [INFO][5436] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.645043 containerd[2022]: 2025-10-13 06:26:11.625 [INFO][5436] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc Oct 13 06:26:11.645043 containerd[2022]: 2025-10-13 06:26:11.628 [INFO][5436] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.645043 containerd[2022]: 2025-10-13 06:26:11.631 [INFO][5436] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.194/26] block=192.168.61.192/26 handle="k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.645043 containerd[2022]: 2025-10-13 06:26:11.631 [INFO][5436] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.194/26] handle="k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:11.645043 containerd[2022]: 2025-10-13 06:26:11.631 [INFO][5436] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Oct 13 06:26:11.645043 containerd[2022]: 2025-10-13 06:26:11.631 [INFO][5436] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.194/26] IPv6=[] ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" HandleID="k8s-pod-network.b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Workload="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" Oct 13 06:26:11.645256 containerd[2022]: 2025-10-13 06:26:11.633 [INFO][5414] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Namespace="kube-system" Pod="coredns-66bc5c9577-f547n" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4c26904b-e9da-403c-b682-33bb28f5522d", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"", Pod:"coredns-66bc5c9577-f547n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1834ff16180", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:11.645256 containerd[2022]: 2025-10-13 06:26:11.633 [INFO][5414] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.194/32] ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Namespace="kube-system" Pod="coredns-66bc5c9577-f547n" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" Oct 13 06:26:11.645256 containerd[2022]: 2025-10-13 06:26:11.633 [INFO][5414] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1834ff16180 ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Namespace="kube-system" Pod="coredns-66bc5c9577-f547n" 
WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" Oct 13 06:26:11.645256 containerd[2022]: 2025-10-13 06:26:11.634 [INFO][5414] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Namespace="kube-system" Pod="coredns-66bc5c9577-f547n" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" Oct 13 06:26:11.645256 containerd[2022]: 2025-10-13 06:26:11.635 [INFO][5414] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Namespace="kube-system" Pod="coredns-66bc5c9577-f547n" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4c26904b-e9da-403c-b682-33bb28f5522d", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc", Pod:"coredns-66bc5c9577-f547n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1834ff16180", MAC:"1e:21:5e:81:a0:5c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:11.645502 containerd[2022]: 2025-10-13 06:26:11.642 [INFO][5414] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" Namespace="kube-system" Pod="coredns-66bc5c9577-f547n" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--f547n-eth0" Oct 13 06:26:11.653249 containerd[2022]: time="2025-10-13T06:26:11.653212476Z" level=info msg="connecting to shim b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc" 
address="unix:///run/containerd/s/1a2c6c7fe226454188c290e8a2cac087d75cc7e7cd01336d28be8ec2f3953f6a" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:26:11.671383 systemd[1]: Started cri-containerd-b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc.scope - libcontainer container b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc. Oct 13 06:26:11.697549 containerd[2022]: time="2025-10-13T06:26:11.697498293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-f547n,Uid:4c26904b-e9da-403c-b682-33bb28f5522d,Namespace:kube-system,Attempt:0,} returns sandbox id \"b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc\"" Oct 13 06:26:11.699552 containerd[2022]: time="2025-10-13T06:26:11.699507773Z" level=info msg="CreateContainer within sandbox \"b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 13 06:26:11.702708 containerd[2022]: time="2025-10-13T06:26:11.702665362Z" level=info msg="Container 78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:11.704972 containerd[2022]: time="2025-10-13T06:26:11.704958066Z" level=info msg="CreateContainer within sandbox \"b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99\"" Oct 13 06:26:11.705182 containerd[2022]: time="2025-10-13T06:26:11.705169067Z" level=info msg="StartContainer for \"78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99\"" Oct 13 06:26:11.705187 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3017574986.mount: Deactivated successfully. Oct 13 06:26:11.705608 containerd[2022]: time="2025-10-13T06:26:11.705594651Z" level=info msg="connecting to shim 78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99" address="unix:///run/containerd/s/1a2c6c7fe226454188c290e8a2cac087d75cc7e7cd01336d28be8ec2f3953f6a" protocol=ttrpc version=3 Oct 13 06:26:11.724441 systemd[1]: Started cri-containerd-78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99.scope - libcontainer container 78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99. 
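Around the same window, systemd-networkd records the Calico dataplane coming up on the host: vxlan.calico links up and gains carrier, and each pod veth (cali69fd6556ff3 for whisker, cali1834ff16180 for this coredns pod) later gains an IPv6 link-local address. Below is a small sketch that reads the same state those messages describe; it assumes the github.com/vishvananda/netlink package and has to run on the node itself, since the interface names are local to it.

package main

import (
	"fmt"
	"log"

	"github.com/vishvananda/netlink"
)

func main() {
	// Interface names taken from this log.
	for _, name := range []string{"vxlan.calico", "cali69fd6556ff3", "cali1834ff16180"} {
		link, err := netlink.LinkByName(name)
		if err != nil {
			log.Printf("%s: %v", name, err)
			continue
		}
		addrs, err := netlink.AddrList(link, netlink.FAMILY_V6)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s oper=%s ipv6 addrs=%d\n", name, link.Attrs().OperState, len(addrs))
		for _, a := range addrs {
			fmt.Println("  ", a.IP) // the fe80:: address once the link has "Gained IPv6LL"
		}
	}
}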
Oct 13 06:26:11.742128 containerd[2022]: time="2025-10-13T06:26:11.742101022Z" level=info msg="StartContainer for \"78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99\" returns successfully" Oct 13 06:26:12.195129 containerd[2022]: time="2025-10-13T06:26:12.195076544Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:12.195303 containerd[2022]: time="2025-10-13T06:26:12.195242696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Oct 13 06:26:12.195671 containerd[2022]: time="2025-10-13T06:26:12.195631824Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:12.196553 containerd[2022]: time="2025-10-13T06:26:12.196512838Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:12.196893 containerd[2022]: time="2025-10-13T06:26:12.196853584Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.260001596s" Oct 13 06:26:12.196893 containerd[2022]: time="2025-10-13T06:26:12.196868966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Oct 13 06:26:12.198393 containerd[2022]: time="2025-10-13T06:26:12.198382422Z" level=info msg="CreateContainer within sandbox \"5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Oct 13 06:26:12.200947 containerd[2022]: time="2025-10-13T06:26:12.200933798Z" level=info msg="Container 1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:12.203648 containerd[2022]: time="2025-10-13T06:26:12.203636158Z" level=info msg="CreateContainer within sandbox \"5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664\"" Oct 13 06:26:12.203861 containerd[2022]: time="2025-10-13T06:26:12.203847903Z" level=info msg="StartContainer for \"1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664\"" Oct 13 06:26:12.204436 containerd[2022]: time="2025-10-13T06:26:12.204396538Z" level=info msg="connecting to shim 1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664" address="unix:///run/containerd/s/e4253377ce63e7730219241ebe42994211f2d3f922483581c5764f0155227b8a" protocol=ttrpc version=3 Oct 13 06:26:12.225456 systemd[1]: Started cri-containerd-1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664.scope - libcontainer container 1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664. 
Oct 13 06:26:12.256260 containerd[2022]: time="2025-10-13T06:26:12.256232033Z" level=info msg="StartContainer for \"1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664\" returns successfully" Oct 13 06:26:12.577406 containerd[2022]: time="2025-10-13T06:26:12.577307563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db8fc6667-8x69c,Uid:aa7806ae-9b11-43b8-9854-d8e21bacfc42,Namespace:calico-apiserver,Attempt:0,}" Oct 13 06:26:12.663443 systemd-networkd[1778]: cali43df57931a6: Link UP Oct 13 06:26:12.664386 systemd-networkd[1778]: cali43df57931a6: Gained carrier Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.597 [INFO][5603] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0 calico-apiserver-7db8fc6667- calico-apiserver aa7806ae-9b11-43b8-9854-d8e21bacfc42 803 0 2025-10-13 06:25:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7db8fc6667 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4487.0.0-a-becc29ce89 calico-apiserver-7db8fc6667-8x69c eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali43df57931a6 [] [] }} ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-8x69c" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.597 [INFO][5603] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-8x69c" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.613 [INFO][5625] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" HandleID="k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.614 [INFO][5625] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" HandleID="k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00026fd80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4487.0.0-a-becc29ce89", "pod":"calico-apiserver-7db8fc6667-8x69c", "timestamp":"2025-10-13 06:26:12.613978371 +0000 UTC"}, Hostname:"ci-4487.0.0-a-becc29ce89", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.614 [INFO][5625] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.614 [INFO][5625] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.614 [INFO][5625] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4487.0.0-a-becc29ce89' Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.619 [INFO][5625] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.624 [INFO][5625] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.627 [INFO][5625] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.629 [INFO][5625] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.631 [INFO][5625] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.631 [INFO][5625] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.632 [INFO][5625] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471 Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.643 [INFO][5625] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.655 [INFO][5625] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.195/26] block=192.168.61.192/26 handle="k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.655 [INFO][5625] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.195/26] handle="k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.655 [INFO][5625] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
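The same affinity walk now repeats under the host-wide lock for calico-apiserver-7db8fc6667-8x69c and claims 192.168.61.195/26 from the node's block. One way to cross-check what the CNI handed out against what the API server records is a short client-go listing of the pods scheduled to this node; the kubeconfig path below is an assumption, and only the node name comes from the log.

package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf") // assumed kubeconfig location
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// Every pod scheduled to this node, with the IP the CNI assigned it.
	pods, err := cs.CoreV1().Pods("").List(context.Background(), metav1.ListOptions{
		FieldSelector: "spec.nodeName=ci-4487.0.0-a-becc29ce89",
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, p := range pods.Items {
		fmt.Printf("%s/%s %s\n", p.Namespace, p.Name, p.Status.PodIP) // e.g. 192.168.61.195 for the apiserver pod
	}
}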
Oct 13 06:26:12.687445 containerd[2022]: 2025-10-13 06:26:12.655 [INFO][5625] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.195/26] IPv6=[] ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" HandleID="k8s-pod-network.9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" Oct 13 06:26:12.691188 containerd[2022]: 2025-10-13 06:26:12.659 [INFO][5603] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-8x69c" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0", GenerateName:"calico-apiserver-7db8fc6667-", Namespace:"calico-apiserver", SelfLink:"", UID:"aa7806ae-9b11-43b8-9854-d8e21bacfc42", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7db8fc6667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"", Pod:"calico-apiserver-7db8fc6667-8x69c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali43df57931a6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:12.691188 containerd[2022]: 2025-10-13 06:26:12.659 [INFO][5603] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.195/32] ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-8x69c" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" Oct 13 06:26:12.691188 containerd[2022]: 2025-10-13 06:26:12.659 [INFO][5603] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43df57931a6 ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-8x69c" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" Oct 13 06:26:12.691188 containerd[2022]: 2025-10-13 06:26:12.664 [INFO][5603] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-8x69c" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" Oct 13 06:26:12.691188 containerd[2022]: 2025-10-13 06:26:12.665 
[INFO][5603] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-8x69c" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0", GenerateName:"calico-apiserver-7db8fc6667-", Namespace:"calico-apiserver", SelfLink:"", UID:"aa7806ae-9b11-43b8-9854-d8e21bacfc42", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7db8fc6667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471", Pod:"calico-apiserver-7db8fc6667-8x69c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali43df57931a6", MAC:"5a:e7:da:b4:4c:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:12.691188 containerd[2022]: 2025-10-13 06:26:12.681 [INFO][5603] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-8x69c" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--8x69c-eth0" Oct 13 06:26:12.701126 containerd[2022]: time="2025-10-13T06:26:12.701089961Z" level=info msg="connecting to shim 9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471" address="unix:///run/containerd/s/af6954303b4f416d5a3be3741dec388e9050ff099287535d44aa9ecee2d52f14" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:26:12.707679 kubelet[3426]: I1013 06:26:12.707611 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-f547n" podStartSLOduration=30.707583593 podStartE2EDuration="30.707583593s" podCreationTimestamp="2025-10-13 06:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:26:12.707285618 +0000 UTC m=+38.201638759" watchObservedRunningTime="2025-10-13 06:26:12.707583593 +0000 UTC m=+38.201936732" Oct 13 06:26:12.718326 kubelet[3426]: I1013 06:26:12.718283 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7749c84547-xjvm7" podStartSLOduration=1.692004353 podStartE2EDuration="5.718269663s" podCreationTimestamp="2025-10-13 06:26:07 +0000 UTC" firstStartedPulling="2025-10-13 
06:26:08.170977797 +0000 UTC m=+33.665330935" lastFinishedPulling="2025-10-13 06:26:12.197243097 +0000 UTC m=+37.691596245" observedRunningTime="2025-10-13 06:26:12.718121189 +0000 UTC m=+38.212474330" watchObservedRunningTime="2025-10-13 06:26:12.718269663 +0000 UTC m=+38.212622801" Oct 13 06:26:12.718449 systemd[1]: Started cri-containerd-9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471.scope - libcontainer container 9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471. Oct 13 06:26:12.743995 containerd[2022]: time="2025-10-13T06:26:12.743973174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db8fc6667-8x69c,Uid:aa7806ae-9b11-43b8-9854-d8e21bacfc42,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471\"" Oct 13 06:26:12.744690 containerd[2022]: time="2025-10-13T06:26:12.744679573Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Oct 13 06:26:13.506790 systemd-networkd[1778]: cali1834ff16180: Gained IPv6LL Oct 13 06:26:13.573684 containerd[2022]: time="2025-10-13T06:26:13.573612405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db8fc6667-dkdcr,Uid:ca755307-5760-491a-b1b1-672d441db091,Namespace:calico-apiserver,Attempt:0,}" Oct 13 06:26:13.574103 containerd[2022]: time="2025-10-13T06:26:13.574065115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x788d,Uid:da2b5110-cb9c-4d9e-a520-e14947dc335c,Namespace:calico-system,Attempt:0,}" Oct 13 06:26:13.634040 systemd-networkd[1778]: cali5d3b637d60c: Link UP Oct 13 06:26:13.634186 systemd-networkd[1778]: cali5d3b637d60c: Gained carrier Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.594 [INFO][5707] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0 csi-node-driver- calico-system da2b5110-cb9c-4d9e-a520-e14947dc335c 683 0 2025-10-13 06:25:51 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:f8549cf5c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4487.0.0-a-becc29ce89 csi-node-driver-x788d eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5d3b637d60c [] [] }} ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Namespace="calico-system" Pod="csi-node-driver-x788d" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.594 [INFO][5707] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Namespace="calico-system" Pod="csi-node-driver-x788d" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.608 [INFO][5751] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" HandleID="k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Workload="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.608 [INFO][5751] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" HandleID="k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Workload="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000345e70), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4487.0.0-a-becc29ce89", "pod":"csi-node-driver-x788d", "timestamp":"2025-10-13 06:26:13.608350831 +0000 UTC"}, Hostname:"ci-4487.0.0-a-becc29ce89", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.608 [INFO][5751] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.608 [INFO][5751] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.608 [INFO][5751] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4487.0.0-a-becc29ce89' Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.614 [INFO][5751] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.617 [INFO][5751] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.621 [INFO][5751] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.623 [INFO][5751] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.624 [INFO][5751] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.624 [INFO][5751] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.625 [INFO][5751] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342 Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.628 [INFO][5751] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.631 [INFO][5751] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.196/26] block=192.168.61.192/26 handle="k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.631 [INFO][5751] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.196/26] handle="k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.640206 containerd[2022]: 
2025-10-13 06:26:13.632 [INFO][5751] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Oct 13 06:26:13.640206 containerd[2022]: 2025-10-13 06:26:13.632 [INFO][5751] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.196/26] IPv6=[] ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" HandleID="k8s-pod-network.bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Workload="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" Oct 13 06:26:13.640763 containerd[2022]: 2025-10-13 06:26:13.632 [INFO][5707] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Namespace="calico-system" Pod="csi-node-driver-x788d" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"da2b5110-cb9c-4d9e-a520-e14947dc335c", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"f8549cf5c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"", Pod:"csi-node-driver-x788d", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5d3b637d60c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:13.640763 containerd[2022]: 2025-10-13 06:26:13.633 [INFO][5707] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.196/32] ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Namespace="calico-system" Pod="csi-node-driver-x788d" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" Oct 13 06:26:13.640763 containerd[2022]: 2025-10-13 06:26:13.633 [INFO][5707] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5d3b637d60c ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Namespace="calico-system" Pod="csi-node-driver-x788d" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" Oct 13 06:26:13.640763 containerd[2022]: 2025-10-13 06:26:13.634 [INFO][5707] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Namespace="calico-system" Pod="csi-node-driver-x788d" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" Oct 13 06:26:13.640763 containerd[2022]: 2025-10-13 06:26:13.634 [INFO][5707] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Namespace="calico-system" Pod="csi-node-driver-x788d" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"da2b5110-cb9c-4d9e-a520-e14947dc335c", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"f8549cf5c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342", Pod:"csi-node-driver-x788d", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5d3b637d60c", MAC:"4a:18:59:41:00:5f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:13.640763 containerd[2022]: 2025-10-13 06:26:13.639 [INFO][5707] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" Namespace="calico-system" Pod="csi-node-driver-x788d" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-csi--node--driver--x788d-eth0" Oct 13 06:26:13.647649 containerd[2022]: time="2025-10-13T06:26:13.647586416Z" level=info msg="connecting to shim bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342" address="unix:///run/containerd/s/40b45a21c756bae453a58a765d9bd3bfc2aea9f341853df76c6d2166b7728cd5" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:26:13.676699 systemd[1]: Started cri-containerd-bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342.scope - libcontainer container bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342. 
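A few entries back, kubelet's pod_startup_latency_tracker reported whisker-7749c84547-xjvm7 with podStartE2EDuration 5.718269663s but podStartSLOduration 1.692004353s. The gap is the image-pull window between firstStartedPulling and lastFinishedPulling; the arithmetic below, done on the monotonic m=+… offsets from those entries, reproduces the reported SLO figure, which suggests the tracker simply subtracts pull time from the end-to-end duration.

package main

import "fmt"

func main() {
	// Values from the kubelet pod_startup_latency_tracker entry for
	// whisker-7749c84547-xjvm7, using the monotonic m=+… offsets (in seconds).
	const (
		podStartE2E         = 5.718269663 // watchObservedRunningTime minus pod creation
		firstStartedPulling = 33.665330935
		lastFinishedPulling = 37.691596245
	)
	imagePull := lastFinishedPulling - firstStartedPulling
	slo := podStartE2E - imagePull
	fmt.Printf("image pull %.9fs, startup excluding pull %.9fs\n", imagePull, slo)
	// Prints 4.026265310 and 1.692004353, matching podStartSLOduration in the log.
}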
Oct 13 06:26:13.722356 containerd[2022]: time="2025-10-13T06:26:13.722334391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x788d,Uid:da2b5110-cb9c-4d9e-a520-e14947dc335c,Namespace:calico-system,Attempt:0,} returns sandbox id \"bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342\"" Oct 13 06:26:13.732638 systemd-networkd[1778]: cali001bab1fa77: Link UP Oct 13 06:26:13.732787 systemd-networkd[1778]: cali001bab1fa77: Gained carrier Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.595 [INFO][5702] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0 calico-apiserver-7db8fc6667- calico-apiserver ca755307-5760-491a-b1b1-672d441db091 802 0 2025-10-13 06:25:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7db8fc6667 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4487.0.0-a-becc29ce89 calico-apiserver-7db8fc6667-dkdcr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali001bab1fa77 [] [] }} ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-dkdcr" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.595 [INFO][5702] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-dkdcr" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.608 [INFO][5749] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" HandleID="k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.609 [INFO][5749] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" HandleID="k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e210), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4487.0.0-a-becc29ce89", "pod":"calico-apiserver-7db8fc6667-dkdcr", "timestamp":"2025-10-13 06:26:13.608969965 +0000 UTC"}, Hostname:"ci-4487.0.0-a-becc29ce89", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.609 [INFO][5749] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.632 [INFO][5749] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.632 [INFO][5749] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4487.0.0-a-becc29ce89' Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.716 [INFO][5749] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.721 [INFO][5749] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.723 [INFO][5749] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.724 [INFO][5749] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.725 [INFO][5749] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.725 [INFO][5749] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.726 [INFO][5749] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301 Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.728 [INFO][5749] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.730 [INFO][5749] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.197/26] block=192.168.61.192/26 handle="k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.730 [INFO][5749] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.197/26] handle="k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.730 [INFO][5749] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
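The ipam/ipam.go trace above is the standard Calico assignment path as this log records it: take the host-wide IPAM lock, confirm this node's affinity for the block 192.168.61.192/26, load the block, claim the next free address (192.168.61.197 here), then release the lock. A standard-library Go sketch confirming that every address handed out in this section falls inside that affine /26:

package main

import (
	"fmt"
	"net"
)

func main() {
	// Block this node holds an affinity for, per the IPAM trace above.
	_, block, err := net.ParseCIDR("192.168.61.192/26")
	if err != nil {
		panic(err)
	}
	// Addresses assigned in this section: csi-node-driver (.196),
	// calico-apiserver-7db8fc6667-dkdcr (.197), coredns (.198),
	// goldmane (.199) and calico-kube-controllers (.200).
	assigned := []string{
		"192.168.61.196", "192.168.61.197", "192.168.61.198",
		"192.168.61.199", "192.168.61.200",
	}
	for _, a := range assigned {
		fmt.Printf("%s in %s: %v\n", a, block, block.Contains(net.ParseIP(a)))
	}
}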
Oct 13 06:26:13.755541 containerd[2022]: 2025-10-13 06:26:13.730 [INFO][5749] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.197/26] IPv6=[] ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" HandleID="k8s-pod-network.d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" Oct 13 06:26:13.756061 containerd[2022]: 2025-10-13 06:26:13.731 [INFO][5702] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-dkdcr" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0", GenerateName:"calico-apiserver-7db8fc6667-", Namespace:"calico-apiserver", SelfLink:"", UID:"ca755307-5760-491a-b1b1-672d441db091", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7db8fc6667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"", Pod:"calico-apiserver-7db8fc6667-dkdcr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali001bab1fa77", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:13.756061 containerd[2022]: 2025-10-13 06:26:13.731 [INFO][5702] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.197/32] ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-dkdcr" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" Oct 13 06:26:13.756061 containerd[2022]: 2025-10-13 06:26:13.731 [INFO][5702] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali001bab1fa77 ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-dkdcr" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" Oct 13 06:26:13.756061 containerd[2022]: 2025-10-13 06:26:13.732 [INFO][5702] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-dkdcr" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" Oct 13 06:26:13.756061 containerd[2022]: 2025-10-13 06:26:13.732 
[INFO][5702] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-dkdcr" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0", GenerateName:"calico-apiserver-7db8fc6667-", Namespace:"calico-apiserver", SelfLink:"", UID:"ca755307-5760-491a-b1b1-672d441db091", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7db8fc6667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301", Pod:"calico-apiserver-7db8fc6667-dkdcr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali001bab1fa77", MAC:"0a:20:53:83:7d:99", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:13.756061 containerd[2022]: 2025-10-13 06:26:13.754 [INFO][5702] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" Namespace="calico-apiserver" Pod="calico-apiserver-7db8fc6667-dkdcr" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--apiserver--7db8fc6667--dkdcr-eth0" Oct 13 06:26:13.763851 containerd[2022]: time="2025-10-13T06:26:13.763756394Z" level=info msg="connecting to shim d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" address="unix:///run/containerd/s/6996a4933fa5ca821f29c8590cb7a36ab88cc4687260c2c79c3c615b57aa492e" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:26:13.782708 systemd[1]: Started cri-containerd-d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301.scope - libcontainer container d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301. 
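Each sandbox start in this log follows the same shape: containerd emits a "connecting to shim" entry carrying the 64-character container ID and the shim's unix socket, and systemd then reports the matching cri-containerd-<id>.scope unit. A small, illustrative Go parser for those entries (the field layout is simply what appears in this journal, not a documented stable format):

package main

import (
	"fmt"
	"regexp"
)

// shimLine matches the `connecting to shim <id>" address="unix://...` fields
// exactly as they appear in the containerd entries above.
var shimLine = regexp.MustCompile(`connecting to shim ([0-9a-f]{64})" address="(unix://[^"]+)"`)

func main() {
	// One entry copied from this log.
	entry := `time="2025-10-13T06:26:13.763756394Z" level=info msg="connecting to shim d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301" address="unix:///run/containerd/s/6996a4933fa5ca821f29c8590cb7a36ab88cc4687260c2c79c3c615b57aa492e" namespace=k8s.io protocol=ttrpc version=3`
	if m := shimLine.FindStringSubmatch(entry); m != nil {
		fmt.Printf("container: %s\nshim socket: %s\n", m[1], m[2])
	}
}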
Oct 13 06:26:13.848486 containerd[2022]: time="2025-10-13T06:26:13.848462852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db8fc6667-dkdcr,Uid:ca755307-5760-491a-b1b1-672d441db091,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301\"" Oct 13 06:26:14.270439 systemd-networkd[1778]: cali43df57931a6: Gained IPv6LL Oct 13 06:26:14.573140 containerd[2022]: time="2025-10-13T06:26:14.573049637Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6x5pz,Uid:db0fa69d-27bf-470f-96a6-ab3a53daf187,Namespace:kube-system,Attempt:0,}" Oct 13 06:26:14.632901 systemd-networkd[1778]: cali58a215977d7: Link UP Oct 13 06:26:14.633461 systemd-networkd[1778]: cali58a215977d7: Gained carrier Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.593 [INFO][5899] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0 coredns-66bc5c9577- kube-system db0fa69d-27bf-470f-96a6-ab3a53daf187 799 0 2025-10-13 06:25:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4487.0.0-a-becc29ce89 coredns-66bc5c9577-6x5pz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali58a215977d7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Namespace="kube-system" Pod="coredns-66bc5c9577-6x5pz" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.593 [INFO][5899] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Namespace="kube-system" Pod="coredns-66bc5c9577-6x5pz" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.607 [INFO][5921] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" HandleID="k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Workload="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.608 [INFO][5921] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" HandleID="k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Workload="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f6f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4487.0.0-a-becc29ce89", "pod":"coredns-66bc5c9577-6x5pz", "timestamp":"2025-10-13 06:26:14.607964623 +0000 UTC"}, Hostname:"ci-4487.0.0-a-becc29ce89", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.608 [INFO][5921] ipam/ipam_plugin.go 353: About to acquire host-wide 
IPAM lock. Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.608 [INFO][5921] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.608 [INFO][5921] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4487.0.0-a-becc29ce89' Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.613 [INFO][5921] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.617 [INFO][5921] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.620 [INFO][5921] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.622 [INFO][5921] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.623 [INFO][5921] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.623 [INFO][5921] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.624 [INFO][5921] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014 Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.627 [INFO][5921] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.630 [INFO][5921] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.198/26] block=192.168.61.192/26 handle="k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.630 [INFO][5921] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.198/26] handle="k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.630 [INFO][5921] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Oct 13 06:26:14.640902 containerd[2022]: 2025-10-13 06:26:14.630 [INFO][5921] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.198/26] IPv6=[] ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" HandleID="k8s-pod-network.2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Workload="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" Oct 13 06:26:14.641670 containerd[2022]: 2025-10-13 06:26:14.631 [INFO][5899] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Namespace="kube-system" Pod="coredns-66bc5c9577-6x5pz" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"db0fa69d-27bf-470f-96a6-ab3a53daf187", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"", Pod:"coredns-66bc5c9577-6x5pz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali58a215977d7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:14.641670 containerd[2022]: 2025-10-13 06:26:14.631 [INFO][5899] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.198/32] ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Namespace="kube-system" Pod="coredns-66bc5c9577-6x5pz" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" Oct 13 06:26:14.641670 containerd[2022]: 2025-10-13 06:26:14.631 [INFO][5899] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali58a215977d7 ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Namespace="kube-system" Pod="coredns-66bc5c9577-6x5pz" 
WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" Oct 13 06:26:14.641670 containerd[2022]: 2025-10-13 06:26:14.633 [INFO][5899] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Namespace="kube-system" Pod="coredns-66bc5c9577-6x5pz" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" Oct 13 06:26:14.641670 containerd[2022]: 2025-10-13 06:26:14.634 [INFO][5899] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Namespace="kube-system" Pod="coredns-66bc5c9577-6x5pz" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"db0fa69d-27bf-470f-96a6-ab3a53daf187", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014", Pod:"coredns-66bc5c9577-6x5pz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali58a215977d7", MAC:"6a:71:6e:a8:7a:6e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:14.641804 containerd[2022]: 2025-10-13 06:26:14.639 [INFO][5899] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" Namespace="kube-system" Pod="coredns-66bc5c9577-6x5pz" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-coredns--66bc5c9577--6x5pz-eth0" Oct 13 06:26:14.649103 containerd[2022]: time="2025-10-13T06:26:14.649075815Z" level=info msg="connecting to shim 2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014" 
address="unix:///run/containerd/s/00b3ed520d3cb0d10c25c68d36019bc9a1015c8e6fe9bc8754689f5c322c71bf" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:26:14.664321 systemd[1]: Started cri-containerd-2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014.scope - libcontainer container 2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014. Oct 13 06:26:14.689416 containerd[2022]: time="2025-10-13T06:26:14.689393772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6x5pz,Uid:db0fa69d-27bf-470f-96a6-ab3a53daf187,Namespace:kube-system,Attempt:0,} returns sandbox id \"2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014\"" Oct 13 06:26:14.691250 containerd[2022]: time="2025-10-13T06:26:14.691211303Z" level=info msg="CreateContainer within sandbox \"2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 13 06:26:14.694171 containerd[2022]: time="2025-10-13T06:26:14.694157211Z" level=info msg="Container aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:14.696634 containerd[2022]: time="2025-10-13T06:26:14.696618403Z" level=info msg="CreateContainer within sandbox \"2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1\"" Oct 13 06:26:14.696843 containerd[2022]: time="2025-10-13T06:26:14.696829918Z" level=info msg="StartContainer for \"aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1\"" Oct 13 06:26:14.697232 containerd[2022]: time="2025-10-13T06:26:14.697221228Z" level=info msg="connecting to shim aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1" address="unix:///run/containerd/s/00b3ed520d3cb0d10c25c68d36019bc9a1015c8e6fe9bc8754689f5c322c71bf" protocol=ttrpc version=3 Oct 13 06:26:14.712334 systemd[1]: Started cri-containerd-aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1.scope - libcontainer container aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1. 
Oct 13 06:26:14.726454 containerd[2022]: time="2025-10-13T06:26:14.726430824Z" level=info msg="StartContainer for \"aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1\" returns successfully" Oct 13 06:26:15.358458 systemd-networkd[1778]: cali5d3b637d60c: Gained IPv6LL Oct 13 06:26:15.562014 containerd[2022]: time="2025-10-13T06:26:15.561960932Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:15.562184 containerd[2022]: time="2025-10-13T06:26:15.562171430Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Oct 13 06:26:15.562659 containerd[2022]: time="2025-10-13T06:26:15.562617389Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:15.563497 containerd[2022]: time="2025-10-13T06:26:15.563458380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:15.563898 containerd[2022]: time="2025-10-13T06:26:15.563854590Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.81915738s" Oct 13 06:26:15.563898 containerd[2022]: time="2025-10-13T06:26:15.563874803Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Oct 13 06:26:15.564330 containerd[2022]: time="2025-10-13T06:26:15.564317123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Oct 13 06:26:15.565384 containerd[2022]: time="2025-10-13T06:26:15.565342672Z" level=info msg="CreateContainer within sandbox \"9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Oct 13 06:26:15.567804 containerd[2022]: time="2025-10-13T06:26:15.567790625Z" level=info msg="Container fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:15.570540 containerd[2022]: time="2025-10-13T06:26:15.570498922Z" level=info msg="CreateContainer within sandbox \"9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018\"" Oct 13 06:26:15.570728 containerd[2022]: time="2025-10-13T06:26:15.570687672Z" level=info msg="StartContainer for \"fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018\"" Oct 13 06:26:15.571200 containerd[2022]: time="2025-10-13T06:26:15.571188810Z" level=info msg="connecting to shim fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018" address="unix:///run/containerd/s/af6954303b4f416d5a3be3741dec388e9050ff099287535d44aa9ecee2d52f14" protocol=ttrpc version=3 Oct 13 06:26:15.571595 containerd[2022]: time="2025-10-13T06:26:15.571582909Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-854f97d977-p9fs4,Uid:8e0f36a1-1ce5-4d0e-992c-16a055d75e4c,Namespace:calico-system,Attempt:0,}" Oct 13 06:26:15.590572 systemd[1]: Started cri-containerd-fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018.scope - libcontainer container fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018. Oct 13 06:26:15.615342 systemd-networkd[1778]: cali001bab1fa77: Gained IPv6LL Oct 13 06:26:15.619767 containerd[2022]: time="2025-10-13T06:26:15.619741415Z" level=info msg="StartContainer for \"fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018\" returns successfully" Oct 13 06:26:15.627965 systemd-networkd[1778]: calid75f628020c: Link UP Oct 13 06:26:15.628525 systemd-networkd[1778]: calid75f628020c: Gained carrier Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.594 [INFO][6058] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0 goldmane-854f97d977- calico-system 8e0f36a1-1ce5-4d0e-992c-16a055d75e4c 805 0 2025-10-13 06:25:50 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:854f97d977 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4487.0.0-a-becc29ce89 goldmane-854f97d977-p9fs4 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calid75f628020c [] [] }} ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Namespace="calico-system" Pod="goldmane-854f97d977-p9fs4" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.594 [INFO][6058] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Namespace="calico-system" Pod="goldmane-854f97d977-p9fs4" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.606 [INFO][6093] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" HandleID="k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Workload="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.606 [INFO][6093] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" HandleID="k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Workload="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000139e60), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4487.0.0-a-becc29ce89", "pod":"goldmane-854f97d977-p9fs4", "timestamp":"2025-10-13 06:26:15.606686241 +0000 UTC"}, Hostname:"ci-4487.0.0-a-becc29ce89", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.606 [INFO][6093] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.606 [INFO][6093] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.606 [INFO][6093] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4487.0.0-a-becc29ce89' Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.610 [INFO][6093] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.613 [INFO][6093] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.615 [INFO][6093] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.617 [INFO][6093] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.618 [INFO][6093] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.618 [INFO][6093] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.619 [INFO][6093] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272 Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.622 [INFO][6093] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.625 [INFO][6093] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.199/26] block=192.168.61.192/26 handle="k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.625 [INFO][6093] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.199/26] handle="k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.625 [INFO][6093] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Oct 13 06:26:15.634765 containerd[2022]: 2025-10-13 06:26:15.625 [INFO][6093] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.199/26] IPv6=[] ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" HandleID="k8s-pod-network.a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Workload="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" Oct 13 06:26:15.635411 containerd[2022]: 2025-10-13 06:26:15.626 [INFO][6058] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Namespace="calico-system" Pod="goldmane-854f97d977-p9fs4" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0", GenerateName:"goldmane-854f97d977-", Namespace:"calico-system", SelfLink:"", UID:"8e0f36a1-1ce5-4d0e-992c-16a055d75e4c", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"854f97d977", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"", Pod:"goldmane-854f97d977-p9fs4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid75f628020c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:15.635411 containerd[2022]: 2025-10-13 06:26:15.627 [INFO][6058] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.199/32] ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Namespace="calico-system" Pod="goldmane-854f97d977-p9fs4" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" Oct 13 06:26:15.635411 containerd[2022]: 2025-10-13 06:26:15.627 [INFO][6058] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid75f628020c ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Namespace="calico-system" Pod="goldmane-854f97d977-p9fs4" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" Oct 13 06:26:15.635411 containerd[2022]: 2025-10-13 06:26:15.628 [INFO][6058] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Namespace="calico-system" Pod="goldmane-854f97d977-p9fs4" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" Oct 13 06:26:15.635411 containerd[2022]: 2025-10-13 06:26:15.628 [INFO][6058] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" 
Namespace="calico-system" Pod="goldmane-854f97d977-p9fs4" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0", GenerateName:"goldmane-854f97d977-", Namespace:"calico-system", SelfLink:"", UID:"8e0f36a1-1ce5-4d0e-992c-16a055d75e4c", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"854f97d977", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272", Pod:"goldmane-854f97d977-p9fs4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid75f628020c", MAC:"1e:c3:c3:6e:8d:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:15.635411 containerd[2022]: 2025-10-13 06:26:15.633 [INFO][6058] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" Namespace="calico-system" Pod="goldmane-854f97d977-p9fs4" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-goldmane--854f97d977--p9fs4-eth0" Oct 13 06:26:15.654755 containerd[2022]: time="2025-10-13T06:26:15.654729682Z" level=info msg="connecting to shim a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272" address="unix:///run/containerd/s/44d53fdfe29aef4ddcfa1f5325e2f970b2b661b4b5352b4a81f39caf40c0a28c" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:26:15.680399 systemd[1]: Started cri-containerd-a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272.scope - libcontainer container a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272. 
Oct 13 06:26:15.705323 containerd[2022]: time="2025-10-13T06:26:15.705302130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-854f97d977-p9fs4,Uid:8e0f36a1-1ce5-4d0e-992c-16a055d75e4c,Namespace:calico-system,Attempt:0,} returns sandbox id \"a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272\"" Oct 13 06:26:15.721961 kubelet[3426]: I1013 06:26:15.721891 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7db8fc6667-8x69c" podStartSLOduration=24.902192102 podStartE2EDuration="27.721873015s" podCreationTimestamp="2025-10-13 06:25:48 +0000 UTC" firstStartedPulling="2025-10-13 06:26:12.744560759 +0000 UTC m=+38.238913897" lastFinishedPulling="2025-10-13 06:26:15.564241669 +0000 UTC m=+41.058594810" observedRunningTime="2025-10-13 06:26:15.721358087 +0000 UTC m=+41.215711233" watchObservedRunningTime="2025-10-13 06:26:15.721873015 +0000 UTC m=+41.216226153" Oct 13 06:26:15.726517 kubelet[3426]: I1013 06:26:15.726475 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-6x5pz" podStartSLOduration=33.726461499 podStartE2EDuration="33.726461499s" podCreationTimestamp="2025-10-13 06:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:26:15.726262063 +0000 UTC m=+41.220615209" watchObservedRunningTime="2025-10-13 06:26:15.726461499 +0000 UTC m=+41.220814638" Oct 13 06:26:16.573022 containerd[2022]: time="2025-10-13T06:26:16.572970285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5dd984db4b-4ks8k,Uid:78446806-9e35-4cb3-9095-96c30617aa27,Namespace:calico-system,Attempt:0,}" Oct 13 06:26:16.639417 systemd-networkd[1778]: cali58a215977d7: Gained IPv6LL Oct 13 06:26:16.642116 systemd-networkd[1778]: cali919f9eeb585: Link UP Oct 13 06:26:16.642254 systemd-networkd[1778]: cali919f9eeb585: Gained carrier Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.593 [INFO][6195] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0 calico-kube-controllers-5dd984db4b- calico-system 78446806-9e35-4cb3-9095-96c30617aa27 801 0 2025-10-13 06:25:51 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5dd984db4b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4487.0.0-a-becc29ce89 calico-kube-controllers-5dd984db4b-4ks8k eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali919f9eeb585 [] [] }} ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Namespace="calico-system" Pod="calico-kube-controllers-5dd984db4b-4ks8k" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.594 [INFO][6195] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Namespace="calico-system" Pod="calico-kube-controllers-5dd984db4b-4ks8k" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.609 
[INFO][6219] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" HandleID="k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.609 [INFO][6219] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" HandleID="k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000345b30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4487.0.0-a-becc29ce89", "pod":"calico-kube-controllers-5dd984db4b-4ks8k", "timestamp":"2025-10-13 06:26:16.609681614 +0000 UTC"}, Hostname:"ci-4487.0.0-a-becc29ce89", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.609 [INFO][6219] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.609 [INFO][6219] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.609 [INFO][6219] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4487.0.0-a-becc29ce89' Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.619 [INFO][6219] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.629 [INFO][6219] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.632 [INFO][6219] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.633 [INFO][6219] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.634 [INFO][6219] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.634 [INFO][6219] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.635 [INFO][6219] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07 Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.637 [INFO][6219] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.640 [INFO][6219] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.61.200/26] block=192.168.61.192/26 handle="k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.640 [INFO][6219] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.200/26] handle="k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" host="ci-4487.0.0-a-becc29ce89" Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.640 [INFO][6219] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Oct 13 06:26:16.649214 containerd[2022]: 2025-10-13 06:26:16.640 [INFO][6219] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.200/26] IPv6=[] ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" HandleID="k8s-pod-network.e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Workload="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" Oct 13 06:26:16.649660 containerd[2022]: 2025-10-13 06:26:16.641 [INFO][6195] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Namespace="calico-system" Pod="calico-kube-controllers-5dd984db4b-4ks8k" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0", GenerateName:"calico-kube-controllers-5dd984db4b-", Namespace:"calico-system", SelfLink:"", UID:"78446806-9e35-4cb3-9095-96c30617aa27", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5dd984db4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"", Pod:"calico-kube-controllers-5dd984db4b-4ks8k", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali919f9eeb585", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:16.649660 containerd[2022]: 2025-10-13 06:26:16.641 [INFO][6195] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.200/32] ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Namespace="calico-system" Pod="calico-kube-controllers-5dd984db4b-4ks8k" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" Oct 13 06:26:16.649660 containerd[2022]: 2025-10-13 06:26:16.641 [INFO][6195] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali919f9eeb585 
ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Namespace="calico-system" Pod="calico-kube-controllers-5dd984db4b-4ks8k" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" Oct 13 06:26:16.649660 containerd[2022]: 2025-10-13 06:26:16.642 [INFO][6195] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Namespace="calico-system" Pod="calico-kube-controllers-5dd984db4b-4ks8k" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" Oct 13 06:26:16.649660 containerd[2022]: 2025-10-13 06:26:16.642 [INFO][6195] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Namespace="calico-system" Pod="calico-kube-controllers-5dd984db4b-4ks8k" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0", GenerateName:"calico-kube-controllers-5dd984db4b-", Namespace:"calico-system", SelfLink:"", UID:"78446806-9e35-4cb3-9095-96c30617aa27", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 25, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5dd984db4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4487.0.0-a-becc29ce89", ContainerID:"e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07", Pod:"calico-kube-controllers-5dd984db4b-4ks8k", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali919f9eeb585", MAC:"1a:e4:3f:32:c4:6d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:26:16.649660 containerd[2022]: 2025-10-13 06:26:16.647 [INFO][6195] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" Namespace="calico-system" Pod="calico-kube-controllers-5dd984db4b-4ks8k" WorkloadEndpoint="ci--4487.0.0--a--becc29ce89-k8s-calico--kube--controllers--5dd984db4b--4ks8k-eth0" Oct 13 06:26:16.657248 containerd[2022]: time="2025-10-13T06:26:16.657209770Z" level=info msg="connecting to shim e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07" address="unix:///run/containerd/s/389e4338676b4f862e8830d9c3bdfb0efa7782e4937d9628c8d953ac49327565" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:26:16.684395 systemd[1]: Started cri-containerd-e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07.scope - libcontainer 
container e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07. Oct 13 06:26:16.709229 containerd[2022]: time="2025-10-13T06:26:16.709205113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5dd984db4b-4ks8k,Uid:78446806-9e35-4cb3-9095-96c30617aa27,Namespace:calico-system,Attempt:0,} returns sandbox id \"e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07\"" Oct 13 06:26:16.723277 kubelet[3426]: I1013 06:26:16.723191 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:26:17.342348 systemd-networkd[1778]: calid75f628020c: Gained IPv6LL Oct 13 06:26:17.393988 containerd[2022]: time="2025-10-13T06:26:17.393964044Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:17.394149 containerd[2022]: time="2025-10-13T06:26:17.394134894Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Oct 13 06:26:17.394595 containerd[2022]: time="2025-10-13T06:26:17.394581353Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:17.395460 containerd[2022]: time="2025-10-13T06:26:17.395448116Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:17.396086 containerd[2022]: time="2025-10-13T06:26:17.396073160Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.831739357s" Oct 13 06:26:17.396135 containerd[2022]: time="2025-10-13T06:26:17.396089591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Oct 13 06:26:17.396641 containerd[2022]: time="2025-10-13T06:26:17.396628403Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Oct 13 06:26:17.397919 containerd[2022]: time="2025-10-13T06:26:17.397906193Z" level=info msg="CreateContainer within sandbox \"bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Oct 13 06:26:17.401845 containerd[2022]: time="2025-10-13T06:26:17.401805816Z" level=info msg="Container dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:17.405211 containerd[2022]: time="2025-10-13T06:26:17.405199217Z" level=info msg="CreateContainer within sandbox \"bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae\"" Oct 13 06:26:17.405581 containerd[2022]: time="2025-10-13T06:26:17.405560149Z" level=info msg="StartContainer for \"dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae\"" Oct 13 06:26:17.406441 containerd[2022]: time="2025-10-13T06:26:17.406396923Z" level=info msg="connecting to shim 
dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae" address="unix:///run/containerd/s/40b45a21c756bae453a58a765d9bd3bfc2aea9f341853df76c6d2166b7728cd5" protocol=ttrpc version=3 Oct 13 06:26:17.426509 systemd[1]: Started cri-containerd-dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae.scope - libcontainer container dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae. Oct 13 06:26:17.445318 containerd[2022]: time="2025-10-13T06:26:17.445294423Z" level=info msg="StartContainer for \"dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae\" returns successfully" Oct 13 06:26:17.837048 containerd[2022]: time="2025-10-13T06:26:17.836994825Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:17.837306 containerd[2022]: time="2025-10-13T06:26:17.837216213Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Oct 13 06:26:17.838339 containerd[2022]: time="2025-10-13T06:26:17.838274277Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 441.630078ms" Oct 13 06:26:17.838339 containerd[2022]: time="2025-10-13T06:26:17.838321088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Oct 13 06:26:17.838971 containerd[2022]: time="2025-10-13T06:26:17.838938352Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Oct 13 06:26:17.840171 containerd[2022]: time="2025-10-13T06:26:17.840158675Z" level=info msg="CreateContainer within sandbox \"d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Oct 13 06:26:17.843207 containerd[2022]: time="2025-10-13T06:26:17.843193717Z" level=info msg="Container 98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:17.846293 containerd[2022]: time="2025-10-13T06:26:17.846242927Z" level=info msg="CreateContainer within sandbox \"d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f\"" Oct 13 06:26:17.846644 containerd[2022]: time="2025-10-13T06:26:17.846575841Z" level=info msg="StartContainer for \"98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f\"" Oct 13 06:26:17.847187 containerd[2022]: time="2025-10-13T06:26:17.847175204Z" level=info msg="connecting to shim 98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f" address="unix:///run/containerd/s/6996a4933fa5ca821f29c8590cb7a36ab88cc4687260c2c79c3c615b57aa492e" protocol=ttrpc version=3 Oct 13 06:26:17.864510 systemd[1]: Started cri-containerd-98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f.scope - libcontainer container 98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f. 
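The pull entries above record each image fetch with its size and elapsed time (for example the calico/csi pull in 1.831739357s and the calico/apiserver pull in 441.630078ms). Below is a minimal sketch, not part of the logged host, that reads containerd journal output and tabulates those pull durations; the regex is written against the message format shown here and assumes one journal entry per line, as journalctl emits it.

```go
// pulltimes.go - a sketch (not part of the logged host) that reads containerd
// journal output on stdin and reports how long each image pull took, based on
// the message format seen above:
//   msg="Pulled image \"<ref>\" with image id ... size \"...\" in <duration>"
// It assumes one journal entry per line, as journalctl emits it.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

func main() {
	re := regexp.MustCompile(`Pulled image \\"([^"\\]+)\\".* in ([0-9.]+m?s)`)

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // containerd lines can be long
	for sc.Scan() {
		m := re.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		if d, err := time.ParseDuration(m[2]); err == nil {
			fmt.Printf("%-55s %v\n", m[1], d)
		}
	}
}
```

Something like `journalctl -u containerd --no-pager | go run pulltimes.go` would feed it, assuming the containerd unit name on this host.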
Oct 13 06:26:17.892927 containerd[2022]: time="2025-10-13T06:26:17.892905038Z" level=info msg="StartContainer for \"98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f\" returns successfully" Oct 13 06:26:18.687465 systemd-networkd[1778]: cali919f9eeb585: Gained IPv6LL Oct 13 06:26:18.759545 kubelet[3426]: I1013 06:26:18.759457 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7db8fc6667-dkdcr" podStartSLOduration=26.769598873 podStartE2EDuration="30.759426008s" podCreationTimestamp="2025-10-13 06:25:48 +0000 UTC" firstStartedPulling="2025-10-13 06:26:13.849042916 +0000 UTC m=+39.343396053" lastFinishedPulling="2025-10-13 06:26:17.83887005 +0000 UTC m=+43.333223188" observedRunningTime="2025-10-13 06:26:18.759424035 +0000 UTC m=+44.253777207" watchObservedRunningTime="2025-10-13 06:26:18.759426008 +0000 UTC m=+44.253779161" Oct 13 06:26:19.747521 kubelet[3426]: I1013 06:26:19.747473 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:26:20.323067 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2954773161.mount: Deactivated successfully. Oct 13 06:26:20.524310 containerd[2022]: time="2025-10-13T06:26:20.524283141Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:20.524595 containerd[2022]: time="2025-10-13T06:26:20.524463776Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Oct 13 06:26:20.524823 containerd[2022]: time="2025-10-13T06:26:20.524811735Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:20.525890 containerd[2022]: time="2025-10-13T06:26:20.525878304Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:20.526301 containerd[2022]: time="2025-10-13T06:26:20.526287319Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 2.687314461s" Oct 13 06:26:20.526342 containerd[2022]: time="2025-10-13T06:26:20.526305021Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Oct 13 06:26:20.527133 containerd[2022]: time="2025-10-13T06:26:20.527114242Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Oct 13 06:26:20.529592 containerd[2022]: time="2025-10-13T06:26:20.529574050Z" level=info msg="CreateContainer within sandbox \"a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Oct 13 06:26:20.532258 containerd[2022]: time="2025-10-13T06:26:20.532244537Z" level=info msg="Container 1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:20.535623 containerd[2022]: time="2025-10-13T06:26:20.535578050Z" level=info 
msg="CreateContainer within sandbox \"a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\"" Oct 13 06:26:20.535833 containerd[2022]: time="2025-10-13T06:26:20.535819594Z" level=info msg="StartContainer for \"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\"" Oct 13 06:26:20.536425 containerd[2022]: time="2025-10-13T06:26:20.536379193Z" level=info msg="connecting to shim 1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847" address="unix:///run/containerd/s/44d53fdfe29aef4ddcfa1f5325e2f970b2b661b4b5352b4a81f39caf40c0a28c" protocol=ttrpc version=3 Oct 13 06:26:20.560396 systemd[1]: Started cri-containerd-1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847.scope - libcontainer container 1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847. Oct 13 06:26:20.594215 containerd[2022]: time="2025-10-13T06:26:20.594150151Z" level=info msg="StartContainer for \"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" returns successfully" Oct 13 06:26:20.869111 containerd[2022]: time="2025-10-13T06:26:20.869046363Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"6eba9868cf4e454b13ccefa89c86825613755c8f1a570b8a665d7830671d65b1\" pid:6453 exit_status:1 exited_at:{seconds:1760336780 nanos:868801502}" Oct 13 06:26:21.859191 containerd[2022]: time="2025-10-13T06:26:21.859163601Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"bf0e327ecf7a6fb51cd919b201f1396323cf504fcf3c90179782a3fd6518b4be\" pid:6489 exit_status:1 exited_at:{seconds:1760336781 nanos:858972121}" Oct 13 06:26:23.330444 containerd[2022]: time="2025-10-13T06:26:23.330416938Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:23.330716 containerd[2022]: time="2025-10-13T06:26:23.330617062Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Oct 13 06:26:23.331026 containerd[2022]: time="2025-10-13T06:26:23.331013313Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:23.332223 containerd[2022]: time="2025-10-13T06:26:23.332209271Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:23.332484 containerd[2022]: time="2025-10-13T06:26:23.332468781Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 2.805334043s" Oct 13 06:26:23.332528 containerd[2022]: time="2025-10-13T06:26:23.332486884Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference 
\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Oct 13 06:26:23.332984 containerd[2022]: time="2025-10-13T06:26:23.332967592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Oct 13 06:26:23.336653 containerd[2022]: time="2025-10-13T06:26:23.336630357Z" level=info msg="CreateContainer within sandbox \"e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Oct 13 06:26:23.339164 containerd[2022]: time="2025-10-13T06:26:23.339149465Z" level=info msg="Container ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:23.342101 containerd[2022]: time="2025-10-13T06:26:23.342064345Z" level=info msg="CreateContainer within sandbox \"e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\"" Oct 13 06:26:23.342341 containerd[2022]: time="2025-10-13T06:26:23.342299437Z" level=info msg="StartContainer for \"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\"" Oct 13 06:26:23.342849 containerd[2022]: time="2025-10-13T06:26:23.342837343Z" level=info msg="connecting to shim ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b" address="unix:///run/containerd/s/389e4338676b4f862e8830d9c3bdfb0efa7782e4937d9628c8d953ac49327565" protocol=ttrpc version=3 Oct 13 06:26:23.364403 systemd[1]: Started cri-containerd-ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b.scope - libcontainer container ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b. 
Oct 13 06:26:23.391531 containerd[2022]: time="2025-10-13T06:26:23.391478754Z" level=info msg="StartContainer for \"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" returns successfully" Oct 13 06:26:23.792782 kubelet[3426]: I1013 06:26:23.792618 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5dd984db4b-4ks8k" podStartSLOduration=26.169475082 podStartE2EDuration="32.7925812s" podCreationTimestamp="2025-10-13 06:25:51 +0000 UTC" firstStartedPulling="2025-10-13 06:26:16.709796808 +0000 UTC m=+42.204149946" lastFinishedPulling="2025-10-13 06:26:23.332902926 +0000 UTC m=+48.827256064" observedRunningTime="2025-10-13 06:26:23.791318526 +0000 UTC m=+49.285671747" watchObservedRunningTime="2025-10-13 06:26:23.7925812 +0000 UTC m=+49.286934392" Oct 13 06:26:23.793887 kubelet[3426]: I1013 06:26:23.793146 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-854f97d977-p9fs4" podStartSLOduration=28.972264826 podStartE2EDuration="33.793127429s" podCreationTimestamp="2025-10-13 06:25:50 +0000 UTC" firstStartedPulling="2025-10-13 06:26:15.705910383 +0000 UTC m=+41.200263521" lastFinishedPulling="2025-10-13 06:26:20.526772983 +0000 UTC m=+46.021126124" observedRunningTime="2025-10-13 06:26:20.780274043 +0000 UTC m=+46.274627281" watchObservedRunningTime="2025-10-13 06:26:23.793127429 +0000 UTC m=+49.287480619" Oct 13 06:26:23.858946 containerd[2022]: time="2025-10-13T06:26:23.858920180Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"982d107989ac5025b9460bef8b3b75e62b29c7b520218c409b42a8830d62ef08\" pid:6581 exited_at:{seconds:1760336783 nanos:858759202}" Oct 13 06:26:25.125327 containerd[2022]: time="2025-10-13T06:26:25.125273518Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:25.125566 containerd[2022]: time="2025-10-13T06:26:25.125472991Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Oct 13 06:26:25.125870 containerd[2022]: time="2025-10-13T06:26:25.125827724Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:25.126840 containerd[2022]: time="2025-10-13T06:26:25.126800998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:26:25.127232 containerd[2022]: time="2025-10-13T06:26:25.127188918Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.794202577s" Oct 13 06:26:25.127232 containerd[2022]: time="2025-10-13T06:26:25.127204936Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Oct 13 
06:26:25.130232 containerd[2022]: time="2025-10-13T06:26:25.130212447Z" level=info msg="CreateContainer within sandbox \"bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Oct 13 06:26:25.133543 containerd[2022]: time="2025-10-13T06:26:25.133528600Z" level=info msg="Container 0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:26:25.137199 containerd[2022]: time="2025-10-13T06:26:25.137186113Z" level=info msg="CreateContainer within sandbox \"bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1\"" Oct 13 06:26:25.137484 containerd[2022]: time="2025-10-13T06:26:25.137469280Z" level=info msg="StartContainer for \"0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1\"" Oct 13 06:26:25.138203 containerd[2022]: time="2025-10-13T06:26:25.138190489Z" level=info msg="connecting to shim 0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1" address="unix:///run/containerd/s/40b45a21c756bae453a58a765d9bd3bfc2aea9f341853df76c6d2166b7728cd5" protocol=ttrpc version=3 Oct 13 06:26:25.163428 systemd[1]: Started cri-containerd-0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1.scope - libcontainer container 0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1. Oct 13 06:26:25.183529 containerd[2022]: time="2025-10-13T06:26:25.183478659Z" level=info msg="StartContainer for \"0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1\" returns successfully" Oct 13 06:26:25.607784 kubelet[3426]: I1013 06:26:25.607744 3426 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Oct 13 06:26:25.607784 kubelet[3426]: I1013 06:26:25.607767 3426 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Oct 13 06:26:25.810080 kubelet[3426]: I1013 06:26:25.809967 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-x788d" podStartSLOduration=23.404486579 podStartE2EDuration="34.809933706s" podCreationTimestamp="2025-10-13 06:25:51 +0000 UTC" firstStartedPulling="2025-10-13 06:26:13.722919794 +0000 UTC m=+39.217272932" lastFinishedPulling="2025-10-13 06:26:25.128366921 +0000 UTC m=+50.622720059" observedRunningTime="2025-10-13 06:26:25.809092542 +0000 UTC m=+51.303445760" watchObservedRunningTime="2025-10-13 06:26:25.809933706 +0000 UTC m=+51.304286892" Oct 13 06:26:29.278181 containerd[2022]: time="2025-10-13T06:26:29.278124411Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"555a73f86afc5369dbc9f864cb187f43020cba63d77aa9e76b2f8f83eb225416\" pid:6654 exited_at:{seconds:1760336789 nanos:277922111}" Oct 13 06:26:35.240593 kubelet[3426]: I1013 06:26:35.240535 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:26:37.263012 containerd[2022]: time="2025-10-13T06:26:37.262981053Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" 
id:\"8aa9aa4ab02fa7755d51892ad5e55b0c82ebbb4e39b2ba05ad8d36f95305c794\" pid:6692 exited_at:{seconds:1760336797 nanos:262831235}" Oct 13 06:26:39.773999 containerd[2022]: time="2025-10-13T06:26:39.773967859Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"53ad711f351d70b5e3aa56b988f30ce3f144d23bda99dc261bc4e0423b8264ef\" pid:6714 exited_at:{seconds:1760336799 nanos:773741992}" Oct 13 06:26:51.878441 containerd[2022]: time="2025-10-13T06:26:51.878412943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"5a9b3ef3e80320c94dc6cbe80c0da0ccd48977029c47ea0ee08711669e8139f9\" pid:6760 exited_at:{seconds:1760336811 nanos:878162171}" Oct 13 06:26:53.319522 kubelet[3426]: I1013 06:26:53.319417 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:26:53.863049 containerd[2022]: time="2025-10-13T06:26:53.863019329Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"1f2a60a04b8385acfc9d0a5d59dc181eb771a33dfa6d52bbe0cfac5d61beaaca\" pid:6798 exited_at:{seconds:1760336813 nanos:862875707}" Oct 13 06:27:09.788153 containerd[2022]: time="2025-10-13T06:27:09.788124966Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"f963398680ff1c2cd8f0565b9c5287d7d1489e0f6126d372bddbd8e62a2a9cd2\" pid:6823 exited_at:{seconds:1760336829 nanos:787928867}" Oct 13 06:27:21.811943 containerd[2022]: time="2025-10-13T06:27:21.811884417Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"0eac988b81a5b4ad3ede50b088d37a889ac5d9f9096a71f32dd64318a26fd054\" pid:6861 exited_at:{seconds:1760336841 nanos:811643188}" Oct 13 06:27:23.850118 containerd[2022]: time="2025-10-13T06:27:23.850088324Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"50ce74a51b97087c418a914b3a55dd2ffdbbf14276e6d91e719dbb3e7c5c364f\" pid:6891 exited_at:{seconds:1760336843 nanos:849949937}" Oct 13 06:27:29.327244 containerd[2022]: time="2025-10-13T06:27:29.327209493Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"819fda13ba6eae6463748bcd16fe0945ddff74d47e2aba1bd45d9fc0a2457d8e\" pid:6919 exited_at:{seconds:1760336849 nanos:326966334}" Oct 13 06:27:37.212580 containerd[2022]: time="2025-10-13T06:27:37.212547551Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"12b53dcbe3354784d5995d9fd3b762375887f3d4658a3315279513027ec171f0\" pid:6958 exited_at:{seconds:1760336857 nanos:212423866}" Oct 13 06:27:39.752217 containerd[2022]: time="2025-10-13T06:27:39.752181398Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"66103a9865ad18e8e6d5d6ab26d545008dc8c26ef6c6d55a98c4badd3f418893\" pid:6980 exited_at:{seconds:1760336859 nanos:751940008}" Oct 13 06:27:51.874698 containerd[2022]: time="2025-10-13T06:27:51.874642594Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"b8924a8536a92ced030d291b80f4d17a695e9869760c03bb04c23fae2a98c1b6\" pid:7042 exited_at:{seconds:1760336871 nanos:874422379}" Oct 13 06:27:53.872532 containerd[2022]: time="2025-10-13T06:27:53.872504491Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"167e46bcd9f7ed93417ff0442218a33876cef21af70b7b282a83f85def26d9bd\" pid:7071 exited_at:{seconds:1760336873 nanos:872364173}" Oct 13 06:28:09.756284 containerd[2022]: time="2025-10-13T06:28:09.756213124Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"119ef004b8216057323a2ca8fa2177ee3b9e05c652e6fbe37a83c28c1d47db24\" pid:7095 exited_at:{seconds:1760336889 nanos:755953996}" Oct 13 06:28:21.834431 containerd[2022]: time="2025-10-13T06:28:21.834401485Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"f4658203dc13352957df253729bf0521b3f6c718d4291ba6cb62f3b316512695\" pid:7132 exited_at:{seconds:1760336901 nanos:834186680}" Oct 13 06:28:23.823952 containerd[2022]: time="2025-10-13T06:28:23.823924439Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"10b164480fcd2b2dac98c74fff9ba0257025cf74852a8ac2b4ac1c232886dc95\" pid:7164 exited_at:{seconds:1760336903 nanos:823776581}" Oct 13 06:28:29.287745 containerd[2022]: time="2025-10-13T06:28:29.287719313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"cb8098013a57d92b47efcb2a10d5cddb19380ea3e967aa6fba4438728d33bac0\" pid:7186 exited_at:{seconds:1760336909 nanos:287415850}" Oct 13 06:28:37.218747 containerd[2022]: time="2025-10-13T06:28:37.218682060Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"2c6d035dfb5caf98a7ff37bd1369109d364c2c3aa224f0b2304d791aead81a5a\" pid:7221 exited_at:{seconds:1760336917 nanos:218523939}" Oct 13 06:28:39.756096 containerd[2022]: time="2025-10-13T06:28:39.756037704Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"43a6461bc62cfeac922e267ac2512b5596b383c94725f714536a45a51929f685\" pid:7242 exited_at:{seconds:1760336919 nanos:755812604}" Oct 13 06:28:51.825480 containerd[2022]: time="2025-10-13T06:28:51.825452904Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"d29d35354e5b1726107bae46c67c15b4267b079f7d08770901d683241ac66b36\" pid:7286 exited_at:{seconds:1760336931 nanos:825198575}" Oct 13 06:28:53.863301 containerd[2022]: time="2025-10-13T06:28:53.863270617Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"b4ab6bd925eadcae496145cf3369d4b08c23de33a028993a6468214f7860b915\" pid:7316 exited_at:{seconds:1760336933 nanos:863113569}" Oct 13 06:29:09.755148 containerd[2022]: time="2025-10-13T06:29:09.755116652Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" 
id:\"14d1f603f1d1aca82904254a21a676d479022c58fed2522071a138c18fe961d5\" pid:7339 exited_at:{seconds:1760336949 nanos:754915154}" Oct 13 06:29:21.867384 containerd[2022]: time="2025-10-13T06:29:21.867356799Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"1c9b1733a29ef386446003c07cf4a1ac098ed7c5ab06ade037731d16529da89b\" pid:7377 exited_at:{seconds:1760336961 nanos:867133084}" Oct 13 06:29:23.832717 containerd[2022]: time="2025-10-13T06:29:23.832686785Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"cc36b0f3d5f1264817224706635ce9eed7427965ff9849f2cb340f5c02ff568a\" pid:7425 exited_at:{seconds:1760336963 nanos:832532994}" Oct 13 06:29:29.276090 containerd[2022]: time="2025-10-13T06:29:29.275999238Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"009c886151b95999d7700f7b2cc175e3151896a5de9f96131738b9b9f698faf7\" pid:7455 exited_at:{seconds:1760336969 nanos:275779799}" Oct 13 06:29:37.222154 containerd[2022]: time="2025-10-13T06:29:37.222127875Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"0dd1173295082f066513d58353bfe3242ee3f9b75e6bc27ba4c9498bc3f93ea2\" pid:7489 exited_at:{seconds:1760336977 nanos:221966916}" Oct 13 06:29:39.762795 containerd[2022]: time="2025-10-13T06:29:39.762731898Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"235dcbfb8567595bee21fa8811501343d61fcdefd2fba4687dd330075b2b4348\" pid:7511 exited_at:{seconds:1760336979 nanos:762526118}" Oct 13 06:29:51.868984 containerd[2022]: time="2025-10-13T06:29:51.868921747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"00bf7b68b344d6103ad067c6789870d62bc6a8506d31e7bf5782acd3651dbf5c\" pid:7547 exited_at:{seconds:1760336991 nanos:868657800}" Oct 13 06:29:53.826344 containerd[2022]: time="2025-10-13T06:29:53.826308217Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"4d744cabc5c33773f4d51e8d39ff837eb7ed5adb3228ad0173e1a0c5cf387f97\" pid:7579 exited_at:{seconds:1760336993 nanos:826161663}" Oct 13 06:30:09.753493 containerd[2022]: time="2025-10-13T06:30:09.753363337Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"ffbce02cf12f412acb3e7cd11182a10806a7eb717702da0a3cf4ab017076c0e3\" pid:7605 exited_at:{seconds:1760337009 nanos:753155390}" Oct 13 06:30:21.891309 containerd[2022]: time="2025-10-13T06:30:21.891256595Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"46fb178b01f0c544f1871f854733bc2281e2b4ae5c85eaabd0b7e695e3f05393\" pid:7646 exited_at:{seconds:1760337021 nanos:891041050}" Oct 13 06:30:23.817420 containerd[2022]: time="2025-10-13T06:30:23.817395321Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"b369eb70ea70c2d6635fff03aae5b34a9ec2e467652e71c7fb1fc3c55974987b\" pid:7679 exited_at:{seconds:1760337023 
nanos:817276646}" Oct 13 06:30:29.278604 containerd[2022]: time="2025-10-13T06:30:29.278548156Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"2f5ea992e197215916d4463a77305e164f5c6fb3671b9eb482bd03231514c081\" pid:7702 exited_at:{seconds:1760337029 nanos:278319214}" Oct 13 06:30:30.780678 containerd[2022]: time="2025-10-13T06:30:30.780487003Z" level=warning msg="container event discarded" container=f697c50be01fcd626705c1b957ee2ae490c7375f14b20bba4816a467dc07be8a type=CONTAINER_CREATED_EVENT Oct 13 06:30:30.792048 containerd[2022]: time="2025-10-13T06:30:30.791901119Z" level=warning msg="container event discarded" container=f697c50be01fcd626705c1b957ee2ae490c7375f14b20bba4816a467dc07be8a type=CONTAINER_STARTED_EVENT Oct 13 06:30:30.792048 containerd[2022]: time="2025-10-13T06:30:30.792010983Z" level=warning msg="container event discarded" container=eaa710c21edd0688dd875a2068a97e92e532ded3ba744780572f599a7cd83c85 type=CONTAINER_CREATED_EVENT Oct 13 06:30:30.792048 containerd[2022]: time="2025-10-13T06:30:30.792039204Z" level=warning msg="container event discarded" container=eaa710c21edd0688dd875a2068a97e92e532ded3ba744780572f599a7cd83c85 type=CONTAINER_STARTED_EVENT Oct 13 06:30:30.792048 containerd[2022]: time="2025-10-13T06:30:30.792060589Z" level=warning msg="container event discarded" container=ca8d8218fcd459a51df3f602959450678068b9e81c85bac0601a2ea36ac16298 type=CONTAINER_CREATED_EVENT Oct 13 06:30:30.792622 containerd[2022]: time="2025-10-13T06:30:30.792079786Z" level=warning msg="container event discarded" container=ca8d8218fcd459a51df3f602959450678068b9e81c85bac0601a2ea36ac16298 type=CONTAINER_STARTED_EVENT Oct 13 06:30:30.792622 containerd[2022]: time="2025-10-13T06:30:30.792103684Z" level=warning msg="container event discarded" container=9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa type=CONTAINER_CREATED_EVENT Oct 13 06:30:30.792622 containerd[2022]: time="2025-10-13T06:30:30.792127274Z" level=warning msg="container event discarded" container=80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096 type=CONTAINER_CREATED_EVENT Oct 13 06:30:30.792622 containerd[2022]: time="2025-10-13T06:30:30.792147319Z" level=warning msg="container event discarded" container=9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf type=CONTAINER_CREATED_EVENT Oct 13 06:30:30.848702 containerd[2022]: time="2025-10-13T06:30:30.848561511Z" level=warning msg="container event discarded" container=80c6cecb59189b5afab3a1873531c94b6d8a4a43884748ff9e03ef8c5b49e096 type=CONTAINER_STARTED_EVENT Oct 13 06:30:30.848702 containerd[2022]: time="2025-10-13T06:30:30.848642908Z" level=warning msg="container event discarded" container=9a1bc1522a94ac483794a3c5e8a129d7b1454d50bf17bb5d62a46fdf4fbf8ddf type=CONTAINER_STARTED_EVENT Oct 13 06:30:30.848702 containerd[2022]: time="2025-10-13T06:30:30.848670231Z" level=warning msg="container event discarded" container=9d887a4b9252720fbf3840ce5576043a6ab214717871fef0dcfa941562dfc9fa type=CONTAINER_STARTED_EVENT Oct 13 06:30:37.207563 containerd[2022]: time="2025-10-13T06:30:37.207506927Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"de6cfb52010c6069612e28adbf112ccb77381f34f9e91723918e228bcf5ad0b7\" pid:7738 exited_at:{seconds:1760337037 nanos:207391316}" Oct 13 06:30:39.744866 containerd[2022]: time="2025-10-13T06:30:39.744819825Z" level=info 
msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"048466ca33373f7724a5d8d7d62fab3c9cc61eb86e85591e415ad3ceedee3b16\" pid:7759 exited_at:{seconds:1760337039 nanos:744620960}" Oct 13 06:30:42.435571 containerd[2022]: time="2025-10-13T06:30:42.435390722Z" level=warning msg="container event discarded" container=77892dca9f489efaee76162acafe398f9c8b2b0728533e01df17dba1da0f8f9e type=CONTAINER_CREATED_EVENT Oct 13 06:30:42.435571 containerd[2022]: time="2025-10-13T06:30:42.435513990Z" level=warning msg="container event discarded" container=77892dca9f489efaee76162acafe398f9c8b2b0728533e01df17dba1da0f8f9e type=CONTAINER_STARTED_EVENT Oct 13 06:30:42.747677 containerd[2022]: time="2025-10-13T06:30:42.747418339Z" level=warning msg="container event discarded" container=8dfd41b64ebec994397295617479b23eff1a864f0f04ba82cf39d4400590edfc type=CONTAINER_CREATED_EVENT Oct 13 06:30:42.747677 containerd[2022]: time="2025-10-13T06:30:42.747508275Z" level=warning msg="container event discarded" container=8dfd41b64ebec994397295617479b23eff1a864f0f04ba82cf39d4400590edfc type=CONTAINER_STARTED_EVENT Oct 13 06:30:42.747677 containerd[2022]: time="2025-10-13T06:30:42.747535769Z" level=warning msg="container event discarded" container=f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3 type=CONTAINER_CREATED_EVENT Oct 13 06:30:42.791081 containerd[2022]: time="2025-10-13T06:30:42.790925134Z" level=warning msg="container event discarded" container=f96c0839cf80003f4188d53c02992fd12de589b91ca68c9db300fd808bf7b2b3 type=CONTAINER_STARTED_EVENT Oct 13 06:30:43.970866 containerd[2022]: time="2025-10-13T06:30:43.970718694Z" level=warning msg="container event discarded" container=a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d type=CONTAINER_CREATED_EVENT Oct 13 06:30:44.010352 containerd[2022]: time="2025-10-13T06:30:44.010169896Z" level=warning msg="container event discarded" container=a11760bba6c5d108bcfa23da8f0d39db008abdc6ed68764857cdd63050f5942d type=CONTAINER_STARTED_EVENT Oct 13 06:30:51.228782 containerd[2022]: time="2025-10-13T06:30:51.228657270Z" level=warning msg="container event discarded" container=c9d2c6ca21dd39c454f4f16fd7a70121d5c9438e27d0d4bcd8b86c1ddcee611d type=CONTAINER_CREATED_EVENT Oct 13 06:30:51.228782 containerd[2022]: time="2025-10-13T06:30:51.228754904Z" level=warning msg="container event discarded" container=c9d2c6ca21dd39c454f4f16fd7a70121d5c9438e27d0d4bcd8b86c1ddcee611d type=CONTAINER_STARTED_EVENT Oct 13 06:30:51.554228 containerd[2022]: time="2025-10-13T06:30:51.553966776Z" level=warning msg="container event discarded" container=a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a type=CONTAINER_CREATED_EVENT Oct 13 06:30:51.554228 containerd[2022]: time="2025-10-13T06:30:51.554048260Z" level=warning msg="container event discarded" container=a3363dbc7563092372632e7b18798c8a5eda0cf60e69862c1215d5d8adc72b5a type=CONTAINER_STARTED_EVENT Oct 13 06:30:51.830934 containerd[2022]: time="2025-10-13T06:30:51.830844866Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"6787d88d7c3993dbff2fb65f3b674ba582feff581047b1eaae5cff443e46623c\" pid:7796 exited_at:{seconds:1760337051 nanos:830607051}" Oct 13 06:30:53.573057 containerd[2022]: time="2025-10-13T06:30:53.572939144Z" level=warning msg="container event discarded" container=c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac 
type=CONTAINER_CREATED_EVENT Oct 13 06:30:53.617452 containerd[2022]: time="2025-10-13T06:30:53.617391966Z" level=warning msg="container event discarded" container=c0c3b48ca0dd9c43227472392b48ba19b3b3ff58c2a3ce9686f37de810fb01ac type=CONTAINER_STARTED_EVENT Oct 13 06:30:53.815320 containerd[2022]: time="2025-10-13T06:30:53.815291560Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"22d19f1022f8619340dcfff3652c083f22a099364960587a5f3446ed48457eff\" pid:7826 exited_at:{seconds:1760337053 nanos:815179214}" Oct 13 06:30:55.203440 containerd[2022]: time="2025-10-13T06:30:55.203285662Z" level=warning msg="container event discarded" container=119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4 type=CONTAINER_CREATED_EVENT Oct 13 06:30:55.246998 containerd[2022]: time="2025-10-13T06:30:55.246941118Z" level=warning msg="container event discarded" container=119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4 type=CONTAINER_STARTED_EVENT Oct 13 06:30:56.109824 containerd[2022]: time="2025-10-13T06:30:56.109652318Z" level=warning msg="container event discarded" container=119449f5280c8ba694fb1958637f664b38ced541035198cb0f32b21731f4f9e4 type=CONTAINER_STOPPED_EVENT Oct 13 06:30:59.938628 containerd[2022]: time="2025-10-13T06:30:59.938317119Z" level=warning msg="container event discarded" container=2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5 type=CONTAINER_CREATED_EVENT Oct 13 06:31:00.027968 containerd[2022]: time="2025-10-13T06:31:00.027805601Z" level=warning msg="container event discarded" container=2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5 type=CONTAINER_STARTED_EVENT Oct 13 06:31:00.978040 containerd[2022]: time="2025-10-13T06:31:00.977881500Z" level=warning msg="container event discarded" container=2fa9abe48443f4b9693aa287caa564f1e7be5d6a020ec403af4bd734d0dc34b5 type=CONTAINER_STOPPED_EVENT Oct 13 06:31:06.878027 containerd[2022]: time="2025-10-13T06:31:06.877863302Z" level=warning msg="container event discarded" container=f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139 type=CONTAINER_CREATED_EVENT Oct 13 06:31:06.913528 containerd[2022]: time="2025-10-13T06:31:06.913374479Z" level=warning msg="container event discarded" container=f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139 type=CONTAINER_STARTED_EVENT Oct 13 06:31:08.180436 containerd[2022]: time="2025-10-13T06:31:08.180275069Z" level=warning msg="container event discarded" container=5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291 type=CONTAINER_CREATED_EVENT Oct 13 06:31:08.180436 containerd[2022]: time="2025-10-13T06:31:08.180377919Z" level=warning msg="container event discarded" container=5362a6fd2a15536114a19b3b1c48c80e42735e18fbdfe583140ed95027ffc291 type=CONTAINER_STARTED_EVENT Oct 13 06:31:09.756819 containerd[2022]: time="2025-10-13T06:31:09.756757122Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"99bf5bb01dc71698e1c532a8eaa77abdeaf395051c279ba0284c130261a9b71d\" pid:7871 exited_at:{seconds:1760337069 nanos:756455596}" Oct 13 06:31:09.897540 containerd[2022]: time="2025-10-13T06:31:09.897436536Z" level=warning msg="container event discarded" container=35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f type=CONTAINER_CREATED_EVENT Oct 13 06:31:09.945980 containerd[2022]: time="2025-10-13T06:31:09.945825433Z" 
level=warning msg="container event discarded" container=35c58363f6b565ed6d05b9a5cb530071b4371b4d4b78ebaebc262184d699d34f type=CONTAINER_STARTED_EVENT Oct 13 06:31:11.708154 containerd[2022]: time="2025-10-13T06:31:11.708008777Z" level=warning msg="container event discarded" container=b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc type=CONTAINER_CREATED_EVENT Oct 13 06:31:11.708154 containerd[2022]: time="2025-10-13T06:31:11.708098446Z" level=warning msg="container event discarded" container=b37f54ed34c6c60114f640e6d4ae96d2da297774b9d6595d84dd51f4538f0bdc type=CONTAINER_STARTED_EVENT Oct 13 06:31:11.708154 containerd[2022]: time="2025-10-13T06:31:11.708125165Z" level=warning msg="container event discarded" container=78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99 type=CONTAINER_CREATED_EVENT Oct 13 06:31:11.752665 containerd[2022]: time="2025-10-13T06:31:11.752500007Z" level=warning msg="container event discarded" container=78c01d202482c0b162d04ef45376f9b96947b2775ab013a15eb5865671950e99 type=CONTAINER_STARTED_EVENT Oct 13 06:31:12.214080 containerd[2022]: time="2025-10-13T06:31:12.213892789Z" level=warning msg="container event discarded" container=1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664 type=CONTAINER_CREATED_EVENT Oct 13 06:31:12.266553 containerd[2022]: time="2025-10-13T06:31:12.266394064Z" level=warning msg="container event discarded" container=1de3a6974b2fc39d625abb268ff3e90d4a12ca97b4e171a8e5e931467af4d664 type=CONTAINER_STARTED_EVENT Oct 13 06:31:12.755376 containerd[2022]: time="2025-10-13T06:31:12.755142055Z" level=warning msg="container event discarded" container=9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471 type=CONTAINER_CREATED_EVENT Oct 13 06:31:12.755376 containerd[2022]: time="2025-10-13T06:31:12.755307209Z" level=warning msg="container event discarded" container=9c40b2baaac0f4eca4ae05afd9a8c284fbdf75c34a11c7ae86a94304409c5471 type=CONTAINER_STARTED_EVENT Oct 13 06:31:13.733606 containerd[2022]: time="2025-10-13T06:31:13.733451816Z" level=warning msg="container event discarded" container=bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342 type=CONTAINER_CREATED_EVENT Oct 13 06:31:13.733606 containerd[2022]: time="2025-10-13T06:31:13.733548754Z" level=warning msg="container event discarded" container=bf5f3c3c958b094ceebc81c1b9cde7a6a6a2fd38352309c9c045a90478e0f342 type=CONTAINER_STARTED_EVENT Oct 13 06:31:13.859145 containerd[2022]: time="2025-10-13T06:31:13.858995600Z" level=warning msg="container event discarded" container=d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301 type=CONTAINER_CREATED_EVENT Oct 13 06:31:13.859145 containerd[2022]: time="2025-10-13T06:31:13.859086504Z" level=warning msg="container event discarded" container=d7680db081a095f074edbb08ca41b2456635bd3738308e423d36db80a725c301 type=CONTAINER_STARTED_EVENT Oct 13 06:31:14.700600 containerd[2022]: time="2025-10-13T06:31:14.700499861Z" level=warning msg="container event discarded" container=2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014 type=CONTAINER_CREATED_EVENT Oct 13 06:31:14.700600 containerd[2022]: time="2025-10-13T06:31:14.700584086Z" level=warning msg="container event discarded" container=2e962bf13a8d2f276c1ee9ebbc872a2c4065b9a1ca66d4d14518f43a9394b014 type=CONTAINER_STARTED_EVENT Oct 13 06:31:14.700600 containerd[2022]: time="2025-10-13T06:31:14.700609820Z" level=warning msg="container event discarded" 
container=aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1 type=CONTAINER_CREATED_EVENT Oct 13 06:31:14.737201 containerd[2022]: time="2025-10-13T06:31:14.737020965Z" level=warning msg="container event discarded" container=aa374fe792f03961ae6bcaffa372ca3918225dac1cef8514158a777769b46fb1 type=CONTAINER_STARTED_EVENT Oct 13 06:31:15.580260 containerd[2022]: time="2025-10-13T06:31:15.580182506Z" level=warning msg="container event discarded" container=fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018 type=CONTAINER_CREATED_EVENT Oct 13 06:31:15.629974 containerd[2022]: time="2025-10-13T06:31:15.629839606Z" level=warning msg="container event discarded" container=fcb45d8df585a7fac7736c8b9321cd53f4db52cd20452dc4fc877917554b5018 type=CONTAINER_STARTED_EVENT Oct 13 06:31:15.715543 containerd[2022]: time="2025-10-13T06:31:15.715370359Z" level=warning msg="container event discarded" container=a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272 type=CONTAINER_CREATED_EVENT Oct 13 06:31:15.715543 containerd[2022]: time="2025-10-13T06:31:15.715483784Z" level=warning msg="container event discarded" container=a7309ffbeae74d7f0ae84070410e4fb88c050da2f56a0e04196da9d9cd0be272 type=CONTAINER_STARTED_EVENT Oct 13 06:31:16.719971 containerd[2022]: time="2025-10-13T06:31:16.719834488Z" level=warning msg="container event discarded" container=e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07 type=CONTAINER_CREATED_EVENT Oct 13 06:31:16.719971 containerd[2022]: time="2025-10-13T06:31:16.719957400Z" level=warning msg="container event discarded" container=e79eadad2014dcd84bc08a9401496a82b97abcb4b79072f57a09c789c9a34e07 type=CONTAINER_STARTED_EVENT Oct 13 06:31:17.415795 containerd[2022]: time="2025-10-13T06:31:17.415625381Z" level=warning msg="container event discarded" container=dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae type=CONTAINER_CREATED_EVENT Oct 13 06:31:17.455195 containerd[2022]: time="2025-10-13T06:31:17.455080620Z" level=warning msg="container event discarded" container=dfde3d1b808c9d04957e73eaccc2a9cc5f47e9719d8a543608da346ebfb61aae type=CONTAINER_STARTED_EVENT Oct 13 06:31:17.855992 containerd[2022]: time="2025-10-13T06:31:17.855889924Z" level=warning msg="container event discarded" container=98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f type=CONTAINER_CREATED_EVENT Oct 13 06:31:17.903407 containerd[2022]: time="2025-10-13T06:31:17.903297244Z" level=warning msg="container event discarded" container=98ce0ccb4608e5fe5bf86a9a41b59da6917816633dd6aeac854dd7923601bf9f type=CONTAINER_STARTED_EVENT Oct 13 06:31:20.545333 containerd[2022]: time="2025-10-13T06:31:20.545155977Z" level=warning msg="container event discarded" container=1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847 type=CONTAINER_CREATED_EVENT Oct 13 06:31:20.604395 containerd[2022]: time="2025-10-13T06:31:20.604226073Z" level=warning msg="container event discarded" container=1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847 type=CONTAINER_STARTED_EVENT Oct 13 06:31:21.821348 containerd[2022]: time="2025-10-13T06:31:21.821306942Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"99389348c4bc8354780a164ee47598c1d6482741d35dcb0092ea229f3dfdc8dd\" pid:7909 exited_at:{seconds:1760337081 nanos:821100009}" Oct 13 06:31:23.352868 containerd[2022]: time="2025-10-13T06:31:23.352732746Z" level=warning msg="container event discarded" 
container=ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b type=CONTAINER_CREATED_EVENT Oct 13 06:31:23.401516 containerd[2022]: time="2025-10-13T06:31:23.401399813Z" level=warning msg="container event discarded" container=ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b type=CONTAINER_STARTED_EVENT Oct 13 06:31:23.864328 containerd[2022]: time="2025-10-13T06:31:23.864297801Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"6edddf8e5bf799d99450d5eb1b1529315c637411d8ce00d8e7f6c09ebd20f610\" pid:7940 exited_at:{seconds:1760337083 nanos:864128425}" Oct 13 06:31:25.147728 containerd[2022]: time="2025-10-13T06:31:25.147514348Z" level=warning msg="container event discarded" container=0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1 type=CONTAINER_CREATED_EVENT Oct 13 06:31:25.193347 containerd[2022]: time="2025-10-13T06:31:25.193156503Z" level=warning msg="container event discarded" container=0a452deb1e57221957585ab3dffe5cd1c7f1f0badfd87e3b0a1ea35c3e12e8b1 type=CONTAINER_STARTED_EVENT Oct 13 06:31:29.274248 containerd[2022]: time="2025-10-13T06:31:29.274217200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"c7e45a7226655314f392717c0f127cf9037d1a394e7c0ddd8c9541628a2f20c0\" pid:7968 exited_at:{seconds:1760337089 nanos:273975647}" Oct 13 06:31:37.210370 containerd[2022]: time="2025-10-13T06:31:37.210339857Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"64d510581e3ca301982503c786ca4c3ce9085193f3d9a26aeb64e6cc07598602\" pid:8004 exited_at:{seconds:1760337097 nanos:210216652}" Oct 13 06:31:39.744951 containerd[2022]: time="2025-10-13T06:31:39.744900835Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"c494fbecb80f3b7e843d14b879db3dee79c95eeba9a968b1e1eb5a9167508b2f\" pid:8025 exited_at:{seconds:1760337099 nanos:744710764}" Oct 13 06:31:51.879452 containerd[2022]: time="2025-10-13T06:31:51.879425082Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"8026f504b6da55a321cce706233794990c967da9b1119845748c0b6396e4297d\" pid:8062 exited_at:{seconds:1760337111 nanos:879221733}" Oct 13 06:31:53.853722 containerd[2022]: time="2025-10-13T06:31:53.853690446Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"983b24eb1ce724b33c3b23a32b10a79991d995c6ab1e59ca87bffa1b8a28aa33\" pid:8093 exited_at:{seconds:1760337113 nanos:853529824}" Oct 13 06:32:09.800537 containerd[2022]: time="2025-10-13T06:32:09.800501278Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"af6fa010c77e593128e8c88deb21fdc9faeb8ad3a5028898e118500841abfab9\" pid:8116 exited_at:{seconds:1760337129 nanos:800300526}" Oct 13 06:32:21.822020 containerd[2022]: time="2025-10-13T06:32:21.821991613Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"9583cc75980985f118ebabe82668f07fb27170d2e7f2850bbe3f4a607ae7239a\" pid:8153 exited_at:{seconds:1760337141 nanos:821746233}" Oct 13 
06:32:23.870344 containerd[2022]: time="2025-10-13T06:32:23.870317733Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"fc4d2c3485255b35134e40dbf938d8789bf12f320ce912fc886eedf1bf1e707c\" pid:8186 exited_at:{seconds:1760337143 nanos:870175638}" Oct 13 06:32:29.280565 containerd[2022]: time="2025-10-13T06:32:29.280513061Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"98f5ab69e087c4eaa65373fc87d9bc801a4ce8eacff97d5bd7a6ccb141d3fa45\" pid:8215 exited_at:{seconds:1760337149 nanos:280267947}" Oct 13 06:32:30.618191 systemd[1]: Started sshd@12-139.178.94.13:22-193.46.255.7:25334.service - OpenSSH per-connection server daemon (193.46.255.7:25334). Oct 13 06:32:31.916082 sshd-session[8238]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root Oct 13 06:32:34.535791 sshd[8235]: PAM: Permission denied for root from 193.46.255.7 Oct 13 06:32:34.882088 sshd-session[8241]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root Oct 13 06:32:36.247574 sshd[8235]: PAM: Permission denied for root from 193.46.255.7 Oct 13 06:32:36.594048 sshd-session[8259]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root Oct 13 06:32:37.121868 systemd[1]: Started sshd@13-139.178.94.13:22-139.178.68.195:55554.service - OpenSSH per-connection server daemon (139.178.68.195:55554). Oct 13 06:32:37.191397 sshd[8261]: Accepted publickey for core from 139.178.68.195 port 55554 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE Oct 13 06:32:37.192304 sshd-session[8261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:32:37.195422 systemd-logind[2006]: New session 12 of user core. Oct 13 06:32:37.196450 systemd[1]: Started session-12.scope - Session 12 of User core. Oct 13 06:32:37.204277 containerd[2022]: time="2025-10-13T06:32:37.204253889Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"965134883032406b995e1c32be6177b709c928c8eb37267f1841a53eabcb5f9b\" pid:8279 exited_at:{seconds:1760337157 nanos:204128076}" Oct 13 06:32:37.289477 sshd[8285]: Connection closed by 139.178.68.195 port 55554 Oct 13 06:32:37.289673 sshd-session[8261]: pam_unix(sshd:session): session closed for user core Oct 13 06:32:37.291697 systemd[1]: sshd@13-139.178.94.13:22-139.178.68.195:55554.service: Deactivated successfully. Oct 13 06:32:37.292833 systemd[1]: session-12.scope: Deactivated successfully. Oct 13 06:32:37.293945 systemd-logind[2006]: Session 12 logged out. Waiting for processes to exit. Oct 13 06:32:37.294611 systemd-logind[2006]: Removed session 12. Oct 13 06:32:38.566765 sshd[8235]: PAM: Permission denied for root from 193.46.255.7 Oct 13 06:32:38.739286 sshd[8235]: Received disconnect from 193.46.255.7 port 25334:11: [preauth] Oct 13 06:32:38.739286 sshd[8235]: Disconnected from authenticating user root 193.46.255.7 port 25334 [preauth] Oct 13 06:32:38.743752 systemd[1]: sshd@12-139.178.94.13:22-193.46.255.7:25334.service: Deactivated successfully. Oct 13 06:32:38.932565 systemd[1]: Started sshd@14-139.178.94.13:22-193.46.255.7:10214.service - OpenSSH per-connection server daemon (193.46.255.7:10214). 
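Interleaved with the key-based core sessions above, sshd logs a string of failed root logins from 193.46.255.7: pam_unix authentication failures followed by "PAM: Permission denied for root" and preauth disconnects. A minimal sketch, assuming sshd journal lines in exactly this format, that tallies failed root attempts per source address:

```go
// sshdfail.go - a sketch (not part of the logged host) that counts failed
// root-login attempts per source address from sshd journal lines on stdin,
// matching the format seen above:
//   pam_unix(sshd:auth): authentication failure; ... rhost=193.46.255.7 user=root
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	re := regexp.MustCompile(`pam_unix\(sshd:auth\): authentication failure;.*rhost=(\S+) user=root`)

	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}

	// Highest-volume sources first.
	hosts := make([]string, 0, len(counts))
	for h := range counts {
		hosts = append(hosts, h)
	}
	sort.Slice(hosts, func(i, j int) bool { return counts[hosts[i]] > counts[hosts[j]] })
	for _, h := range hosts {
		fmt.Printf("%6d failed root logins from %s\n", counts[h], h)
	}
}
```

On a host like this the usual follow-up would be confirming that PermitRootLogin is disabled in sshd_config, though that configuration sits outside what the log shows.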
Oct 13 06:32:39.761443 containerd[2022]: time="2025-10-13T06:32:39.761407241Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"2916427bf2f6c3df66d2acc57d285cb0829e13dc83a990ae8692568213dc7a56\" pid:8337 exited_at:{seconds:1760337159 nanos:761050939}"
Oct 13 06:32:40.208593 sshd-session[8360]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root
Oct 13 06:32:42.313314 systemd[1]: Started sshd@15-139.178.94.13:22-139.178.68.195:55566.service - OpenSSH per-connection server daemon (139.178.68.195:55566).
Oct 13 06:32:42.348211 sshd[8362]: Accepted publickey for core from 139.178.68.195 port 55566 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:32:42.349057 sshd-session[8362]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:32:42.351838 systemd-logind[2006]: New session 13 of user core.
Oct 13 06:32:42.366464 systemd[1]: Started session-13.scope - Session 13 of User core.
Oct 13 06:32:42.396397 sshd[8322]: PAM: Permission denied for root from 193.46.255.7
Oct 13 06:32:42.464413 sshd[8365]: Connection closed by 139.178.68.195 port 55566
Oct 13 06:32:42.464636 sshd-session[8362]: pam_unix(sshd:session): session closed for user core
Oct 13 06:32:42.466426 systemd[1]: sshd@15-139.178.94.13:22-139.178.68.195:55566.service: Deactivated successfully.
Oct 13 06:32:42.467494 systemd[1]: session-13.scope: Deactivated successfully.
Oct 13 06:32:42.468160 systemd-logind[2006]: Session 13 logged out. Waiting for processes to exit.
Oct 13 06:32:42.468705 systemd-logind[2006]: Removed session 13.
Oct 13 06:32:42.742922 sshd-session[8390]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root
Oct 13 06:32:44.871181 sshd[8322]: PAM: Permission denied for root from 193.46.255.7
Oct 13 06:32:45.215760 sshd-session[8393]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root
Oct 13 06:32:47.423588 sshd[8322]: PAM: Permission denied for root from 193.46.255.7
Oct 13 06:32:47.486145 systemd[1]: Started sshd@16-139.178.94.13:22-139.178.68.195:35186.service - OpenSSH per-connection server daemon (139.178.68.195:35186).
Oct 13 06:32:47.571412 sshd[8395]: Accepted publickey for core from 139.178.68.195 port 35186 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:32:47.572012 sshd-session[8395]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:32:47.574922 systemd-logind[2006]: New session 14 of user core.
Oct 13 06:32:47.585527 systemd[1]: Started session-14.scope - Session 14 of User core.
Oct 13 06:32:47.596341 sshd[8322]: Received disconnect from 193.46.255.7 port 10214:11: [preauth]
Oct 13 06:32:47.596341 sshd[8322]: Disconnected from authenticating user root 193.46.255.7 port 10214 [preauth]
Oct 13 06:32:47.597334 systemd[1]: sshd@14-139.178.94.13:22-193.46.255.7:10214.service: Deactivated successfully.
Oct 13 06:32:47.670404 sshd[8398]: Connection closed by 139.178.68.195 port 35186
Oct 13 06:32:47.670664 sshd-session[8395]: pam_unix(sshd:session): session closed for user core
Oct 13 06:32:47.702347 systemd[1]: sshd@16-139.178.94.13:22-139.178.68.195:35186.service: Deactivated successfully.
Oct 13 06:32:47.706658 systemd[1]: session-14.scope: Deactivated successfully.
Oct 13 06:32:47.708930 systemd-logind[2006]: Session 14 logged out. Waiting for processes to exit.
Oct 13 06:32:47.712837 systemd-logind[2006]: Removed session 14.
Oct 13 06:32:47.715940 systemd[1]: Started sshd@17-139.178.94.13:22-139.178.68.195:35192.service - OpenSSH per-connection server daemon (139.178.68.195:35192).
Oct 13 06:32:47.789314 systemd[1]: Started sshd@18-139.178.94.13:22-193.46.255.7:10226.service - OpenSSH per-connection server daemon (193.46.255.7:10226).
Oct 13 06:32:47.806845 sshd[8426]: Accepted publickey for core from 139.178.68.195 port 35192 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:32:47.808174 sshd-session[8426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:32:47.813520 systemd-logind[2006]: New session 15 of user core.
Oct 13 06:32:47.830530 systemd[1]: Started session-15.scope - Session 15 of User core.
Oct 13 06:32:48.011614 sshd[8435]: Connection closed by 139.178.68.195 port 35192
Oct 13 06:32:48.011848 sshd-session[8426]: pam_unix(sshd:session): session closed for user core
Oct 13 06:32:48.022887 systemd[1]: sshd@17-139.178.94.13:22-139.178.68.195:35192.service: Deactivated successfully.
Oct 13 06:32:48.024713 systemd[1]: session-15.scope: Deactivated successfully.
Oct 13 06:32:48.025510 systemd-logind[2006]: Session 15 logged out. Waiting for processes to exit.
Oct 13 06:32:48.027734 systemd[1]: Started sshd@19-139.178.94.13:22-139.178.68.195:35198.service - OpenSSH per-connection server daemon (139.178.68.195:35198).
Oct 13 06:32:48.028362 systemd-logind[2006]: Removed session 15.
Oct 13 06:32:48.112211 sshd[8458]: Accepted publickey for core from 139.178.68.195 port 35198 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:32:48.113601 sshd-session[8458]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:32:48.119281 systemd-logind[2006]: New session 16 of user core.
Oct 13 06:32:48.132513 systemd[1]: Started session-16.scope - Session 16 of User core.
Oct 13 06:32:48.268760 sshd[8463]: Connection closed by 139.178.68.195 port 35198
Oct 13 06:32:48.268899 sshd-session[8458]: pam_unix(sshd:session): session closed for user core
Oct 13 06:32:48.270727 systemd[1]: sshd@19-139.178.94.13:22-139.178.68.195:35198.service: Deactivated successfully.
Oct 13 06:32:48.271690 systemd[1]: session-16.scope: Deactivated successfully.
Oct 13 06:32:48.272332 systemd-logind[2006]: Session 16 logged out. Waiting for processes to exit.
Oct 13 06:32:48.273013 systemd-logind[2006]: Removed session 16.
Oct 13 06:32:49.057110 sshd-session[8487]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root
Oct 13 06:32:51.009824 sshd[8432]: PAM: Permission denied for root from 193.46.255.7
Oct 13 06:32:51.355754 sshd-session[8488]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root
Oct 13 06:32:51.840354 containerd[2022]: time="2025-10-13T06:32:51.840290264Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1ed2d24274e79278562442f25d6eb7a92e0eb48019c2224f2a8f1c2b781f1847\" id:\"036284aac5d1af8e6959814fa1c52fa9bfe18874a1de0a1b46f23f977548ea81\" pid:8501 exited_at:{seconds:1760337171 nanos:840088964}"
Oct 13 06:32:53.052698 sshd[8432]: PAM: Permission denied for root from 193.46.255.7
Oct 13 06:32:53.285252 systemd[1]: Started sshd@20-139.178.94.13:22-139.178.68.195:35212.service - OpenSSH per-connection server daemon (139.178.68.195:35212).
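Two remote hosts dominate the sshd entries above: 139.178.68.195, which repeatedly authenticates as core with a public key, and 193.46.255.7, which keeps failing PAM password authentication as root over connections such as sshd@14 and sshd@18. A minimal Python sketch for tallying such failures from a plain-text journal dump follows; the file name journal.log is a hypothetical placeholder, and the pam_unix line format is assumed to be exactly as printed above.

#!/usr/bin/env python3
# Minimal sketch: tally sshd/pam_unix authentication failures per source host
# from plain-text journal output like the lines above.
# NOTE: "journal.log" is a hypothetical input path, not named anywhere in this log.
import re
from collections import Counter

# Matches lines such as:
#   pam_unix(sshd:auth): authentication failure; ... rhost=193.46.255.7 user=root
FAILURE_RE = re.compile(
    r"pam_unix\(sshd:auth\): authentication failure;"
    r".*?rhost=(?P<rhost>\S+)(?:\s+user=(?P<user>\S+))?"
)

def count_failures(path: str) -> Counter:
    counts: Counter = Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = FAILURE_RE.search(line)
            if m:
                counts[(m.group("rhost"), m.group("user") or "<unknown>")] += 1
    return counts

if __name__ == "__main__":
    for (rhost, user), n in count_failures("journal.log").most_common():
        print(f"{n:4d} failed attempts for user {user!r} from {rhost}")

Run against the entries above, this would report several failures for user root from 193.46.255.7 and none from 139.178.68.195, which only ever appears with accepted public-key logins.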
Oct 13 06:32:53.322369 sshd[8523]: Accepted publickey for core from 139.178.68.195 port 35212 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:32:53.325641 sshd-session[8523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:32:53.338733 systemd-logind[2006]: New session 17 of user core.
Oct 13 06:32:53.355348 systemd[1]: Started session-17.scope - Session 17 of User core.
Oct 13 06:32:53.398526 sshd-session[8521]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7 user=root
Oct 13 06:32:53.499846 sshd[8529]: Connection closed by 139.178.68.195 port 35212
Oct 13 06:32:53.500060 sshd-session[8523]: pam_unix(sshd:session): session closed for user core
Oct 13 06:32:53.502388 systemd[1]: sshd@20-139.178.94.13:22-139.178.68.195:35212.service: Deactivated successfully.
Oct 13 06:32:53.503301 systemd[1]: session-17.scope: Deactivated successfully.
Oct 13 06:32:53.503773 systemd-logind[2006]: Session 17 logged out. Waiting for processes to exit.
Oct 13 06:32:53.504250 systemd-logind[2006]: Removed session 17.
Oct 13 06:32:53.827081 containerd[2022]: time="2025-10-13T06:32:53.827019718Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffe4ae523c092acefde2738f355bc4642f36f96f0c59338aba5ca3887021365b\" id:\"642da60f87c04656423851a65df37f147806c2b22d8a2b9f0c616d2e80f1f426\" pid:8566 exited_at:{seconds:1760337173 nanos:826870945}"
Oct 13 06:32:55.370601 sshd[8432]: PAM: Permission denied for root from 193.46.255.7
Oct 13 06:32:55.543117 sshd[8432]: Received disconnect from 193.46.255.7 port 10226:11: [preauth]
Oct 13 06:32:55.543117 sshd[8432]: Disconnected from authenticating user root 193.46.255.7 port 10226 [preauth]
Oct 13 06:32:55.545617 systemd[1]: sshd@18-139.178.94.13:22-193.46.255.7:10226.service: Deactivated successfully.
Oct 13 06:32:58.520329 systemd[1]: Started sshd@21-139.178.94.13:22-139.178.68.195:41294.service - OpenSSH per-connection server daemon (139.178.68.195:41294).
Oct 13 06:32:58.562411 sshd[8579]: Accepted publickey for core from 139.178.68.195 port 41294 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:32:58.563294 sshd-session[8579]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:32:58.567221 systemd-logind[2006]: New session 18 of user core.
Oct 13 06:32:58.582715 systemd[1]: Started session-18.scope - Session 18 of User core.
Oct 13 06:32:58.733495 sshd[8582]: Connection closed by 139.178.68.195 port 41294
Oct 13 06:32:58.733707 sshd-session[8579]: pam_unix(sshd:session): session closed for user core
Oct 13 06:32:58.735813 systemd[1]: sshd@21-139.178.94.13:22-139.178.68.195:41294.service: Deactivated successfully.
Oct 13 06:32:58.736712 systemd[1]: session-18.scope: Deactivated successfully.
Oct 13 06:32:58.737149 systemd-logind[2006]: Session 18 logged out. Waiting for processes to exit.
Oct 13 06:32:58.737843 systemd-logind[2006]: Removed session 18.
Oct 13 06:33:03.763227 systemd[1]: Started sshd@22-139.178.94.13:22-139.178.68.195:41306.service - OpenSSH per-connection server daemon (139.178.68.195:41306).
Oct 13 06:33:03.834338 sshd[8607]: Accepted publickey for core from 139.178.68.195 port 41306 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:33:03.835459 sshd-session[8607]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:33:03.840220 systemd-logind[2006]: New session 19 of user core.
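Interleaved with the SSH traffic, containerd[2022] keeps emitting TaskExit events from its pod sandbox handler, each carrying a container_id, a task id, a pid, and an exited_at timestamp in seconds plus nanoseconds. The sketch below is one way these fields could be pulled out of the journal text as printed above; it assumes the backslash-escaped quoting shown in these lines and is not part of any containerd API.

# Minimal sketch: extract the fields from the containerd "TaskExit" journal
# entries above. Assumes the text appears exactly as printed, with \" escapes.
import re
from datetime import datetime, timezone

TASKEXIT_RE = re.compile(
    r'TaskExit event in podsandbox handler '
    r'container_id:\\"(?P<container_id>[0-9a-f]+)\\" '
    r'id:\\"(?P<task_id>[0-9a-f]+)\\" '
    r'pid:(?P<pid>\d+) '
    r'exited_at:\{seconds:(?P<secs>\d+) nanos:(?P<nanos>\d+)\}'
)

def parse_taskexit(line: str):
    """Return a dict of TaskExit fields, or None if the line is not one."""
    m = TASKEXIT_RE.search(line)
    if not m:
        return None
    exited = datetime.fromtimestamp(int(m.group("secs")), tz=timezone.utc)
    return {
        "container_id": m.group("container_id"),
        "task_id": m.group("task_id"),
        "pid": int(m.group("pid")),
        # keep sub-second precision down to microseconds
        "exited_at": exited.replace(microsecond=int(m.group("nanos")) // 1000),
    }

For example, the 06:32:53 entry above yields pid 8566 and an exited_at of 2025-10-13T06:32:53.826870 UTC, matching the time= prefix that containerd prints on the same line.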
Oct 13 06:33:03.861602 systemd[1]: Started session-19.scope - Session 19 of User core.
Oct 13 06:33:03.979620 sshd[8610]: Connection closed by 139.178.68.195 port 41306
Oct 13 06:33:03.979833 sshd-session[8607]: pam_unix(sshd:session): session closed for user core
Oct 13 06:33:04.003470 systemd[1]: sshd@22-139.178.94.13:22-139.178.68.195:41306.service: Deactivated successfully.
Oct 13 06:33:04.004505 systemd[1]: session-19.scope: Deactivated successfully.
Oct 13 06:33:04.004966 systemd-logind[2006]: Session 19 logged out. Waiting for processes to exit.
Oct 13 06:33:04.006420 systemd[1]: Started sshd@23-139.178.94.13:22-139.178.68.195:41312.service - OpenSSH per-connection server daemon (139.178.68.195:41312).
Oct 13 06:33:04.006853 systemd-logind[2006]: Removed session 19.
Oct 13 06:33:04.046040 sshd[8635]: Accepted publickey for core from 139.178.68.195 port 41312 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:33:04.046930 sshd-session[8635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:33:04.051251 systemd-logind[2006]: New session 20 of user core.
Oct 13 06:33:04.070589 systemd[1]: Started session-20.scope - Session 20 of User core.
Oct 13 06:33:04.218930 sshd[8638]: Connection closed by 139.178.68.195 port 41312
Oct 13 06:33:04.219117 sshd-session[8635]: pam_unix(sshd:session): session closed for user core
Oct 13 06:33:04.243976 systemd[1]: sshd@23-139.178.94.13:22-139.178.68.195:41312.service: Deactivated successfully.
Oct 13 06:33:04.248299 systemd[1]: session-20.scope: Deactivated successfully.
Oct 13 06:33:04.250505 systemd-logind[2006]: Session 20 logged out. Waiting for processes to exit.
Oct 13 06:33:04.256468 systemd[1]: Started sshd@24-139.178.94.13:22-139.178.68.195:41324.service - OpenSSH per-connection server daemon (139.178.68.195:41324).
Oct 13 06:33:04.258331 systemd-logind[2006]: Removed session 20.
Oct 13 06:33:04.338480 sshd[8661]: Accepted publickey for core from 139.178.68.195 port 41324 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:33:04.341577 sshd-session[8661]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:33:04.354998 systemd-logind[2006]: New session 21 of user core.
Oct 13 06:33:04.371715 systemd[1]: Started session-21.scope - Session 21 of User core.
Oct 13 06:33:04.972703 sshd[8666]: Connection closed by 139.178.68.195 port 41324
Oct 13 06:33:04.972960 sshd-session[8661]: pam_unix(sshd:session): session closed for user core
Oct 13 06:33:04.982497 systemd[1]: sshd@24-139.178.94.13:22-139.178.68.195:41324.service: Deactivated successfully.
Oct 13 06:33:04.983886 systemd[1]: session-21.scope: Deactivated successfully.
Oct 13 06:33:04.984408 systemd-logind[2006]: Session 21 logged out. Waiting for processes to exit.
Oct 13 06:33:04.985825 systemd[1]: Started sshd@25-139.178.94.13:22-139.178.68.195:41336.service - OpenSSH per-connection server daemon (139.178.68.195:41336).
Oct 13 06:33:04.986163 systemd-logind[2006]: Removed session 21.
Oct 13 06:33:05.020775 sshd[8694]: Accepted publickey for core from 139.178.68.195 port 41336 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:33:05.021492 sshd-session[8694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:33:05.024206 systemd-logind[2006]: New session 22 of user core.
Oct 13 06:33:05.035416 systemd[1]: Started session-22.scope - Session 22 of User core.
Oct 13 06:33:05.225490 sshd[8698]: Connection closed by 139.178.68.195 port 41336
Oct 13 06:33:05.225741 sshd-session[8694]: pam_unix(sshd:session): session closed for user core
Oct 13 06:33:05.253684 systemd[1]: sshd@25-139.178.94.13:22-139.178.68.195:41336.service: Deactivated successfully.
Oct 13 06:33:05.258134 systemd[1]: session-22.scope: Deactivated successfully.
Oct 13 06:33:05.260558 systemd-logind[2006]: Session 22 logged out. Waiting for processes to exit.
Oct 13 06:33:05.267583 systemd[1]: Started sshd@26-139.178.94.13:22-139.178.68.195:41340.service - OpenSSH per-connection server daemon (139.178.68.195:41340).
Oct 13 06:33:05.269862 systemd-logind[2006]: Removed session 22.
Oct 13 06:33:05.341263 sshd[8721]: Accepted publickey for core from 139.178.68.195 port 41340 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:33:05.344946 sshd-session[8721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:33:05.357806 systemd-logind[2006]: New session 23 of user core.
Oct 13 06:33:05.377672 systemd[1]: Started session-23.scope - Session 23 of User core.
Oct 13 06:33:05.506293 sshd[8725]: Connection closed by 139.178.68.195 port 41340
Oct 13 06:33:05.506429 sshd-session[8721]: pam_unix(sshd:session): session closed for user core
Oct 13 06:33:05.508327 systemd[1]: sshd@26-139.178.94.13:22-139.178.68.195:41340.service: Deactivated successfully.
Oct 13 06:33:05.509329 systemd[1]: session-23.scope: Deactivated successfully.
Oct 13 06:33:05.509796 systemd-logind[2006]: Session 23 logged out. Waiting for processes to exit.
Oct 13 06:33:05.510389 systemd-logind[2006]: Removed session 23.
Oct 13 06:33:09.745798 containerd[2022]: time="2025-10-13T06:33:09.745753281Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f374d706f8d884fc47f24da590456e58972711e66ee7d0cbf9992b258e19d139\" id:\"072287e7cdbf58cf2df4a121f3db2044498d63a933878c75ce4aa113be4bad41\" pid:8764 exited_at:{seconds:1760337189 nanos:745492853}"
Oct 13 06:33:10.534103 systemd[1]: Started sshd@27-139.178.94.13:22-139.178.68.195:59096.service - OpenSSH per-connection server daemon (139.178.68.195:59096).
Oct 13 06:33:10.587400 sshd[8789]: Accepted publickey for core from 139.178.68.195 port 59096 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:33:10.590667 sshd-session[8789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:33:10.596190 systemd-logind[2006]: New session 24 of user core.
Oct 13 06:33:10.609463 systemd[1]: Started session-24.scope - Session 24 of User core.
Oct 13 06:33:10.769416 sshd[8792]: Connection closed by 139.178.68.195 port 59096
Oct 13 06:33:10.769598 sshd-session[8789]: pam_unix(sshd:session): session closed for user core
Oct 13 06:33:10.771665 systemd[1]: sshd@27-139.178.94.13:22-139.178.68.195:59096.service: Deactivated successfully.
Oct 13 06:33:10.772762 systemd[1]: session-24.scope: Deactivated successfully.
Oct 13 06:33:10.773829 systemd-logind[2006]: Session 24 logged out. Waiting for processes to exit.
Oct 13 06:33:10.774481 systemd-logind[2006]: Removed session 24.
Oct 13 06:33:15.781948 systemd[1]: Started sshd@28-139.178.94.13:22-139.178.68.195:59106.service - OpenSSH per-connection server daemon (139.178.68.195:59106).
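Each Accepted publickey line above identifies the core user's key only by its fingerprint, SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE. OpenSSH derives that string by SHA-256 hashing the raw key blob and base64-encoding the digest with the trailing padding stripped; the hypothetical helper below reproduces the format for a key given in authorized_keys form (the example input is a placeholder, not the key from this log).

# Minimal sketch: reproduce the "SHA256:..." fingerprint format that sshd logs
# in its "Accepted publickey" lines, from a key in authorized_keys format.
import base64
import hashlib

def ssh_sha256_fingerprint(pubkey_line: str) -> str:
    # authorized_keys format: "<type> <base64-blob> [comment]"
    blob_b64 = pubkey_line.split()[1]
    digest = hashlib.sha256(base64.b64decode(blob_b64)).digest()
    # OpenSSH prints the digest base64-encoded with '=' padding removed
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

# Example with a placeholder key string (not the key from this journal):
#   ssh_sha256_fingerprint("ssh-rsa AAAAB3NzaC1yc2E... user@host")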
Oct 13 06:33:15.828446 sshd[8820]: Accepted publickey for core from 139.178.68.195 port 59106 ssh2: RSA SHA256:29xnVtTnqplzA8GhQ5YZlABloObaj56nMNifpGcXPTE
Oct 13 06:33:15.829689 sshd-session[8820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 13 06:33:15.834121 systemd-logind[2006]: New session 25 of user core.
Oct 13 06:33:15.850415 systemd[1]: Started session-25.scope - Session 25 of User core.
Oct 13 06:33:15.961314 sshd[8823]: Connection closed by 139.178.68.195 port 59106
Oct 13 06:33:15.961480 sshd-session[8820]: pam_unix(sshd:session): session closed for user core
Oct 13 06:33:15.963397 systemd[1]: sshd@28-139.178.94.13:22-139.178.68.195:59106.service: Deactivated successfully.
Oct 13 06:33:15.964399 systemd[1]: session-25.scope: Deactivated successfully.
Oct 13 06:33:15.965119 systemd-logind[2006]: Session 25 logged out. Waiting for processes to exit.
Oct 13 06:33:15.965895 systemd-logind[2006]: Removed session 25.
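Every interactive login from 139.178.68.195 in this section follows the same lifecycle: a per-connection sshd@….service is started, pam_unix opens the session, systemd-logind allocates session N and systemd starts session-N.scope, and on disconnect the service and scope are deactivated and the session is logged out and removed. A rough way to pair the logind "New session" and "Removed session" entries and estimate session lifetimes is sketched below; the year 2025 is assumed because the syslog-style timestamp omits it, and the input is any iterable of journal lines.

# Minimal sketch: pair systemd-logind "New session N" / "Removed session N"
# entries from journal text like the lines above and compute their durations.
# ASSUMPTION: the timestamps carry no year, so 2025 is hard-coded here.
import re
from datetime import datetime

TS_RE = re.compile(r"^(?P<ts>\w{3} +\d+ \d{2}:\d{2}:\d{2}\.\d+)")
NEW_RE = re.compile(r"New session (?P<sid>\d+) of user (?P<user>\S+)\.")
REMOVED_RE = re.compile(r"Removed session (?P<sid>\d+)\.")

def parse_ts(line: str) -> datetime:
    ts = TS_RE.match(line).group("ts")
    return datetime.strptime(f"2025 {ts}", "%Y %b %d %H:%M:%S.%f")

def session_durations(lines):
    """Yield (session id, lifetime as timedelta) for each completed session."""
    opened = {}
    for line in lines:
        if (m := NEW_RE.search(line)):
            opened[m.group("sid")] = parse_ts(line)
        elif (m := REMOVED_RE.search(line)) and m.group("sid") in opened:
            yield m.group("sid"), parse_ts(line) - opened.pop(m.group("sid"))

Applied to the entries above, the sessions are short-lived: session 25, for instance, is created at 06:33:15.834121 and removed at 06:33:15.965895, roughly 130 ms later, consistent with non-interactive command execution over SSH rather than a login shell.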