Sep 12 17:50:04.886395 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 15:34:39 -00 2025
Sep 12 17:50:04.886411 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:50:04.886418 kernel: BIOS-provided physical RAM map:
Sep 12 17:50:04.886422 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000008f7ff] usable
Sep 12 17:50:04.886426 kernel: BIOS-e820: [mem 0x000000000008f800-0x000000000009ffff] reserved
Sep 12 17:50:04.886430 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Sep 12 17:50:04.886435 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Sep 12 17:50:04.886440 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Sep 12 17:50:04.886444 kernel: BIOS-e820: [mem 0x0000000040400000-0x000000006de3afff] usable
Sep 12 17:50:04.886448 kernel: BIOS-e820: [mem 0x000000006de3b000-0x000000006de3bfff] ACPI NVS
Sep 12 17:50:04.886452 kernel: BIOS-e820: [mem 0x000000006de3c000-0x000000006de3cfff] reserved
Sep 12 17:50:04.886456 kernel: BIOS-e820: [mem 0x000000006de3d000-0x0000000077fc4fff] usable
Sep 12 17:50:04.886460 kernel: BIOS-e820: [mem 0x0000000077fc5000-0x00000000790a7fff] reserved
Sep 12 17:50:04.886464 kernel: BIOS-e820: [mem 0x00000000790a8000-0x0000000079230fff] usable
Sep 12 17:50:04.886471 kernel: BIOS-e820: [mem 0x0000000079231000-0x0000000079662fff] ACPI NVS
Sep 12 17:50:04.886475 kernel: BIOS-e820: [mem 0x0000000079663000-0x000000007befefff] reserved
Sep 12 17:50:04.886480 kernel: BIOS-e820: [mem 0x000000007beff000-0x000000007befffff] usable
Sep 12 17:50:04.886485 kernel: BIOS-e820: [mem 0x000000007bf00000-0x000000007f7fffff] reserved
Sep 12 17:50:04.886489 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Sep 12 17:50:04.886494 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Sep 12 17:50:04.886498 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Sep 12 17:50:04.886503 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Sep 12 17:50:04.886507 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Sep 12 17:50:04.886513 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000087f7fffff] usable
Sep 12 17:50:04.886517 kernel: NX (Execute Disable) protection: active
Sep 12 17:50:04.886522 kernel: APIC: Static calls initialized
Sep 12 17:50:04.886526 kernel: SMBIOS 3.2.1 present.
Sep 12 17:50:04.886531 kernel: DMI: Supermicro PIO-519C-MR-PH004/X11SCH-F, BIOS 1.5 11/17/2020
Sep 12 17:50:04.886536 kernel: DMI: Memory slots populated: 2/4
Sep 12 17:50:04.886540 kernel: tsc: Detected 3400.000 MHz processor
Sep 12 17:50:04.886545 kernel: tsc: Detected 3399.906 MHz TSC
Sep 12 17:50:04.886549 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 17:50:04.886554 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 17:50:04.886559 kernel: last_pfn = 0x87f800 max_arch_pfn = 0x400000000
Sep 12 17:50:04.886565 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs
Sep 12 17:50:04.886570 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 17:50:04.886574 kernel: last_pfn = 0x7bf00 max_arch_pfn = 0x400000000
Sep 12 17:50:04.886579 kernel: Using GB pages for direct mapping
Sep 12 17:50:04.886584 kernel: ACPI: Early table checksum verification disabled
Sep 12 17:50:04.886589 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Sep 12 17:50:04.886595 kernel: ACPI: XSDT 0x00000000795440C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Sep 12 17:50:04.886601 kernel: ACPI: FACP 0x0000000079580620 000114 (v06 01072009 AMI 00010013)
Sep 12 17:50:04.886606 kernel: ACPI: DSDT 0x0000000079544268 03C3B7 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Sep 12 17:50:04.886611 kernel: ACPI: FACS 0x0000000079662F80 000040
Sep 12 17:50:04.886616 kernel: ACPI: APIC 0x0000000079580738 00012C (v04 01072009 AMI 00010013)
Sep 12 17:50:04.886621 kernel: ACPI: FPDT 0x0000000079580868 000044 (v01 01072009 AMI 00010013)
Sep 12 17:50:04.886626 kernel: ACPI: FIDT 0x00000000795808B0 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Sep 12 17:50:04.886631 kernel: ACPI: MCFG 0x0000000079580950 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Sep 12 17:50:04.886637 kernel: ACPI: SPMI 0x0000000079580990 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Sep 12 17:50:04.886642 kernel: ACPI: SSDT 0x00000000795809D8 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Sep 12 17:50:04.886647 kernel: ACPI: SSDT 0x00000000795824F8 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Sep 12 17:50:04.886652 kernel: ACPI: SSDT 0x00000000795856C0 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Sep 12 17:50:04.886657 kernel: ACPI: HPET 0x00000000795879F0 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Sep 12 17:50:04.886662 kernel: ACPI: SSDT 0x0000000079587A28 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Sep 12 17:50:04.886666 kernel: ACPI: SSDT 0x00000000795889D8 0008F7 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Sep 12 17:50:04.886671 kernel: ACPI: UEFI 0x00000000795892D0 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Sep 12 17:50:04.886677 kernel: ACPI: LPIT 0x0000000079589318 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Sep 12 17:50:04.886682 kernel: ACPI: SSDT 0x00000000795893B0 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Sep 12 17:50:04.886687 kernel: ACPI: SSDT 0x000000007958BB90 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Sep 12 17:50:04.886692 kernel: ACPI: DBGP 0x000000007958D078 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Sep 12 17:50:04.886697 kernel: ACPI: DBG2 0x000000007958D0B0 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Sep 12 17:50:04.886702 kernel: ACPI: SSDT 0x000000007958D108 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Sep 12 17:50:04.886707 kernel: ACPI: DMAR 0x000000007958EC70 0000A8 (v01 INTEL EDK2 00000002 01000013)
Sep 12 17:50:04.886712 kernel: ACPI: SSDT 0x000000007958ED18 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Sep 12 17:50:04.886717 kernel: ACPI: TPM2 0x000000007958EE60 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Sep 12 17:50:04.886723 kernel: ACPI: SSDT 0x000000007958EE98 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Sep 12 17:50:04.886728 kernel: ACPI: WSMT 0x000000007958FC28 000028 (v01 \xddm 01072009 AMI 00010013)
Sep 12 17:50:04.886733 kernel: ACPI: EINJ 0x000000007958FC50 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Sep 12 17:50:04.886738 kernel: ACPI: ERST 0x000000007958FD80 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Sep 12 17:50:04.886743 kernel: ACPI: BERT 0x000000007958FFB0 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Sep 12 17:50:04.886748 kernel: ACPI: HEST 0x000000007958FFE0 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Sep 12 17:50:04.886752 kernel: ACPI: SSDT 0x0000000079590260 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Sep 12 17:50:04.886757 kernel: ACPI: Reserving FACP table memory at [mem 0x79580620-0x79580733]
Sep 12 17:50:04.886763 kernel: ACPI: Reserving DSDT table memory at [mem 0x79544268-0x7958061e]
Sep 12 17:50:04.886768 kernel: ACPI: Reserving FACS table memory at [mem 0x79662f80-0x79662fbf]
Sep 12 17:50:04.886773 kernel: ACPI: Reserving APIC table memory at [mem 0x79580738-0x79580863]
Sep 12 17:50:04.886778 kernel: ACPI: Reserving FPDT table memory at [mem 0x79580868-0x795808ab]
Sep 12 17:50:04.886783 kernel: ACPI: Reserving FIDT table memory at [mem 0x795808b0-0x7958094b]
Sep 12 17:50:04.886788 kernel: ACPI: Reserving MCFG table memory at [mem 0x79580950-0x7958098b]
Sep 12 17:50:04.886796 kernel: ACPI: Reserving SPMI table memory at [mem 0x79580990-0x795809d0]
Sep 12 17:50:04.886801 kernel: ACPI: Reserving SSDT table memory at [mem 0x795809d8-0x795824f3]
Sep 12 17:50:04.886806 kernel: ACPI: Reserving SSDT table memory at [mem 0x795824f8-0x795856bd]
Sep 12 17:50:04.886812 kernel: ACPI: Reserving SSDT table memory at [mem 0x795856c0-0x795879ea]
Sep 12 17:50:04.886841 kernel: ACPI: Reserving HPET table memory at [mem 0x795879f0-0x79587a27]
Sep 12 17:50:04.886846 kernel: ACPI: Reserving SSDT table memory at [mem 0x79587a28-0x795889d5]
Sep 12 17:50:04.886866 kernel: ACPI: Reserving SSDT table memory at [mem 0x795889d8-0x795892ce]
Sep 12 17:50:04.886870 kernel: ACPI: Reserving UEFI table memory at [mem 0x795892d0-0x79589311]
Sep 12 17:50:04.886890 kernel: ACPI: Reserving LPIT table memory at [mem 0x79589318-0x795893ab]
Sep 12 17:50:04.886895 kernel: ACPI: Reserving SSDT table memory at [mem 0x795893b0-0x7958bb8d]
Sep 12 17:50:04.886900 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958bb90-0x7958d071]
Sep 12 17:50:04.886905 kernel: ACPI: Reserving DBGP table memory at [mem 0x7958d078-0x7958d0ab]
Sep 12 17:50:04.886909 kernel: ACPI: Reserving DBG2 table memory at [mem 0x7958d0b0-0x7958d103]
Sep 12 17:50:04.886915 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958d108-0x7958ec6e]
Sep 12 17:50:04.886920 kernel: ACPI: Reserving DMAR table memory at [mem 0x7958ec70-0x7958ed17]
Sep 12 17:50:04.886925 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958ed18-0x7958ee5b]
Sep 12 17:50:04.886930 kernel: ACPI: Reserving TPM2 table memory at [mem 0x7958ee60-0x7958ee93]
Sep 12 17:50:04.886935 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958ee98-0x7958fc26]
Sep 12 17:50:04.886940 kernel: ACPI: Reserving WSMT table memory at [mem 0x7958fc28-0x7958fc4f]
Sep 12 17:50:04.886945 kernel: ACPI: Reserving EINJ table memory at [mem 0x7958fc50-0x7958fd7f]
Sep 12 17:50:04.886950 kernel: ACPI: Reserving ERST table memory at [mem 0x7958fd80-0x7958ffaf]
Sep 12 17:50:04.886954 kernel: ACPI: Reserving BERT table memory at [mem 0x7958ffb0-0x7958ffdf]
Sep 12 17:50:04.886960 kernel: ACPI: Reserving HEST table memory at [mem 0x7958ffe0-0x7959025b]
Sep 12 17:50:04.886965 kernel: ACPI: Reserving SSDT table memory at [mem 0x79590260-0x795903c1]
Sep 12 17:50:04.886970 kernel: No NUMA configuration found
Sep 12 17:50:04.886975 kernel: Faking a node at [mem 0x0000000000000000-0x000000087f7fffff]
Sep 12 17:50:04.886980 kernel: NODE_DATA(0) allocated [mem 0x87f7f8dc0-0x87f7fffff]
Sep 12 17:50:04.886985 kernel: Zone ranges:
Sep 12 17:50:04.886990 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 17:50:04.886995 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Sep 12 17:50:04.887000 kernel: Normal [mem 0x0000000100000000-0x000000087f7fffff]
Sep 12 17:50:04.887005 kernel: Device empty
Sep 12 17:50:04.887010 kernel: Movable zone start for each node
Sep 12 17:50:04.887016 kernel: Early memory node ranges
Sep 12 17:50:04.887020 kernel: node 0: [mem 0x0000000000001000-0x000000000008efff]
Sep 12 17:50:04.887025 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff]
Sep 12 17:50:04.887030 kernel: node 0: [mem 0x0000000040400000-0x000000006de3afff]
Sep 12 17:50:04.887035 kernel: node 0: [mem 0x000000006de3d000-0x0000000077fc4fff]
Sep 12 17:50:04.887040 kernel: node 0: [mem 0x00000000790a8000-0x0000000079230fff]
Sep 12 17:50:04.887049 kernel: node 0: [mem 0x000000007beff000-0x000000007befffff]
Sep 12 17:50:04.887054 kernel: node 0: [mem 0x0000000100000000-0x000000087f7fffff]
Sep 12 17:50:04.887060 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000087f7fffff]
Sep 12 17:50:04.887065 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 17:50:04.887071 kernel: On node 0, zone DMA: 113 pages in unavailable ranges
Sep 12 17:50:04.887077 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Sep 12 17:50:04.887082 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Sep 12 17:50:04.887087 kernel: On node 0, zone DMA32: 4323 pages in unavailable ranges
Sep 12 17:50:04.887092 kernel: On node 0, zone DMA32: 11470 pages in unavailable ranges
Sep 12 17:50:04.887097 kernel: On node 0, zone Normal: 16640 pages in unavailable ranges
Sep 12 17:50:04.887104 kernel: On node 0, zone Normal: 2048 pages in unavailable ranges
Sep 12 17:50:04.887109 kernel: ACPI: PM-Timer IO Port: 0x1808
Sep 12 17:50:04.887114 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Sep 12 17:50:04.887119 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Sep 12 17:50:04.887124 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Sep 12 17:50:04.887130 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Sep 12 17:50:04.887135 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Sep 12 17:50:04.887140 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Sep 12 17:50:04.887145 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Sep 12 17:50:04.887151 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Sep 12 17:50:04.887156 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Sep 12 17:50:04.887162 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Sep 12 17:50:04.887167 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Sep 12 17:50:04.887172 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Sep 12 17:50:04.887177 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Sep 12 17:50:04.887182 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Sep 12 17:50:04.887187 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Sep 12 17:50:04.887192 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Sep 12 17:50:04.887199 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Sep 12 17:50:04.887204 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 12 17:50:04.887209 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 12 17:50:04.887214 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 17:50:04.887220 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 12 17:50:04.887225 kernel: TSC deadline timer available
Sep 12 17:50:04.887230 kernel: CPU topo: Max. logical packages: 1
Sep 12 17:50:04.887235 kernel: CPU topo: Max. logical dies: 1
Sep 12 17:50:04.887241 kernel: CPU topo: Max. dies per package: 1
Sep 12 17:50:04.887247 kernel: CPU topo: Max. threads per core: 2
Sep 12 17:50:04.887252 kernel: CPU topo: Num. cores per package: 8
Sep 12 17:50:04.887257 kernel: CPU topo: Num. threads per package: 16
Sep 12 17:50:04.887262 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs
Sep 12 17:50:04.887267 kernel: [mem 0x7f800000-0xdfffffff] available for PCI devices
Sep 12 17:50:04.887273 kernel: Booting paravirtualized kernel on bare hardware
Sep 12 17:50:04.887278 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 17:50:04.887284 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Sep 12 17:50:04.887289 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Sep 12 17:50:04.887295 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Sep 12 17:50:04.887300 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Sep 12 17:50:04.887306 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:50:04.887312 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 17:50:04.887317 kernel: random: crng init done
Sep 12 17:50:04.887322 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Sep 12 17:50:04.887327 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Sep 12 17:50:04.887333 kernel: Fallback order for Node 0: 0
Sep 12 17:50:04.887339 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8352987
Sep 12 17:50:04.887345 kernel: Policy zone: Normal
Sep 12 17:50:04.887350 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 17:50:04.887355 kernel: software IO TLB: area num 16.
Sep 12 17:50:04.887360 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Sep 12 17:50:04.887365 kernel: ftrace: allocating 40125 entries in 157 pages
Sep 12 17:50:04.887371 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 17:50:04.887376 kernel: Dynamic Preempt: voluntary
Sep 12 17:50:04.887381 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 17:50:04.887387 kernel: rcu: RCU event tracing is enabled.
Sep 12 17:50:04.887393 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Sep 12 17:50:04.887398 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 17:50:04.887404 kernel: Rude variant of Tasks RCU enabled.
Sep 12 17:50:04.887409 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 17:50:04.887414 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 17:50:04.887419 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Sep 12 17:50:04.887425 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 12 17:50:04.887430 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 12 17:50:04.887435 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 12 17:50:04.887442 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Sep 12 17:50:04.887447 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 17:50:04.887452 kernel: Console: colour VGA+ 80x25
Sep 12 17:50:04.887458 kernel: printk: legacy console [tty0] enabled
Sep 12 17:50:04.887463 kernel: printk: legacy console [ttyS1] enabled
Sep 12 17:50:04.887468 kernel: ACPI: Core revision 20240827
Sep 12 17:50:04.887473 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 79635855245 ns
Sep 12 17:50:04.887479 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 17:50:04.887484 kernel: DMAR: Host address width 39
Sep 12 17:50:04.887490 kernel: DMAR: DRHD base: 0x000000fed90000 flags: 0x0
Sep 12 17:50:04.887495 kernel: DMAR: dmar0: reg_base_addr fed90000 ver 1:0 cap 1c0000c40660462 ecap 19e2ff0505e
Sep 12 17:50:04.887501 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Sep 12 17:50:04.887506 kernel: DMAR: dmar1: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Sep 12 17:50:04.887511 kernel: DMAR: RMRR base: 0x00000079f11000 end: 0x0000007a15afff
Sep 12 17:50:04.887516 kernel: DMAR: RMRR base: 0x0000007d000000 end: 0x0000007f7fffff
Sep 12 17:50:04.887521 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 1
Sep 12 17:50:04.887527 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Sep 12 17:50:04.887532 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Sep 12 17:50:04.887538 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Sep 12 17:50:04.887544 kernel: x2apic enabled
Sep 12 17:50:04.887549 kernel: APIC: Switched APIC routing to: cluster x2apic
Sep 12 17:50:04.887554 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 12 17:50:04.887559 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Sep 12 17:50:04.887565 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Sep 12 17:50:04.887570 kernel: CPU0: Thermal monitoring enabled (TM1)
Sep 12 17:50:04.887575 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Sep 12 17:50:04.887580 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Sep 12 17:50:04.887587 kernel: process: using mwait in idle threads
Sep 12 17:50:04.887592 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 17:50:04.887597 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 12 17:50:04.887602 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Sep 12 17:50:04.887608 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Sep 12 17:50:04.887613 kernel: RETBleed: Mitigation: Enhanced IBRS
Sep 12 17:50:04.887619 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 17:50:04.887624 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 17:50:04.887630 kernel: TAA: Mitigation: Clear CPU buffers
Sep 12 17:50:04.887635 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 12 17:50:04.887641 kernel: SRBDS: Mitigation: Microcode
Sep 12 17:50:04.887646 kernel: GDS: Vulnerable: No microcode
Sep 12 17:50:04.887651 kernel: active return thunk: its_return_thunk
Sep 12 17:50:04.887656 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 12 17:50:04.887662 kernel: VMSCAPE: Mitigation: IBPB before exit to userspace
Sep 12 17:50:04.887667 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 17:50:04.887672 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 17:50:04.887677 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 17:50:04.887684 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Sep 12 17:50:04.887689 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Sep 12 17:50:04.887694 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 17:50:04.887699 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Sep 12 17:50:04.887705 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Sep 12 17:50:04.887710 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Sep 12 17:50:04.887715 kernel: Freeing SMP alternatives memory: 32K
Sep 12 17:50:04.887720 kernel: pid_max: default: 32768 minimum: 301
Sep 12 17:50:04.887727 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 17:50:04.887732 kernel: landlock: Up and running.
Sep 12 17:50:04.887737 kernel: SELinux: Initializing.
Sep 12 17:50:04.887742 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 17:50:04.887748 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 17:50:04.887753 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Sep 12 17:50:04.887758 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Sep 12 17:50:04.887764 kernel: ... version:                4
Sep 12 17:50:04.887769 kernel: ... bit width:              48
Sep 12 17:50:04.887775 kernel: ... generic registers:      4
Sep 12 17:50:04.887780 kernel: ... value mask:             0000ffffffffffff
Sep 12 17:50:04.887786 kernel: ... max period:             00007fffffffffff
Sep 12 17:50:04.887793 kernel: ... fixed-purpose events:   3
Sep 12 17:50:04.887798 kernel: ... event mask:             000000070000000f
Sep 12 17:50:04.887803 kernel: signal: max sigframe size: 2032
Sep 12 17:50:04.887808 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Sep 12 17:50:04.887835 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 17:50:04.887841 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 17:50:04.887861 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level
Sep 12 17:50:04.887867 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Sep 12 17:50:04.887887 kernel: smp: Bringing up secondary CPUs ...
Sep 12 17:50:04.887892 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 17:50:04.887898 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15
Sep 12 17:50:04.887903 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details.
Sep 12 17:50:04.887909 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Sep 12 17:50:04.887914 kernel: smp: Brought up 1 node, 16 CPUs
Sep 12 17:50:04.887920 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Sep 12 17:50:04.887926 kernel: Memory: 32652124K/33411948K available (14336K kernel code, 2432K rwdata, 9960K rodata, 54040K init, 2924K bss, 732480K reserved, 0K cma-reserved)
Sep 12 17:50:04.887932 kernel: devtmpfs: initialized
Sep 12 17:50:04.887937 kernel: x86/mm: Memory block size: 128MB
Sep 12 17:50:04.887942 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x6de3b000-0x6de3bfff] (4096 bytes)
Sep 12 17:50:04.887948 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x79231000-0x79662fff] (4399104 bytes)
Sep 12 17:50:04.887953 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 17:50:04.887959 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Sep 12 17:50:04.887964 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 17:50:04.887970 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 17:50:04.887975 kernel: audit: initializing netlink subsys (disabled)
Sep 12 17:50:04.887981 kernel: audit: type=2000 audit(1757699397.176:1): state=initialized audit_enabled=0 res=1
Sep 12 17:50:04.887986 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 17:50:04.887991 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 17:50:04.887996 kernel: cpuidle: using governor menu
Sep 12 17:50:04.888002 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 17:50:04.888007 kernel: dca service started, version 1.12.1
Sep 12 17:50:04.888012 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Sep 12 17:50:04.888018 kernel: PCI: Using configuration type 1 for base access
Sep 12 17:50:04.888024 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 17:50:04.888029 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 17:50:04.888034 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 17:50:04.888040 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 17:50:04.888045 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 17:50:04.888050 kernel: ACPI: Added _OSI(Module Device)
Sep 12 17:50:04.888055 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 17:50:04.888061 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 17:50:04.888067 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Sep 12 17:50:04.888072 kernel: ACPI: Dynamic OEM Table Load:
Sep 12 17:50:04.888077 kernel: ACPI: SSDT 0xFFFF9B7442398000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Sep 12 17:50:04.888083 kernel: ACPI: Dynamic OEM Table Load:
Sep 12 17:50:04.888088 kernel: ACPI: SSDT 0xFFFF9B7440246200 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Sep 12 17:50:04.888093 kernel: ACPI: Dynamic OEM Table Load:
Sep 12 17:50:04.888098 kernel: ACPI: SSDT 0xFFFF9B744239B000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Sep 12 17:50:04.888103 kernel: ACPI: Dynamic OEM Table Load:
Sep 12 17:50:04.888108 kernel: ACPI: SSDT 0xFFFF9B74401A5000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Sep 12 17:50:04.888114 kernel: ACPI: Interpreter enabled
Sep 12 17:50:04.888120 kernel: ACPI: PM: (supports S0 S5)
Sep 12 17:50:04.888125 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 17:50:04.888130 kernel: HEST: Enabling Firmware First mode for corrected errors.
Sep 12 17:50:04.888136 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Sep 12 17:50:04.888141 kernel: HEST: Table parsing has been initialized.
Sep 12 17:50:04.888146 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Sep 12 17:50:04.888151 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 17:50:04.888157 kernel: PCI: Using E820 reservations for host bridge windows
Sep 12 17:50:04.888162 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Sep 12 17:50:04.888168 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource
Sep 12 17:50:04.888174 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource
Sep 12 17:50:04.888179 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource
Sep 12 17:50:04.888184 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource
Sep 12 17:50:04.888189 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource
Sep 12 17:50:04.888195 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Sep 12 17:50:04.888200 kernel: ACPI: \_TZ_.FN00: New power resource
Sep 12 17:50:04.888205 kernel: ACPI: \_TZ_.FN01: New power resource
Sep 12 17:50:04.888210 kernel: ACPI: \_TZ_.FN02: New power resource
Sep 12 17:50:04.888217 kernel: ACPI: \_TZ_.FN03: New power resource
Sep 12 17:50:04.888222 kernel: ACPI: \_TZ_.FN04: New power resource
Sep 12 17:50:04.888227 kernel: ACPI: \PIN_: New power resource
Sep 12 17:50:04.888232 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Sep 12 17:50:04.888312 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 17:50:04.888371 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Sep 12 17:50:04.888425 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Sep 12 17:50:04.888435 kernel: PCI host bridge to bus 0000:00
Sep 12 17:50:04.888492 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 17:50:04.888542 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 12 17:50:04.888591 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 17:50:04.888640 kernel: pci_bus 0000:00: root bus resource [mem 0x7f800000-0xdfffffff window]
Sep 12 17:50:04.888688 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Sep 12 17:50:04.888736 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Sep 12 17:50:04.888810 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint
Sep 12 17:50:04.888911 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port
Sep 12 17:50:04.888969 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 12 17:50:04.889026 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Sep 12 17:50:04.889088 kernel: pci 0000:00:01.1: [8086:1905] type 01 class 0x060400 PCIe Root Port
Sep 12 17:50:04.889145 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Sep 12 17:50:04.889203 kernel: pci 0000:00:01.1: bridge window [mem 0x96100000-0x962fffff]
Sep 12 17:50:04.889259 kernel: pci 0000:00:01.1: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Sep 12 17:50:04.889315 kernel: pci 0000:00:01.1: PME# supported from D0 D3hot D3cold
Sep 12 17:50:04.889375 kernel: pci 0000:00:02.0: [8086:3e9a] type 00 class 0x038000 PCIe Root Complex Integrated Endpoint
Sep 12 17:50:04.889431 kernel: pci 0000:00:02.0: BAR 0 [mem 0x94000000-0x94ffffff 64bit]
Sep 12 17:50:04.889487 kernel: pci 0000:00:02.0: BAR 2 [mem 0x80000000-0x8fffffff 64bit pref]
Sep 12 17:50:04.889542 kernel: pci 0000:00:02.0: BAR 4 [io 0x6000-0x603f]
Sep 12 17:50:04.889611 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint
Sep 12 17:50:04.889672 kernel: pci 0000:00:08.0: BAR 0 [mem 0x9651f000-0x9651ffff 64bit]
Sep 12 17:50:04.889732 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint
Sep 12 17:50:04.889789 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9651e000-0x9651efff 64bit]
Sep 12 17:50:04.889892 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint
Sep 12 17:50:04.889949 kernel: pci 0000:00:14.0: BAR 0 [mem 0x96500000-0x9650ffff 64bit]
Sep 12 17:50:04.890008 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Sep 12 17:50:04.890067 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint
Sep 12 17:50:04.890124 kernel: pci 0000:00:14.2: BAR 0 [mem 0x96512000-0x96513fff 64bit]
Sep 12 17:50:04.890179 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9651d000-0x9651dfff 64bit]
Sep 12 17:50:04.890238 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint
Sep 12 17:50:04.890295 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Sep 12 17:50:04.890354 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint
Sep 12 17:50:04.890413 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Sep 12 17:50:04.890475 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint
Sep 12 17:50:04.890531 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9651a000-0x9651afff 64bit]
Sep 12 17:50:04.890589 kernel: pci 0000:00:16.0: PME# supported from D3hot
Sep 12 17:50:04.890649 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint
Sep 12 17:50:04.890705 kernel: pci 0000:00:16.1: BAR 0 [mem 0x96519000-0x96519fff 64bit]
Sep 12 17:50:04.890760 kernel: pci 0000:00:16.1: PME# supported from D3hot
Sep 12 17:50:04.890862 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint
Sep 12 17:50:04.890919 kernel: pci 0000:00:16.4: BAR 0 [mem 0x96518000-0x96518fff 64bit]
Sep 12 17:50:04.890975 kernel: pci 0000:00:16.4: PME# supported from D3hot
Sep 12 17:50:04.891038 kernel: pci 0000:00:17.0: [8086:2826] type 00 class 0x010400 conventional PCI endpoint
Sep 12 17:50:04.891094 kernel: pci 0000:00:17.0: BAR 0 [mem 0x96510000-0x96511fff]
Sep 12 17:50:04.891150 kernel: pci 0000:00:17.0: BAR 1 [mem 0x96517000-0x965170ff]
Sep 12 17:50:04.891205 kernel: pci 0000:00:17.0: BAR 2 [io 0x6090-0x6097]
Sep 12 17:50:04.891260 kernel: pci 0000:00:17.0: BAR 3 [io 0x6080-0x6083]
Sep 12 17:50:04.891315 kernel: pci 0000:00:17.0: BAR 4 [io 0x6060-0x607f]
Sep 12 17:50:04.891374 kernel: pci 0000:00:17.0: BAR 5 [mem 0x96516000-0x965167ff]
Sep 12 17:50:04.891429 kernel: pci 0000:00:17.0: PME# supported from D3hot
Sep 12 17:50:04.891492 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port
Sep 12 17:50:04.891549 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03]
Sep 12 17:50:04.891605 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Sep 12 17:50:04.891667 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:50:04.891723 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04]
Sep 12 17:50:04.891782 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Sep 12 17:50:04.891893 kernel: pci 0000:00:1b.4: bridge window [mem 0x96400000-0x964fffff]
Sep 12 17:50:04.891949 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Sep 12 17:50:04.892008 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port
Sep 12 17:50:04.892064 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05]
Sep 12 17:50:04.892119 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff]
Sep 12 17:50:04.892174 kernel: pci 0000:00:1b.5: bridge window [mem 0x96300000-0x963fffff]
Sep 12 17:50:04.892230 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold
Sep 12 17:50:04.892292 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port
Sep 12 17:50:04.892347 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06]
Sep 12 17:50:04.892405 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold
Sep 12 17:50:04.892465 kernel: pci 0000:00:1c.1: [8086:a339] type 01 class 0x060400 PCIe Root Port
Sep 12 17:50:04.892523 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08]
Sep 12 17:50:04.892579 kernel: pci 0000:00:1c.1: bridge window [io 0x3000-0x3fff]
Sep 12 17:50:04.892636 kernel: pci 0000:00:1c.1: bridge window [mem 0x95000000-0x960fffff]
Sep 12 17:50:04.892691 kernel: pci 0000:00:1c.1: PME# supported from D0 D3hot D3cold
Sep 12 17:50:04.892753 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint
Sep 12 17:50:04.892811 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Sep 12 17:50:04.892908 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint
Sep 12 17:50:04.892967 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint
Sep 12 17:50:04.893022 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x96514000-0x965140ff 64bit]
Sep 12 17:50:04.893079 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf]
Sep 12 17:50:04.893138 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint
Sep 12 17:50:04.893193 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff]
Sep 12 17:50:04.893249 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 12 17:50:04.893311 kernel: pci 0000:02:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint
Sep 12 17:50:04.893368 kernel: pci 0000:02:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref]
Sep 12 17:50:04.893425 kernel: pci 0000:02:00.0: ROM [mem 0x96200000-0x962fffff pref]
Sep 12 17:50:04.893484 kernel: pci 0000:02:00.0: PME# supported from D3cold
Sep 12 17:50:04.893541 kernel: pci 0000:02:00.0: VF BAR 0
[mem 0x00000000-0x000fffff 64bit pref] Sep 12 17:50:04.893612 kernel: pci 0000:02:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 12 17:50:04.893687 kernel: pci 0000:02:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 12 17:50:04.893744 kernel: pci 0000:02:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Sep 12 17:50:04.893804 kernel: pci 0000:02:00.1: ROM [mem 0x96100000-0x961fffff pref] Sep 12 17:50:04.893911 kernel: pci 0000:02:00.1: PME# supported from D3cold Sep 12 17:50:04.893971 kernel: pci 0000:02:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 12 17:50:04.894027 kernel: pci 0000:02:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 12 17:50:04.894083 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Sep 12 17:50:04.894139 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Sep 12 17:50:04.894205 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Sep 12 17:50:04.894263 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 12 17:50:04.894320 kernel: pci 0000:04:00.0: BAR 0 [mem 0x96400000-0x9647ffff] Sep 12 17:50:04.894380 kernel: pci 0000:04:00.0: BAR 2 [io 0x5000-0x501f] Sep 12 17:50:04.894436 kernel: pci 0000:04:00.0: BAR 3 [mem 0x96480000-0x96483fff] Sep 12 17:50:04.894492 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Sep 12 17:50:04.894548 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Sep 12 17:50:04.894609 kernel: pci 0000:05:00.0: working around ROM BAR overlap defect Sep 12 17:50:04.894666 kernel: pci 0000:05:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 12 17:50:04.894723 kernel: pci 0000:05:00.0: BAR 0 [mem 0x96300000-0x9637ffff] Sep 12 17:50:04.894783 kernel: pci 0000:05:00.0: BAR 2 [io 0x4000-0x401f] Sep 12 17:50:04.894924 kernel: pci 0000:05:00.0: BAR 3 [mem 0x96380000-0x96383fff] Sep 12 17:50:04.894981 kernel: pci 0000:05:00.0: PME# supported from D0 D3hot D3cold Sep 12 17:50:04.895038 
kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Sep 12 17:50:04.895094 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Sep 12 17:50:04.895155 kernel: pci 0000:07:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 12 17:50:04.895213 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 12 17:50:04.895272 kernel: pci 0000:07:00.0: bridge window [io 0x3000-0x3fff] Sep 12 17:50:04.895329 kernel: pci 0000:07:00.0: bridge window [mem 0x95000000-0x960fffff] Sep 12 17:50:04.895385 kernel: pci 0000:07:00.0: enabling Extended Tags Sep 12 17:50:04.895441 kernel: pci 0000:07:00.0: supports D1 D2 Sep 12 17:50:04.895497 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 17:50:04.895552 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Sep 12 17:50:04.895616 kernel: pci_bus 0000:08: extended config space not accessible Sep 12 17:50:04.895684 kernel: pci 0000:08:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Sep 12 17:50:04.895744 kernel: pci 0000:08:00.0: BAR 0 [mem 0x95000000-0x95ffffff] Sep 12 17:50:04.895806 kernel: pci 0000:08:00.0: BAR 1 [mem 0x96000000-0x9601ffff] Sep 12 17:50:04.895903 kernel: pci 0000:08:00.0: BAR 2 [io 0x3000-0x307f] Sep 12 17:50:04.895962 kernel: pci 0000:08:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 12 17:50:04.896020 kernel: pci 0000:08:00.0: supports D1 D2 Sep 12 17:50:04.896078 kernel: pci 0000:08:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 17:50:04.896137 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 12 17:50:04.896146 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Sep 12 17:50:04.896152 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Sep 12 17:50:04.896157 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Sep 12 17:50:04.896163 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Sep 12 17:50:04.896170 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Sep 12 17:50:04.896176 kernel: 
ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Sep 12 17:50:04.896182 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Sep 12 17:50:04.896188 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Sep 12 17:50:04.896193 kernel: iommu: Default domain type: Translated Sep 12 17:50:04.896199 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 12 17:50:04.896204 kernel: PCI: Using ACPI for IRQ routing Sep 12 17:50:04.896210 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 12 17:50:04.896215 kernel: e820: reserve RAM buffer [mem 0x0008f800-0x0008ffff] Sep 12 17:50:04.896222 kernel: e820: reserve RAM buffer [mem 0x6de3b000-0x6fffffff] Sep 12 17:50:04.896227 kernel: e820: reserve RAM buffer [mem 0x77fc5000-0x77ffffff] Sep 12 17:50:04.896233 kernel: e820: reserve RAM buffer [mem 0x79231000-0x7bffffff] Sep 12 17:50:04.896238 kernel: e820: reserve RAM buffer [mem 0x7bf00000-0x7bffffff] Sep 12 17:50:04.896243 kernel: e820: reserve RAM buffer [mem 0x87f800000-0x87fffffff] Sep 12 17:50:04.896300 kernel: pci 0000:08:00.0: vgaarb: setting as boot VGA device Sep 12 17:50:04.896359 kernel: pci 0000:08:00.0: vgaarb: bridge control possible Sep 12 17:50:04.896418 kernel: pci 0000:08:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 12 17:50:04.896426 kernel: vgaarb: loaded Sep 12 17:50:04.896433 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0 Sep 12 17:50:04.896439 kernel: hpet0: 8 comparators, 64-bit 24.000000 MHz counter Sep 12 17:50:04.896445 kernel: clocksource: Switched to clocksource tsc-early Sep 12 17:50:04.896450 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 17:50:04.896456 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 17:50:04.896461 kernel: pnp: PnP ACPI init Sep 12 17:50:04.896518 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Sep 12 17:50:04.896576 kernel: pnp 00:02: [dma 0 disabled] Sep 12 17:50:04.896633 kernel: pnp 00:03: [dma 0 disabled] Sep 
12 17:50:04.896688 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Sep 12 17:50:04.896739 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Sep 12 17:50:04.896797 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Sep 12 17:50:04.896885 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Sep 12 17:50:04.896936 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Sep 12 17:50:04.896989 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Sep 12 17:50:04.897040 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Sep 12 17:50:04.897090 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Sep 12 17:50:04.897140 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Sep 12 17:50:04.897190 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Sep 12 17:50:04.897244 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Sep 12 17:50:04.897295 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Sep 12 17:50:04.897347 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Sep 12 17:50:04.897398 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Sep 12 17:50:04.897449 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Sep 12 17:50:04.897499 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Sep 12 17:50:04.897549 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Sep 12 17:50:04.897603 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Sep 12 17:50:04.897611 kernel: pnp: PnP ACPI: found 9 devices Sep 12 17:50:04.897619 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 12 17:50:04.897625 kernel: NET: Registered PF_INET protocol family Sep 12 17:50:04.897631 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 17:50:04.897636 kernel: 
tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Sep 12 17:50:04.897642 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 17:50:04.897648 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 17:50:04.897653 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Sep 12 17:50:04.897659 kernel: TCP: Hash tables configured (established 262144 bind 65536) Sep 12 17:50:04.897664 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 12 17:50:04.897671 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 12 17:50:04.897676 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 17:50:04.897682 kernel: NET: Registered PF_XDP protocol family Sep 12 17:50:04.897737 kernel: pci 0000:00:15.0: BAR 0 [mem 0x7f800000-0x7f800fff 64bit]: assigned Sep 12 17:50:04.897795 kernel: pci 0000:00:15.1: BAR 0 [mem 0x7f801000-0x7f801fff 64bit]: assigned Sep 12 17:50:04.897891 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x7f802000-0x7f802fff 64bit]: assigned Sep 12 17:50:04.897950 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 17:50:04.898008 kernel: pci 0000:02:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 12 17:50:04.898067 kernel: pci 0000:02:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 12 17:50:04.898124 kernel: pci 0000:02:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 12 17:50:04.898180 kernel: pci 0000:02:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 12 17:50:04.898234 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Sep 12 17:50:04.898289 kernel: pci 0000:00:01.1: bridge window [mem 0x96100000-0x962fffff] Sep 12 17:50:04.898344 kernel: pci 0000:00:01.1: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 12 17:50:04.898400 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Sep 12 
17:50:04.898454 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Sep 12 17:50:04.898509 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 12 17:50:04.898563 kernel: pci 0000:00:1b.4: bridge window [mem 0x96400000-0x964fffff] Sep 12 17:50:04.898621 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Sep 12 17:50:04.898676 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 12 17:50:04.898731 kernel: pci 0000:00:1b.5: bridge window [mem 0x96300000-0x963fffff] Sep 12 17:50:04.898786 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Sep 12 17:50:04.898881 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 12 17:50:04.898937 kernel: pci 0000:07:00.0: bridge window [io 0x3000-0x3fff] Sep 12 17:50:04.898994 kernel: pci 0000:07:00.0: bridge window [mem 0x95000000-0x960fffff] Sep 12 17:50:04.899049 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Sep 12 17:50:04.899104 kernel: pci 0000:00:1c.1: bridge window [io 0x3000-0x3fff] Sep 12 17:50:04.899158 kernel: pci 0000:00:1c.1: bridge window [mem 0x95000000-0x960fffff] Sep 12 17:50:04.899211 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Sep 12 17:50:04.899260 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 12 17:50:04.899309 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 12 17:50:04.899357 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 12 17:50:04.899406 kernel: pci_bus 0000:00: resource 7 [mem 0x7f800000-0xdfffffff window] Sep 12 17:50:04.899454 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Sep 12 17:50:04.899513 kernel: pci_bus 0000:02: resource 1 [mem 0x96100000-0x962fffff] Sep 12 17:50:04.899568 kernel: pci_bus 0000:02: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Sep 12 17:50:04.899623 kernel: pci_bus 0000:04: resource 0 [io 0x5000-0x5fff] Sep 12 17:50:04.899675 kernel: pci_bus 0000:04: resource 1 [mem 0x96400000-0x964fffff] Sep 12 17:50:04.899730 kernel: 
pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Sep 12 17:50:04.899781 kernel: pci_bus 0000:05: resource 1 [mem 0x96300000-0x963fffff] Sep 12 17:50:04.899875 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Sep 12 17:50:04.899931 kernel: pci_bus 0000:07: resource 1 [mem 0x95000000-0x960fffff] Sep 12 17:50:04.899985 kernel: pci_bus 0000:08: resource 0 [io 0x3000-0x3fff] Sep 12 17:50:04.900040 kernel: pci_bus 0000:08: resource 1 [mem 0x95000000-0x960fffff] Sep 12 17:50:04.900048 kernel: PCI: CLS 64 bytes, default 64 Sep 12 17:50:04.900054 kernel: DMAR: No ATSR found Sep 12 17:50:04.900060 kernel: DMAR: No SATC found Sep 12 17:50:04.900065 kernel: DMAR: IOMMU feature fl1gp_support inconsistent Sep 12 17:50:04.900071 kernel: DMAR: IOMMU feature pgsel_inv inconsistent Sep 12 17:50:04.900076 kernel: DMAR: IOMMU feature nwfs inconsistent Sep 12 17:50:04.900083 kernel: DMAR: IOMMU feature pasid inconsistent Sep 12 17:50:04.900089 kernel: DMAR: IOMMU feature eafs inconsistent Sep 12 17:50:04.900094 kernel: DMAR: IOMMU feature prs inconsistent Sep 12 17:50:04.900100 kernel: DMAR: IOMMU feature nest inconsistent Sep 12 17:50:04.900106 kernel: DMAR: IOMMU feature mts inconsistent Sep 12 17:50:04.900111 kernel: DMAR: IOMMU feature sc_support inconsistent Sep 12 17:50:04.900117 kernel: DMAR: IOMMU feature dev_iotlb_support inconsistent Sep 12 17:50:04.900122 kernel: DMAR: dmar0: Using Queued invalidation Sep 12 17:50:04.900128 kernel: DMAR: dmar1: Using Queued invalidation Sep 12 17:50:04.900183 kernel: pci 0000:00:02.0: Adding to iommu group 0 Sep 12 17:50:04.900240 kernel: pci 0000:00:00.0: Adding to iommu group 1 Sep 12 17:50:04.900295 kernel: pci 0000:00:01.0: Adding to iommu group 2 Sep 12 17:50:04.900350 kernel: pci 0000:00:01.1: Adding to iommu group 2 Sep 12 17:50:04.900405 kernel: pci 0000:00:08.0: Adding to iommu group 3 Sep 12 17:50:04.900460 kernel: pci 0000:00:12.0: Adding to iommu group 4 Sep 12 17:50:04.900514 kernel: pci 0000:00:14.0: Adding to iommu group 5 
Sep 12 17:50:04.900568 kernel: pci 0000:00:14.2: Adding to iommu group 5 Sep 12 17:50:04.900624 kernel: pci 0000:00:15.0: Adding to iommu group 6 Sep 12 17:50:04.900678 kernel: pci 0000:00:15.1: Adding to iommu group 6 Sep 12 17:50:04.900733 kernel: pci 0000:00:16.0: Adding to iommu group 7 Sep 12 17:50:04.900787 kernel: pci 0000:00:16.1: Adding to iommu group 7 Sep 12 17:50:04.900881 kernel: pci 0000:00:16.4: Adding to iommu group 7 Sep 12 17:50:04.900935 kernel: pci 0000:00:17.0: Adding to iommu group 8 Sep 12 17:50:04.900990 kernel: pci 0000:00:1b.0: Adding to iommu group 9 Sep 12 17:50:04.901045 kernel: pci 0000:00:1b.4: Adding to iommu group 10 Sep 12 17:50:04.901103 kernel: pci 0000:00:1b.5: Adding to iommu group 11 Sep 12 17:50:04.901157 kernel: pci 0000:00:1c.0: Adding to iommu group 12 Sep 12 17:50:04.901212 kernel: pci 0000:00:1c.1: Adding to iommu group 13 Sep 12 17:50:04.901266 kernel: pci 0000:00:1e.0: Adding to iommu group 14 Sep 12 17:50:04.901321 kernel: pci 0000:00:1f.0: Adding to iommu group 15 Sep 12 17:50:04.901375 kernel: pci 0000:00:1f.4: Adding to iommu group 15 Sep 12 17:50:04.901430 kernel: pci 0000:00:1f.5: Adding to iommu group 15 Sep 12 17:50:04.901488 kernel: pci 0000:02:00.0: Adding to iommu group 2 Sep 12 17:50:04.901544 kernel: pci 0000:02:00.1: Adding to iommu group 2 Sep 12 17:50:04.901601 kernel: pci 0000:04:00.0: Adding to iommu group 16 Sep 12 17:50:04.901656 kernel: pci 0000:05:00.0: Adding to iommu group 17 Sep 12 17:50:04.901713 kernel: pci 0000:07:00.0: Adding to iommu group 18 Sep 12 17:50:04.901771 kernel: pci 0000:08:00.0: Adding to iommu group 18 Sep 12 17:50:04.901779 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Sep 12 17:50:04.901785 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 12 17:50:04.901793 kernel: software IO TLB: mapped [mem 0x0000000073fc5000-0x0000000077fc5000] (64MB) Sep 12 17:50:04.901801 kernel: RAPL PMU: API unit is 2^-32 Joules, 4 fixed counters, 655360 ms 
ovfl timer Sep 12 17:50:04.901807 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Sep 12 17:50:04.901813 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Sep 12 17:50:04.901818 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Sep 12 17:50:04.901824 kernel: RAPL PMU: hw unit of domain pp1-gpu 2^-14 Joules Sep 12 17:50:04.901915 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Sep 12 17:50:04.901924 kernel: Initialise system trusted keyrings Sep 12 17:50:04.901929 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Sep 12 17:50:04.901937 kernel: Key type asymmetric registered Sep 12 17:50:04.901942 kernel: Asymmetric key parser 'x509' registered Sep 12 17:50:04.901947 kernel: tsc: Refined TSC clocksource calibration: 3407.951 MHz Sep 12 17:50:04.901953 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fa5d91b7, max_idle_ns: 440795370708 ns Sep 12 17:50:04.901959 kernel: clocksource: Switched to clocksource tsc Sep 12 17:50:04.901964 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 12 17:50:04.901970 kernel: io scheduler mq-deadline registered Sep 12 17:50:04.901975 kernel: io scheduler kyber registered Sep 12 17:50:04.901981 kernel: io scheduler bfq registered Sep 12 17:50:04.902037 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 122 Sep 12 17:50:04.902092 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 123 Sep 12 17:50:04.902147 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 124 Sep 12 17:50:04.902202 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 125 Sep 12 17:50:04.902258 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 126 Sep 12 17:50:04.902314 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 127 Sep 12 17:50:04.902369 kernel: pcieport 0000:00:1c.1: PME: Signaling with IRQ 128 Sep 12 17:50:04.902432 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Sep 12 17:50:04.902442 kernel: ACPI: thermal: 
Thermal Zone [TZ00] (28 C) Sep 12 17:50:04.902448 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Sep 12 17:50:04.902454 kernel: pstore: Using crash dump compression: deflate Sep 12 17:50:04.902460 kernel: pstore: Registered erst as persistent store backend Sep 12 17:50:04.902465 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 12 17:50:04.902471 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 17:50:04.902476 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 12 17:50:04.902482 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Sep 12 17:50:04.902537 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Sep 12 17:50:04.902548 kernel: i8042: PNP: No PS/2 controller found. Sep 12 17:50:04.902600 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Sep 12 17:50:04.902652 kernel: rtc_cmos rtc_cmos: registered as rtc0 Sep 12 17:50:04.902702 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-09-12T17:50:03 UTC (1757699403) Sep 12 17:50:04.902753 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Sep 12 17:50:04.902761 kernel: intel_pstate: Intel P-state driver initializing Sep 12 17:50:04.902767 kernel: intel_pstate: Disabling energy efficiency optimization Sep 12 17:50:04.902774 kernel: intel_pstate: HWP enabled Sep 12 17:50:04.902780 kernel: NET: Registered PF_INET6 protocol family Sep 12 17:50:04.902785 kernel: Segment Routing with IPv6 Sep 12 17:50:04.902795 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 17:50:04.902801 kernel: NET: Registered PF_PACKET protocol family Sep 12 17:50:04.902806 kernel: Key type dns_resolver registered Sep 12 17:50:04.902812 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Sep 12 17:50:04.902838 kernel: microcode: Current revision: 0x000000de Sep 12 17:50:04.902843 kernel: IPI shorthand broadcast: enabled Sep 12 17:50:04.902872 kernel: sched_clock: Marking stable (3824000610, 
1502324012)->(6880523919, -1554199297) Sep 12 17:50:04.902878 kernel: registered taskstats version 1 Sep 12 17:50:04.902883 kernel: Loading compiled-in X.509 certificates Sep 12 17:50:04.902889 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: f1ae8d6e9bfae84d90f4136cf098b0465b2a5bd7' Sep 12 17:50:04.902894 kernel: Demotion targets for Node 0: null Sep 12 17:50:04.902900 kernel: Key type .fscrypt registered Sep 12 17:50:04.902905 kernel: Key type fscrypt-provisioning registered Sep 12 17:50:04.902911 kernel: ima: Allocated hash algorithm: sha1 Sep 12 17:50:04.902916 kernel: ima: No architecture policies found Sep 12 17:50:04.902922 kernel: clk: Disabling unused clocks Sep 12 17:50:04.902928 kernel: Warning: unable to open an initial console. Sep 12 17:50:04.902934 kernel: Freeing unused kernel image (initmem) memory: 54040K Sep 12 17:50:04.902939 kernel: Write protecting the kernel read-only data: 24576k Sep 12 17:50:04.902945 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 12 17:50:04.902950 kernel: Run /init as init process Sep 12 17:50:04.902956 kernel: with arguments: Sep 12 17:50:04.902961 kernel: /init Sep 12 17:50:04.902967 kernel: with environment: Sep 12 17:50:04.902973 kernel: HOME=/ Sep 12 17:50:04.902979 kernel: TERM=linux Sep 12 17:50:04.902984 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 17:50:04.902990 systemd[1]: Successfully made /usr/ read-only. Sep 12 17:50:04.902998 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 17:50:04.903004 systemd[1]: Detected architecture x86-64. Sep 12 17:50:04.903009 systemd[1]: Running in initrd. 
Sep 12 17:50:04.903016 systemd[1]: No hostname configured, using default hostname. Sep 12 17:50:04.903022 systemd[1]: Hostname set to . Sep 12 17:50:04.903028 systemd[1]: Initializing machine ID from random generator. Sep 12 17:50:04.903034 systemd[1]: Queued start job for default target initrd.target. Sep 12 17:50:04.903039 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:50:04.903045 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:50:04.903051 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 17:50:04.903057 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:50:04.903064 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 17:50:04.903070 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 17:50:04.903077 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 17:50:04.903083 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 17:50:04.903088 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:50:04.903094 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:50:04.903100 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:50:04.903106 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:50:04.903112 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:50:04.903118 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:50:04.903124 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Sep 12 17:50:04.903130 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:50:04.903135 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 17:50:04.903141 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 17:50:04.903147 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:50:04.903153 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:50:04.903159 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:50:04.903165 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:50:04.903171 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 17:50:04.903177 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 17:50:04.903182 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 17:50:04.903189 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 17:50:04.903194 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 17:50:04.903200 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:50:04.903219 systemd-journald[299]: Collecting audit messages is disabled. Sep 12 17:50:04.903233 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:50:04.903240 systemd-journald[299]: Journal started Sep 12 17:50:04.903254 systemd-journald[299]: Runtime Journal (/run/log/journal/a92377ac6ecf493ca67d13e1c2025de9) is 8M, max 639.3M, 631.3M free. Sep 12 17:50:04.916796 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:50:04.917930 systemd-modules-load[301]: Inserted module 'overlay' Sep 12 17:50:04.938846 systemd[1]: Started systemd-journald.service - Journal Service. 
Sep 12 17:50:04.939133 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 17:50:04.939394 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:50:04.939559 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 17:50:04.940429 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 17:50:04.953795 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 17:50:04.954120 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:50:04.977446 kernel: Bridge firewalling registered Sep 12 17:50:04.955298 systemd-modules-load[301]: Inserted module 'br_netfilter' Sep 12 17:50:04.977611 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:50:05.065175 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:50:05.068806 systemd-tmpfiles[314]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 17:50:05.084547 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:50:05.103462 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:50:05.130228 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 17:50:05.140895 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:50:05.168539 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:50:05.197677 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:50:05.205532 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Sep 12 17:50:05.209727 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 17:50:05.237480 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:50:05.249096 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 17:50:05.268897 systemd-resolved[330]: Positive Trust Anchors: Sep 12 17:50:05.268904 systemd-resolved[330]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:50:05.268936 systemd-resolved[330]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:50:05.271013 systemd-resolved[330]: Defaulting to hostname 'linux'. Sep 12 17:50:05.271774 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:50:05.288043 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Sep 12 17:50:05.396661 dracut-cmdline[341]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:50:05.600846 kernel: SCSI subsystem initialized
Sep 12 17:50:05.629833 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 17:50:05.641805 kernel: iscsi: registered transport (tcp)
Sep 12 17:50:05.673420 kernel: iscsi: registered transport (qla4xxx)
Sep 12 17:50:05.673440 kernel: QLogic iSCSI HBA Driver
Sep 12 17:50:05.684263 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:50:05.715200 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:50:05.727098 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:50:05.778518 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:50:05.788201 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 17:50:05.908821 kernel: raid6: avx2x4 gen() 20204 MB/s
Sep 12 17:50:05.929844 kernel: raid6: avx2x2 gen() 42320 MB/s
Sep 12 17:50:05.955921 kernel: raid6: avx2x1 gen() 45046 MB/s
Sep 12 17:50:05.955940 kernel: raid6: using algorithm avx2x1 gen() 45046 MB/s
Sep 12 17:50:05.982946 kernel: raid6: .... xor() 24437 MB/s, rmw enabled
Sep 12 17:50:05.982966 kernel: raid6: using avx2x2 recovery algorithm
Sep 12 17:50:06.003836 kernel: xor: automatically using best checksumming function avx
Sep 12 17:50:06.175827 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 17:50:06.179542 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:50:06.188961 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:50:06.239355 systemd-udevd[555]: Using default interface naming scheme 'v255'.
Sep 12 17:50:06.244334 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:50:06.261027 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 17:50:06.308235 dracut-pre-trigger[566]: rd.md=0: removing MD RAID activation
Sep 12 17:50:06.362580 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:50:06.376196 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:50:06.517589 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:50:06.544754 kernel: cryptd: max_cpu_qlen set to 1000
Sep 12 17:50:06.544781 kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 12 17:50:06.544800 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Sep 12 17:50:06.519576 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 17:50:06.608212 kernel: PTP clock support registered
Sep 12 17:50:06.608227 kernel: AES CTR mode by8 optimization enabled
Sep 12 17:50:06.608235 kernel: libata version 3.00 loaded.
Sep 12 17:50:06.608246 kernel: ACPI: bus type USB registered
Sep 12 17:50:06.608254 kernel: usbcore: registered new interface driver usbfs
Sep 12 17:50:06.608261 kernel: usbcore: registered new interface driver hub
Sep 12 17:50:06.608268 kernel: usbcore: registered new device driver usb
Sep 12 17:50:06.608275 kernel: ahci 0000:00:17.0: version 3.0
Sep 12 17:50:06.608374 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, RAID mode
Sep 12 17:50:06.557154 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:50:06.624776 kernel: ahci 0000:00:17.0: 8/8 ports implemented (port mask 0xff)
Sep 12 17:50:06.624915 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst
Sep 12 17:50:06.557247 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:50:06.809871 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller
Sep 12 17:50:06.809964 kernel: scsi host0: ahci
Sep 12 17:50:06.810040 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1
Sep 12 17:50:06.810117 kernel: scsi host1: ahci
Sep 12 17:50:06.810188 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810
Sep 12 17:50:06.810259 kernel: scsi host2: ahci
Sep 12 17:50:06.810328 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller
Sep 12 17:50:06.810398 kernel: scsi host3: ahci
Sep 12 17:50:06.810467 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2
Sep 12 17:50:06.810536 kernel: scsi host4: ahci
Sep 12 17:50:06.810610 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed
Sep 12 17:50:06.810680 kernel: scsi host5: ahci
Sep 12 17:50:06.810747 kernel: hub 1-0:1.0: USB hub found
Sep 12 17:50:06.810837 kernel: scsi host6: ahci
Sep 12 17:50:06.810906 kernel: hub 1-0:1.0: 16 ports detected
Sep 12 17:50:06.810982 kernel: scsi host7: ahci
Sep 12 17:50:06.811049 kernel: hub 2-0:1.0: USB hub found
Sep 12 17:50:06.811130 kernel: ata1: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516100 irq 129 lpm-pol 0
Sep 12 17:50:06.811139 kernel: hub 2-0:1.0: 10 ports detected
Sep 12 17:50:06.811213 kernel: ata2: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516180 irq 129 lpm-pol 0
Sep 12 17:50:06.811222 kernel: ata3: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516200 irq 129 lpm-pol 0
Sep 12 17:50:06.811229 kernel: ata4: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516280 irq 129 lpm-pol 0
Sep 12 17:50:06.811236 kernel: ata5: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516300 irq 129 lpm-pol 0
Sep 12 17:50:06.811243 kernel: ata6: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516380 irq 129 lpm-pol 0
Sep 12 17:50:06.811250 kernel: igb: Intel(R) Gigabit Ethernet Network Driver
Sep 12 17:50:06.811259 kernel: ata7: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516400 irq 129 lpm-pol 0
Sep 12 17:50:06.811267 kernel: igb: Copyright (c) 2007-2014 Intel Corporation.
Sep 12 17:50:06.811274 kernel: ata8: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516480 irq 129 lpm-pol 0
Sep 12 17:50:06.624822 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:50:06.846994 kernel: igb 0000:04:00.0: added PHC on eth0
Sep 12 17:50:06.847098 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection
Sep 12 17:50:06.625503 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:50:06.868110 kernel: igb 0000:04:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:72:01:b8
Sep 12 17:50:06.868203 kernel: igb 0000:04:00.0: eth0: PBA No: 010000-000
Sep 12 17:50:06.868282 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s)
Sep 12 17:50:06.810050 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 17:50:06.891845 kernel: igb 0000:05:00.0: added PHC on eth1
Sep 12 17:50:06.891947 kernel: igb 0000:05:00.0: Intel(R) Gigabit Ethernet Network Connection
Sep 12 17:50:06.892026 kernel: igb 0000:05:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:72:01:b9
Sep 12 17:50:06.892102 kernel: igb 0000:05:00.0: eth1: PBA No: 010000-000
Sep 12 17:50:06.892175 kernel: igb 0000:05:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s)
Sep 12 17:50:06.927410 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:50:06.953979 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd
Sep 12 17:50:07.089266 kernel: hub 1-14:1.0: USB hub found
Sep 12 17:50:07.089955 kernel: hub 1-14:1.0: 4 ports detected
Sep 12 17:50:07.099832 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Sep 12 17:50:07.099851 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Sep 12 17:50:07.105851 kernel: ata7: SATA link down (SStatus 0 SControl 300)
Sep 12 17:50:07.111854 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300)
Sep 12 17:50:07.117826 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Sep 12 17:50:07.123855 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Sep 12 17:50:07.129852 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300)
Sep 12 17:50:07.135850 kernel: ata8: SATA link down (SStatus 0 SControl 300)
Sep 12 17:50:07.141863 kernel: ata1.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim
Sep 12 17:50:07.158153 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133
Sep 12 17:50:07.158850 kernel: ata2.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim
Sep 12 17:50:07.175325 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133
Sep 12 17:50:07.186855 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA
Sep 12 17:50:07.186872 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA
Sep 12 17:50:07.204858 kernel: ata1.00: Features: NCQ-prio
Sep 12 17:50:07.204875 kernel: ata2.00: Features: NCQ-prio
Sep 12 17:50:07.225827 kernel: ata1.00: configured for UDMA/133
Sep 12 17:50:07.225844 kernel: ata2.00: configured for UDMA/133
Sep 12 17:50:07.225852 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5
Sep 12 17:50:07.247850 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5
Sep 12 17:50:07.254801 kernel: igb 0000:05:00.0 eno2: renamed from eth1
Sep 12 17:50:07.254959 kernel: ata1.00: Enabling discard_zeroes_data
Sep 12 17:50:07.259320 kernel: igb 0000:04:00.0 eno1: renamed from eth0
Sep 12 17:50:07.259451 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB)
Sep 12 17:50:07.259572 kernel: ata2.00: Enabling discard_zeroes_data
Sep 12 17:50:07.259587 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB)
Sep 12 17:50:07.259697 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks
Sep 12 17:50:07.259823 kernel: sd 1:0:0:0: [sdb] Write Protect is off
Sep 12 17:50:07.259943 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00
Sep 12 17:50:07.260047 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 12 17:50:07.260154 kernel: sd 1:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes
Sep 12 17:50:07.260257 kernel: ata2.00: Enabling discard_zeroes_data
Sep 12 17:50:07.322552 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Sep 12 17:50:07.327763 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 12 17:50:07.332864 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk
Sep 12 17:50:07.332982 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00
Sep 12 17:50:07.333860 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 12 17:50:07.348909 kernel: sd 0:0:0:0: [sda] Preferred minimum I/O size 4096 bytes
Sep 12 17:50:07.354634 kernel: ata1.00: Enabling discard_zeroes_data
Sep 12 17:50:07.379451 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 12 17:50:07.379470 kernel: GPT:9289727 != 937703087
Sep 12 17:50:07.385727 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 12 17:50:07.392683 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd
Sep 12 17:50:07.392715 kernel: GPT:9289727 != 937703087
Sep 12 17:50:07.401994 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 12 17:50:07.407243 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:50:07.412250 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 12 17:50:07.492989 kernel: mlx5_core 0000:02:00.0: PTM is not supported by PCIe
Sep 12 17:50:07.493137 kernel: mlx5_core 0000:02:00.0: firmware version: 14.28.2006
Sep 12 17:50:07.502129 kernel: mlx5_core 0000:02:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link)
Sep 12 17:50:07.508814 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5300_MTFDDAK480TDT ROOT.
Sep 12 17:50:07.552900 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 12 17:50:07.552914 kernel: usbcore: registered new interface driver usbhid
Sep 12 17:50:07.552925 kernel: usbhid: USB HID core driver
Sep 12 17:50:07.552933 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0
Sep 12 17:50:07.528601 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5300_MTFDDAK480TDT EFI-SYSTEM.
Sep 12 17:50:07.568604 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5300_MTFDDAK480TDT USR-A.
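The GPT warnings above can be decoded with simple LBA arithmetic: a GPT backup header must sit in the disk's last sector, and the kernel prints "found != expected" when it does not. The numbers below come straight from this log (sda reports 937703088 512-byte sectors; the backup header was found at LBA 9289727, i.e. the table was written for a much smaller disk image that has not yet been resized to the full device); later in the log, disk-uuid.service rewrites the secondary header, which is why the warnings are harmless here.

```python
# Interpret the kernel's "GPT:9289727 != 937703087" messages using the
# sector count reported for sda earlier in the same log.
sectors = 937_703_088            # sd 0:0:0:0: [sda] 937703088 512-byte logical blocks
expected_backup_lba = sectors - 1  # backup GPT header belongs in the last LBA
found_backup_lba = 9_289_727       # where the flashed image actually put it

print(f"backup header at LBA {found_backup_lba}, expected LBA {expected_backup_lba}")

# Rough size of the original image the GPT was built for:
image_gib = (found_backup_lba + 1) * 512 / 2**30
print(f"GPT was laid out for a ~{image_gib:.1f} GiB image")
```

The same check is what `sgdisk` or GNU Parted perform when they offer to move the backup structures to the end of the disk.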
Sep 12 17:50:07.619875 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0
Sep 12 17:50:07.619984 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1
Sep 12 17:50:07.632357 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1
Sep 12 17:50:07.632360 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5300_MTFDDAK480TDT USR-A.
Sep 12 17:50:07.649790 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM.
Sep 12 17:50:07.674342 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 12 17:50:07.702998 disk-uuid[760]: Primary Header is updated.
Sep 12 17:50:07.702998 disk-uuid[760]: Secondary Entries is updated.
Sep 12 17:50:07.702998 disk-uuid[760]: Secondary Header is updated.
Sep 12 17:50:07.730859 kernel: ata1.00: Enabling discard_zeroes_data
Sep 12 17:50:07.730871 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:50:07.767851 kernel: mlx5_core 0000:02:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384)
Sep 12 17:50:07.778587 kernel: mlx5_core 0000:02:00.0: Port module event: module 0, Cable plugged
Sep 12 17:50:07.992861 kernel: mlx5_core 0000:02:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic)
Sep 12 17:50:08.010020 kernel: mlx5_core 0000:02:00.1: PTM is not supported by PCIe
Sep 12 17:50:08.010550 kernel: mlx5_core 0000:02:00.1: firmware version: 14.28.2006
Sep 12 17:50:08.010997 kernel: mlx5_core 0000:02:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link)
Sep 12 17:50:08.307843 kernel: mlx5_core 0000:02:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384)
Sep 12 17:50:08.320086 kernel: mlx5_core 0000:02:00.1: Port module event: module 1, Cable plugged
Sep 12 17:50:08.566882 kernel: mlx5_core 0000:02:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic)
Sep 12 17:50:08.579797 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: renamed from eth0
Sep 12 17:50:08.579916 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: renamed from eth1
Sep 12 17:50:08.598915 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:50:08.608409 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:50:08.626937 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:50:08.644938 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:50:08.664324 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 12 17:50:08.702995 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:50:08.734379 kernel: ata1.00: Enabling discard_zeroes_data
Sep 12 17:50:08.755698 disk-uuid[761]: The operation has completed successfully.
Sep 12 17:50:08.762905 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:50:08.794091 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 12 17:50:08.794142 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 12 17:50:08.835580 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 12 17:50:08.869937 sh[811]: Success
Sep 12 17:50:08.899134 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 12 17:50:08.899158 kernel: device-mapper: uevent: version 1.0.3
Sep 12 17:50:08.908374 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 12 17:50:08.921798 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
Sep 12 17:50:08.964264 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 12 17:50:08.975048 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 12 17:50:09.007846 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 12 17:50:09.054896 kernel: BTRFS: device fsid 74707491-1b86-4926-8bdb-c533ce2a0c32 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (824)
Sep 12 17:50:09.054909 kernel: BTRFS info (device dm-0): first mount of filesystem 74707491-1b86-4926-8bdb-c533ce2a0c32
Sep 12 17:50:09.054916 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:50:09.069978 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 12 17:50:09.069995 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 12 17:50:09.076099 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 12 17:50:09.078242 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 12 17:50:09.086139 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 17:50:09.094073 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 12 17:50:09.094510 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 12 17:50:09.142136 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 12 17:50:09.198746 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (847)
Sep 12 17:50:09.198781 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:50:09.198789 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:50:09.198799 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:50:09.200863 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 17:50:09.200899 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 17:50:09.223800 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:50:09.228957 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 12 17:50:09.239519 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 12 17:50:09.298061 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:50:09.308971 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 17:50:09.360675 systemd-networkd[994]: lo: Link UP
Sep 12 17:50:09.360679 systemd-networkd[994]: lo: Gained carrier
Sep 12 17:50:09.363493 systemd-networkd[994]: Enumeration completed
Sep 12 17:50:09.374141 ignition[893]: Ignition 2.21.0
Sep 12 17:50:09.363564 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 17:50:09.374326 ignition[893]: Stage: fetch-offline
Sep 12 17:50:09.364197 systemd-networkd[994]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:50:09.374365 ignition[893]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:50:09.375993 systemd[1]: Reached target network.target - Network.
Sep 12 17:50:09.374372 ignition[893]: no config dir at "/usr/lib/ignition/base.platform.d/packet"
Sep 12 17:50:09.376461 unknown[893]: fetched base config from "system"
Sep 12 17:50:09.374438 ignition[893]: parsed url from cmdline: ""
Sep 12 17:50:09.376465 unknown[893]: fetched user config from "system"
Sep 12 17:50:09.374441 ignition[893]: no config URL provided
Sep 12 17:50:09.393905 systemd-networkd[994]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:50:09.374445 ignition[893]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:50:09.395202 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:50:09.374482 ignition[893]: parsing config with SHA512: 15dca1f29b99a2008035c0a57a035bf3bca44103198489169812372dca5ffa5d9a0728a44f7c46b88ed1a794ffb811a916f17012712534b02ae6e7d2e7a7dcc0
Sep 12 17:50:09.417191 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 12 17:50:09.376675 ignition[893]: fetch-offline: fetch-offline passed
Sep 12 17:50:09.417746 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 12 17:50:09.376678 ignition[893]: POST message to Packet Timeline
Sep 12 17:50:09.421779 systemd-networkd[994]: enp2s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:50:09.376681 ignition[893]: POST Status error: resource requires networking
Sep 12 17:50:09.585981 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: Link up
Sep 12 17:50:09.376712 ignition[893]: Ignition finished successfully
Sep 12 17:50:09.471959 ignition[1011]: Ignition 2.21.0
Sep 12 17:50:09.589756 systemd-networkd[994]: enp2s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:50:09.471964 ignition[1011]: Stage: kargs
Sep 12 17:50:09.472050 ignition[1011]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:50:09.472056 ignition[1011]: no config dir at "/usr/lib/ignition/base.platform.d/packet"
Sep 12 17:50:09.472542 ignition[1011]: kargs: kargs passed
Sep 12 17:50:09.472546 ignition[1011]: POST message to Packet Timeline
Sep 12 17:50:09.472557 ignition[1011]: GET https://metadata.packet.net/metadata: attempt #1
Sep 12 17:50:09.473110 ignition[1011]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:51479->[::1]:53: read: connection refused
Sep 12 17:50:09.673257 ignition[1011]: GET https://metadata.packet.net/metadata: attempt #2
Sep 12 17:50:09.674472 ignition[1011]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:41704->[::1]:53: read: connection refused
Sep 12 17:50:09.793891 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: Link up
Sep 12 17:50:09.798327 systemd-networkd[994]: eno1: Link UP
Sep 12 17:50:09.798746 systemd-networkd[994]: eno2: Link UP
Sep 12 17:50:09.799178 systemd-networkd[994]: enp2s0f0np0: Link UP
Sep 12 17:50:09.799651 systemd-networkd[994]: enp2s0f0np0: Gained carrier
Sep 12 17:50:09.815321 systemd-networkd[994]: enp2s0f1np1: Link UP
Sep 12 17:50:09.816671 systemd-networkd[994]: enp2s0f1np1: Gained carrier
Sep 12 17:50:09.849967 systemd-networkd[994]: enp2s0f0np0: DHCPv4 address 139.178.94.149/31, gateway 139.178.94.148 acquired from 145.40.83.140
Sep 12 17:50:10.074628 ignition[1011]: GET https://metadata.packet.net/metadata: attempt #3
Sep 12 17:50:10.075669 ignition[1011]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:52806->[::1]:53: read: connection refused
Sep 12 17:50:10.840022 systemd-networkd[994]: enp2s0f0np0: Gained IPv6LL
Sep 12 17:50:10.876094 ignition[1011]: GET https://metadata.packet.net/metadata: attempt #4
Sep 12 17:50:10.877151 ignition[1011]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:60021->[::1]:53: read: connection refused
Sep 12 17:50:10.968042 systemd-networkd[994]: enp2s0f1np1: Gained IPv6LL
Sep 12 17:50:12.478844 ignition[1011]: GET https://metadata.packet.net/metadata: attempt #5
Sep 12 17:50:12.479743 ignition[1011]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:38644->[::1]:53: read: connection refused
Sep 12 17:50:15.682448 ignition[1011]: GET https://metadata.packet.net/metadata: attempt #6
Sep 12 17:50:16.693829 ignition[1011]: GET result: OK
Sep 12 17:50:17.273460 ignition[1011]: Ignition finished successfully
Sep 12 17:50:17.279322 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 12 17:50:17.290705 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 12 17:50:17.333735 ignition[1030]: Ignition 2.21.0
Sep 12 17:50:17.333741 ignition[1030]: Stage: disks
Sep 12 17:50:17.333843 ignition[1030]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:50:17.333850 ignition[1030]: no config dir at "/usr/lib/ignition/base.platform.d/packet"
Sep 12 17:50:17.334962 ignition[1030]: disks: disks passed
Sep 12 17:50:17.334967 ignition[1030]: POST message to Packet Timeline
Sep 12 17:50:17.334981 ignition[1030]: GET https://metadata.packet.net/metadata: attempt #1
Sep 12 17:50:18.490882 ignition[1030]: GET result: OK
Sep 12 17:50:19.132735 ignition[1030]: Ignition finished successfully
Sep 12 17:50:19.136941 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 12 17:50:19.149950 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 12 17:50:19.168036 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
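The Ignition GET failures above are expected: the requests start before systemd-networkd has brought the links up and DHCP has completed, so DNS lookups against the local stub on [::1]:53 are refused until the network is ready. The attempt timestamps (roughly 0.2 s, 0.4 s, 0.8 s, 1.6 s, 3.2 s apart) show a doubling backoff between retries. A minimal sketch of that retry pattern, with a hypothetical `flaky` stand-in for the metadata endpoint (names and delays are illustrative, not Ignition's actual implementation):

```python
import time

def fetch_with_backoff(fetch, attempts=6, base_delay=0.2):
    """Retry `fetch` with exponentially growing delays (0.2s, 0.4s, 0.8s, ...),
    mirroring the doubling intervals between the GET attempts in the log."""
    for attempt in range(1, attempts + 1):
        try:
            return fetch()
        except OSError:
            if attempt == attempts:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical endpoint that refuses connections twice, then succeeds,
# like metadata.packet.net once DHCP finished.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("connection refused")
    return "OK"

result = fetch_with_backoff(flaky, base_delay=0.01)
print(result)  # -> OK, on the third attempt
```

The key design point visible in the log is that transient failures are absorbed by the retry loop rather than failing the boot stage.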
Sep 12 17:50:19.186992 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:50:19.203990 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 17:50:19.219989 systemd[1]: Reached target basic.target - Basic System.
Sep 12 17:50:19.238547 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 12 17:50:19.287994 systemd-fsck[1051]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 12 17:50:19.297296 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 12 17:50:19.311635 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 12 17:50:19.466672 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 12 17:50:19.480036 kernel: EXT4-fs (sda9): mounted filesystem 26739aba-b0be-4ce3-bfbd-ca4dbcbe2426 r/w with ordered data mode. Quota mode: none.
Sep 12 17:50:19.474687 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:50:19.491773 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:50:19.512171 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 12 17:50:19.543105 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1060)
Sep 12 17:50:19.543123 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:50:19.519573 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 12 17:50:19.570231 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:50:19.570243 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:50:19.570251 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 17:50:19.570258 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 17:50:19.579268 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent...
Sep 12 17:50:19.599975 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 12 17:50:19.599997 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:50:19.619030 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 17:50:19.664995 coreos-metadata[1078]: Sep 12 17:50:19.651 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Sep 12 17:50:19.642045 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 12 17:50:19.692999 coreos-metadata[1062]: Sep 12 17:50:19.651 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Sep 12 17:50:19.659010 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 12 17:50:19.715928 initrd-setup-root[1092]: cut: /sysroot/etc/passwd: No such file or directory
Sep 12 17:50:19.725923 initrd-setup-root[1099]: cut: /sysroot/etc/group: No such file or directory
Sep 12 17:50:19.735876 initrd-setup-root[1106]: cut: /sysroot/etc/shadow: No such file or directory
Sep 12 17:50:19.744910 initrd-setup-root[1113]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 12 17:50:19.789062 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 12 17:50:19.790076 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 12 17:50:19.806796 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 12 17:50:19.840945 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 12 17:50:19.855840 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:50:19.862042 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 12 17:50:19.878098 ignition[1181]: INFO : Ignition 2.21.0
Sep 12 17:50:19.878098 ignition[1181]: INFO : Stage: mount
Sep 12 17:50:19.878098 ignition[1181]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:50:19.878098 ignition[1181]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Sep 12 17:50:19.878098 ignition[1181]: INFO : mount: mount passed
Sep 12 17:50:19.878098 ignition[1181]: INFO : POST message to Packet Timeline
Sep 12 17:50:19.878098 ignition[1181]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Sep 12 17:50:20.634780 coreos-metadata[1078]: Sep 12 17:50:20.634 INFO Fetch successful
Sep 12 17:50:20.674635 systemd[1]: flatcar-static-network.service: Deactivated successfully.
Sep 12 17:50:20.674695 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent.
Sep 12 17:50:20.790822 coreos-metadata[1062]: Sep 12 17:50:20.790 INFO Fetch successful
Sep 12 17:50:20.837685 ignition[1181]: INFO : GET result: OK
Sep 12 17:50:20.864579 coreos-metadata[1062]: Sep 12 17:50:20.864 INFO wrote hostname ci-4426.1.0-a-b1d4eb1a76 to /sysroot/etc/hostname
Sep 12 17:50:20.865842 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 17:50:21.657530 ignition[1181]: INFO : Ignition finished successfully
Sep 12 17:50:21.661348 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 12 17:50:21.677921 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 12 17:50:21.702417 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:50:21.746797 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1206)
Sep 12 17:50:21.764138 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:50:21.764154 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:50:21.779754 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:50:21.779774 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 17:50:21.785862 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 17:50:21.787631 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 17:50:21.820369 ignition[1223]: INFO : Ignition 2.21.0
Sep 12 17:50:21.820369 ignition[1223]: INFO : Stage: files
Sep 12 17:50:21.833022 ignition[1223]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:50:21.833022 ignition[1223]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Sep 12 17:50:21.833022 ignition[1223]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 17:50:21.833022 ignition[1223]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 17:50:21.833022 ignition[1223]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 17:50:21.833022 ignition[1223]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 17:50:21.833022 ignition[1223]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 17:50:21.833022 ignition[1223]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 17:50:21.833022 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 12 17:50:21.833022 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Sep 12 17:50:21.824608 unknown[1223]: wrote ssh authorized keys file for user: core
Sep 12 17:50:21.954931 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 17:50:23.120715 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 12 17:50:23.137023 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Sep 12 17:50:23.639127 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 17:50:23.924993 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 12 17:50:23.924993 ignition[1223]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 12 17:50:23.951943 ignition[1223]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 17:50:23.951943 ignition[1223]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 17:50:23.951943 ignition[1223]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 12 17:50:23.951943 ignition[1223]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 17:50:23.951943 ignition[1223]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 17:50:23.951943 ignition[1223]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 17:50:23.951943
ignition[1223]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:50:23.951943 ignition[1223]: INFO : files: files passed Sep 12 17:50:23.951943 ignition[1223]: INFO : POST message to Packet Timeline Sep 12 17:50:23.951943 ignition[1223]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 12 17:50:24.859346 ignition[1223]: INFO : GET result: OK Sep 12 17:50:26.155153 ignition[1223]: INFO : Ignition finished successfully Sep 12 17:50:26.158680 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 17:50:26.175101 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 17:50:26.197423 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 17:50:26.208273 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 17:50:26.208339 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 17:50:26.236298 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:50:26.256058 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 17:50:26.276081 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 17:50:26.294016 initrd-setup-root-after-ignition[1265]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:50:26.294016 initrd-setup-root-after-ignition[1265]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:50:26.316981 initrd-setup-root-after-ignition[1269]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:50:26.360050 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 17:50:26.360140 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Sep 12 17:50:26.379237 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 17:50:26.398027 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 17:50:26.416199 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 17:50:26.418552 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 17:50:26.498115 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 17:50:26.501929 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 12 17:50:26.566674 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:50:26.577331 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:50:26.596396 systemd[1]: Stopped target timers.target - Timer Units.
Sep 12 17:50:26.614357 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 12 17:50:26.614698 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 17:50:26.640464 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 12 17:50:26.659319 systemd[1]: Stopped target basic.target - Basic System.
Sep 12 17:50:26.677316 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 12 17:50:26.693321 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:50:26.712316 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 12 17:50:26.731319 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 17:50:26.750320 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 12 17:50:26.768315 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:50:26.787351 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 12 17:50:26.806338 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 12 17:50:26.824319 systemd[1]: Stopped target swap.target - Swaps.
Sep 12 17:50:26.840266 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 12 17:50:26.840609 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:50:26.865398 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:50:26.883340 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:50:26.902203 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 12 17:50:26.902658 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:50:26.922216 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 12 17:50:26.922550 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:50:26.951343 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 12 17:50:26.951731 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:50:26.969478 systemd[1]: Stopped target paths.target - Path Units.
Sep 12 17:50:26.985195 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 12 17:50:26.985636 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:50:27.004314 systemd[1]: Stopped target slices.target - Slice Units.
Sep 12 17:50:27.022319 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 12 17:50:27.038278 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 12 17:50:27.038540 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:50:27.056311 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 12 17:50:27.056567 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:50:27.077410 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 12 17:50:27.077755 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 17:50:27.094388 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 12 17:50:27.198992 ignition[1291]: INFO : Ignition 2.21.0
Sep 12 17:50:27.198992 ignition[1291]: INFO : Stage: umount
Sep 12 17:50:27.198992 ignition[1291]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:50:27.198992 ignition[1291]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Sep 12 17:50:27.198992 ignition[1291]: INFO : umount: umount passed
Sep 12 17:50:27.198992 ignition[1291]: INFO : POST message to Packet Timeline
Sep 12 17:50:27.198992 ignition[1291]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Sep 12 17:50:27.094720 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 12 17:50:27.110386 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 12 17:50:27.110724 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 17:50:27.129503 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 12 17:50:27.142416 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 12 17:50:27.162906 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 12 17:50:27.163088 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:50:27.191139 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 12 17:50:27.191268 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:50:27.222596 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 12 17:50:27.223207 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 12 17:50:27.223266 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 12 17:50:27.240902 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 12 17:50:27.240978 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 12 17:50:28.225237 ignition[1291]: INFO : GET result: OK
Sep 12 17:50:28.599706 ignition[1291]: INFO : Ignition finished successfully
Sep 12 17:50:28.603520 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 12 17:50:28.603815 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 12 17:50:28.616904 systemd[1]: Stopped target network.target - Network.
Sep 12 17:50:28.630065 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 12 17:50:28.630246 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 12 17:50:28.648151 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 12 17:50:28.648292 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 12 17:50:28.664080 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 12 17:50:28.664237 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 12 17:50:28.680076 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 12 17:50:28.680209 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 12 17:50:28.699068 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 12 17:50:28.699225 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 12 17:50:28.715362 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 12 17:50:28.733275 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 12 17:50:28.749897 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 12 17:50:28.750172 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 12 17:50:28.772037 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 12 17:50:28.772179 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 17:50:28.772232 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 12 17:50:28.793821 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 12 17:50:28.794306 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 12 17:50:28.820135 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 12 17:50:28.820186 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:50:28.840916 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 12 17:50:28.862917 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 12 17:50:28.862951 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:50:28.863034 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 12 17:50:28.863060 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:50:28.891103 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 12 17:50:28.891161 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:50:28.911044 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 12 17:50:28.911182 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:50:28.931372 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:50:28.953073 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 12 17:50:28.953251 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 12 17:50:28.954311 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 12 17:50:28.954658 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:50:28.973447 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 12 17:50:28.973593 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:50:28.989121 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 12 17:50:28.989227 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:50:28.997212 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 12 17:50:28.997354 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:50:29.042001 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 12 17:50:29.042145 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:50:29.067973 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 17:50:29.068126 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:50:29.108002 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 12 17:50:29.132828 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 12 17:50:29.388941 systemd-journald[299]: Received SIGTERM from PID 1 (systemd).
Sep 12 17:50:29.132864 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:50:29.153219 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 12 17:50:29.153275 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:50:29.173240 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 12 17:50:29.173348 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:50:29.194488 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 12 17:50:29.194624 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:50:29.212211 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:50:29.212350 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:50:29.235552 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 12 17:50:29.235706 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 12 17:50:29.235834 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 12 17:50:29.235944 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 17:50:29.237287 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 12 17:50:29.237624 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 12 17:50:29.252610 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 12 17:50:29.252856 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 12 17:50:29.273856 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 12 17:50:29.293110 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 12 17:50:29.348212 systemd[1]: Switching root.
Sep 12 17:50:29.515117 systemd-journald[299]: Journal stopped
Sep 12 17:50:31.261077 kernel: SELinux: policy capability network_peer_controls=1
Sep 12 17:50:31.261094 kernel: SELinux: policy capability open_perms=1
Sep 12 17:50:31.261101 kernel: SELinux: policy capability extended_socket_class=1
Sep 12 17:50:31.261106 kernel: SELinux: policy capability always_check_network=0
Sep 12 17:50:31.261112 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 12 17:50:31.261117 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 12 17:50:31.261123 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 12 17:50:31.261130 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 12 17:50:31.261135 kernel: SELinux: policy capability userspace_initial_context=0
Sep 12 17:50:31.261142 kernel: audit: type=1403 audit(1757699429.634:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 12 17:50:31.261148 systemd[1]: Successfully loaded SELinux policy in 91.914ms.
Sep 12 17:50:31.261155 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.854ms.
Sep 12 17:50:31.261162 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 17:50:31.261168 systemd[1]: Detected architecture x86-64.
Sep 12 17:50:31.261176 systemd[1]: Detected first boot.
Sep 12 17:50:31.261182 systemd[1]: Hostname set to .
Sep 12 17:50:31.261189 systemd[1]: Initializing machine ID from random generator.
Sep 12 17:50:31.261195 zram_generator::config[1347]: No configuration found.
Sep 12 17:50:31.261203 systemd[1]: Populated /etc with preset unit settings.
Sep 12 17:50:31.261211 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 12 17:50:31.261217 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 12 17:50:31.261223 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 12 17:50:31.261229 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 12 17:50:31.261236 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 12 17:50:31.261242 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 12 17:50:31.261250 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 12 17:50:31.261256 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 12 17:50:31.261263 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 12 17:50:31.261270 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 12 17:50:31.261277 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 12 17:50:31.261283 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 12 17:50:31.261290 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:50:31.261298 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:50:31.261306 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 17:50:31.261312 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 17:50:31.261319 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 17:50:31.261325 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:50:31.261332 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1...
Sep 12 17:50:31.261338 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:50:31.261345 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:50:31.261353 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 17:50:31.261360 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 17:50:31.261367 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:50:31.261374 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 17:50:31.261381 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:50:31.261387 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:50:31.261394 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:50:31.261401 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:50:31.261407 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 17:50:31.261415 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 17:50:31.261422 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 12 17:50:31.261429 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:50:31.261436 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:50:31.261443 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:50:31.261450 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 12 17:50:31.261457 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 17:50:31.261464 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 17:50:31.261470 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 17:50:31.261477 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:50:31.261484 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 17:50:31.261491 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 17:50:31.261497 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 17:50:31.261506 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 17:50:31.261512 systemd[1]: Reached target machines.target - Containers.
Sep 12 17:50:31.261519 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 17:50:31.261526 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:50:31.261533 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:50:31.261539 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 17:50:31.261546 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:50:31.261553 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 17:50:31.261561 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:50:31.261568 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 17:50:31.261574 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:50:31.261581 kernel: ACPI: bus type drm_connector registered
Sep 12 17:50:31.261588 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 17:50:31.261595 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 17:50:31.261602 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 17:50:31.261608 kernel: loop: module loaded
Sep 12 17:50:31.261614 kernel: fuse: init (API version 7.41)
Sep 12 17:50:31.261622 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 17:50:31.261629 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 17:50:31.261636 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 17:50:31.261642 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:50:31.261649 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:50:31.261656 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:50:31.261662 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 17:50:31.261680 systemd-journald[1450]: Collecting audit messages is disabled.
Sep 12 17:50:31.261696 systemd-journald[1450]: Journal started
Sep 12 17:50:31.261710 systemd-journald[1450]: Runtime Journal (/run/log/journal/5d1e89821b7244f2b89673f493c8a087) is 8M, max 639.3M, 631.3M free.
Sep 12 17:50:30.113227 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 17:50:30.125713 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 12 17:50:30.125951 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 17:50:31.288863 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 12 17:50:31.300835 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:50:31.324875 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 17:50:31.324918 systemd[1]: Stopped verity-setup.service.
Sep 12 17:50:31.353835 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:50:31.361842 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:50:31.370324 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 17:50:31.379933 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 17:50:31.389931 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 17:50:31.399934 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 17:50:31.409062 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 17:50:31.419040 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 17:50:31.428110 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 12 17:50:31.439101 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:50:31.450093 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 17:50:31.450210 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 17:50:31.460112 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:50:31.460238 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:50:31.470158 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 17:50:31.470318 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 17:50:31.479229 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:50:31.479436 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:50:31.491388 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 17:50:31.491677 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 17:50:31.501721 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 17:50:31.502235 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 17:50:31.512784 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:50:31.523769 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:50:31.535757 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 17:50:31.547751 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 12 17:50:31.559822 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:50:31.593014 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:50:31.605955 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 17:50:31.624467 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 17:50:31.634060 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 17:50:31.634152 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:50:31.645854 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 12 17:50:31.659774 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 17:50:31.669304 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:50:31.681807 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 17:50:31.691604 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 17:50:31.701898 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 17:50:31.712998 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 17:50:31.717617 systemd-journald[1450]: Time spent on flushing to /var/log/journal/5d1e89821b7244f2b89673f493c8a087 is 12.454ms for 1424 entries.
Sep 12 17:50:31.717617 systemd-journald[1450]: System Journal (/var/log/journal/5d1e89821b7244f2b89673f493c8a087) is 8M, max 195.6M, 187.6M free.
Sep 12 17:50:31.741536 systemd-journald[1450]: Received client request to flush runtime journal.
Sep 12 17:50:31.729936 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 17:50:31.737089 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:50:31.754302 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 17:50:31.772051 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:50:31.783864 kernel: loop0: detected capacity change from 0 to 128016
Sep 12 17:50:31.790595 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 17:50:31.801446 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 17:50:31.811801 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 17:50:31.815767 systemd-tmpfiles[1489]: ACLs are not supported, ignoring.
Sep 12 17:50:31.815777 systemd-tmpfiles[1489]: ACLs are not supported, ignoring.
Sep 12 17:50:31.817112 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 17:50:31.827112 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 17:50:31.838072 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:50:31.847085 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:50:31.858828 kernel: loop1: detected capacity change from 0 to 111000
Sep 12 17:50:31.864469 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 17:50:31.874601 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 12 17:50:31.901073 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 17:50:31.914021 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 17:50:31.914405 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 12 17:50:31.927800 kernel: loop2: detected capacity change from 0 to 8
Sep 12 17:50:31.935274 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 17:50:31.946043 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:50:31.962812 kernel: loop3: detected capacity change from 0 to 221472
Sep 12 17:50:31.977268 systemd-tmpfiles[1506]: ACLs are not supported, ignoring.
Sep 12 17:50:31.977278 systemd-tmpfiles[1506]: ACLs are not supported, ignoring.
Sep 12 17:50:31.979201 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:50:32.002846 kernel: loop4: detected capacity change from 0 to 128016
Sep 12 17:50:32.022872 kernel: loop5: detected capacity change from 0 to 111000
Sep 12 17:50:32.043870 kernel: loop6: detected capacity change from 0 to 8
Sep 12 17:50:32.050839 kernel: loop7: detected capacity change from 0 to 221472
Sep 12 17:50:32.063099 ldconfig[1480]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 17:50:32.064429 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 17:50:32.065585 (sd-merge)[1510]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'.
Sep 12 17:50:32.065964 (sd-merge)[1510]: Merged extensions into '/usr'.
Sep 12 17:50:32.074807 systemd[1]: Reload requested from client PID 1486 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 17:50:32.074816 systemd[1]: Reloading...
Sep 12 17:50:32.100903 zram_generator::config[1535]: No configuration found.
Sep 12 17:50:32.228789 systemd[1]: Reloading finished in 153 ms.
Sep 12 17:50:32.246620 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 17:50:32.257144 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 17:50:32.277637 systemd[1]: Starting ensure-sysext.service...
Sep 12 17:50:32.285680 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:50:32.306521 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:50:32.317447 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 12 17:50:32.317470 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 12 17:50:32.317634 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 17:50:32.317805 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 17:50:32.318388 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 17:50:32.318568 systemd-tmpfiles[1593]: ACLs are not supported, ignoring.
Sep 12 17:50:32.318606 systemd-tmpfiles[1593]: ACLs are not supported, ignoring.
Sep 12 17:50:32.321118 systemd-tmpfiles[1593]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 17:50:32.321123 systemd-tmpfiles[1593]: Skipping /boot
Sep 12 17:50:32.325782 systemd-tmpfiles[1593]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 17:50:32.325787 systemd-tmpfiles[1593]: Skipping /boot
Sep 12 17:50:32.331065 systemd[1]: Reload requested from client PID 1592 ('systemctl') (unit ensure-sysext.service)...
Sep 12 17:50:32.331073 systemd[1]: Reloading...
Sep 12 17:50:32.341280 systemd-udevd[1594]: Using default interface naming scheme 'v255'.
Sep 12 17:50:32.358839 zram_generator::config[1621]: No configuration found.
Sep 12 17:50:32.416628 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2
Sep 12 17:50:32.416678 kernel: ACPI: button: Sleep Button [SLPB]
Sep 12 17:50:32.424767 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 12 17:50:32.433801 kernel: ACPI: button: Power Button [PWRF]
Sep 12 17:50:32.433859 kernel: mousedev: PS/2 mouse device common for all mice
Sep 12 17:50:32.447803 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface
Sep 12 17:50:32.448018 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface
Sep 12 17:50:32.461805 kernel: IPMI message handler: version 39.2
Sep 12 17:50:32.474804 kernel: ACPI: video: Video Device [GFX0] (multi-head: yes rom: no post: no)
Sep 12 17:50:32.480896 kernel: ipmi device interface
Sep 12 17:50:32.480956 kernel: MACsec IEEE 802.1AE
Sep 12 17:50:32.480971 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set
Sep 12 17:50:32.481799 kernel: input: Video Bus as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0A08:00/LNXVIDEO:00/input/input4
Sep 12 17:50:32.505547 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt
Sep 12 17:50:32.520806 kernel: iTCO_vendor_support: vendor-support=0
Sep 12 17:50:32.520884 kernel: ipmi_si: IPMI System Interface driver
Sep 12 17:50:32.531972 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS
Sep 12 17:50:32.532541 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0
Sep 12 17:50:32.545656 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine
Sep 12 17:50:32.551899 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI
Sep 12 17:50:32.560263 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0
Sep 12 17:50:32.575545 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI
Sep 12 17:50:32.575954 kernel: ipmi_si: Adding ACPI-specified kcs state machine
Sep 12 17:50:32.575996 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0
Sep 12 17:50:32.580919 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped.
Sep 12 17:50:32.581145 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM.
Sep 12 17:50:32.595930 systemd[1]: Reloading finished in 264 ms.
Sep 12 17:50:32.610807 kernel: iTCO_wdt iTCO_wdt: unable to reset NO_REBOOT flag, device disabled by hardware/BIOS
Sep 12 17:50:32.619847 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:50:32.652017 kernel: intel_rapl_common: Found RAPL domain package
Sep 12 17:50:32.652070 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed.
Sep 12 17:50:32.652187 kernel: intel_rapl_common: Found RAPL domain core
Sep 12 17:50:32.652200 kernel: intel_rapl_common: Found RAPL domain uncore
Sep 12 17:50:32.652211 kernel: intel_rapl_common: Found RAPL domain dram
Sep 12 17:50:32.674983 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:50:32.695851 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b11, dev_id: 0x20)
Sep 12 17:50:32.710461 systemd[1]: Finished ensure-sysext.service.
Sep 12 17:50:32.742313 systemd[1]: Reached target tpm2.target - Trusted Platform Module.
Sep 12 17:50:32.750850 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:50:32.751539 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 17:50:32.777832 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized
Sep 12 17:50:33.009799 kernel: ipmi_ssif: IPMI SSIF Interface driver
Sep 12 17:50:33.009866 kernel: i915 0000:00:02.0: can't derive routing for PCI INT A
Sep 12 17:50:33.021166 kernel: i915 0000:00:02.0: PCI INT A: not connected
Sep 12 17:50:33.030952 kernel: i915 0000:00:02.0: [drm] Found COFFEELAKE (device ID 3e9a) display version 9.00 stepping N/A
Sep 12 17:50:33.034407 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 17:50:33.044608 kernel: i915 0000:00:02.0: [drm] VT-d active for gfx access
Sep 12 17:50:33.044757 kernel: i915 0000:00:02.0: [drm] Using Transparent Hugepages
Sep 12 17:50:33.056738 augenrules[1826]: No rules
Sep 12 17:50:33.064367 kernel: i915 0000:00:02.0: ROM [??? 0x00000000 flags 0x20000000]: can't assign; bogus alignment
Sep 12 17:50:33.064484 kernel: i915 0000:00:02.0: [drm] Failed to find VBIOS tables (VBT)
Sep 12 17:50:33.078796 kernel: i915 0000:00:02.0: [drm] Finished loading DMC firmware i915/kbl_dmc_ver1_04.bin (v1.4)
Sep 12 17:50:33.079987 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:50:33.090235 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:50:33.100419 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 17:50:33.110369 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:50:33.121399 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:50:33.130920 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:50:33.131463 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 12 17:50:33.141833 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 17:50:33.142476 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 17:50:33.153864 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 17:50:33.154932 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:50:33.163836 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 12 17:50:33.180426 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 17:50:33.198925 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:50:33.208912 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:50:33.209689 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 17:50:33.209825 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 17:50:33.220338 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 12 17:50:33.220525 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:50:33.220634 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:50:33.220828 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 17:50:33.220935 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 17:50:33.221118 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:50:33.221224 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:50:33.221405 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 17:50:33.221510 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 17:50:33.221699 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 12 17:50:33.221897 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 17:50:33.226283 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 17:50:33.226359 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 17:50:33.227111 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 17:50:33.228037 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 17:50:33.228065 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 12 17:50:33.228307 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 17:50:33.248481 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 12 17:50:33.267034 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 12 17:50:33.307820 systemd-resolved[1839]: Positive Trust Anchors:
Sep 12 17:50:33.307826 systemd-resolved[1839]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:50:33.307853 systemd-resolved[1839]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:50:33.310706 systemd-resolved[1839]: Using system hostname 'ci-4426.1.0-a-b1d4eb1a76'.
Sep 12 17:50:33.312888 systemd-networkd[1838]: lo: Link UP
Sep 12 17:50:33.312892 systemd-networkd[1838]: lo: Gained carrier
Sep 12 17:50:33.316434 systemd-networkd[1838]: bond0: netdev ready
Sep 12 17:50:33.317587 systemd-networkd[1838]: Enumeration completed
Sep 12 17:50:33.318657 systemd-networkd[1838]: enp2s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:8f:9a:0e.network.
Sep 12 17:50:33.325359 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 12 17:50:33.335134 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:50:33.343873 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 17:50:33.353026 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:50:33.365138 systemd[1]: Reached target network.target - Network.
Sep 12 17:50:33.371831 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:50:33.382837 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 17:50:33.391899 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 12 17:50:33.401848 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 12 17:50:33.412838 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 12 17:50:33.423845 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 12 17:50:33.434834 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 12 17:50:33.434855 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:50:33.446796 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: Link up
Sep 12 17:50:33.447838 systemd[1]: Reached target time-set.target - System Time Set.
Sep 12 17:50:33.459797 kernel: bond0: (slave enp2s0f0np0): Enslaving as a backup interface with an up link
Sep 12 17:50:33.460126 systemd-networkd[1838]: enp2s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:8f:9a:0f.network.
Sep 12 17:50:33.463931 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 12 17:50:33.473879 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 12 17:50:33.484827 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:50:33.493343 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 12 17:50:33.503682 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 12 17:50:33.513070 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 12 17:50:33.524911 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 12 17:50:33.534015 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 12 17:50:33.545578 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 12 17:50:33.556421 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 12 17:50:33.567155 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 12 17:50:33.576390 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:50:33.587795 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: Link up
Sep 12 17:50:33.598399 systemd[1]: Reached target basic.target - Basic System.
Sep 12 17:50:33.598794 kernel: bond0: (slave enp2s0f1np1): Enslaving as a backup interface with an up link
Sep 12 17:50:33.599049 systemd-networkd[1838]: bond0: Configuring with /etc/systemd/network/05-bond0.network.
Sep 12 17:50:33.599930 systemd-networkd[1838]: enp2s0f0np0: Link UP
Sep 12 17:50:33.600077 systemd-networkd[1838]: enp2s0f0np0: Gained carrier
Sep 12 17:50:33.609794 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond
Sep 12 17:50:33.614873 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 12 17:50:33.614894 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 12 17:50:33.615463 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 12 17:50:33.619107 systemd-networkd[1838]: enp2s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:8f:9a:0e.network.
Sep 12 17:50:33.619260 systemd-networkd[1838]: enp2s0f1np1: Link UP
Sep 12 17:50:33.619405 systemd-networkd[1838]: enp2s0f1np1: Gained carrier
Sep 12 17:50:33.628894 systemd-networkd[1838]: bond0: Link UP
Sep 12 17:50:33.629049 systemd-networkd[1838]: bond0: Gained carrier
Sep 12 17:50:33.629154 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection.
Sep 12 17:50:33.629462 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection.
Sep 12 17:50:33.629642 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection.
Sep 12 17:50:33.629728 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection.
Sep 12 17:50:33.634255 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 12 17:50:33.644397 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 12 17:50:33.652404 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 17:50:33.658832 coreos-metadata[1878]: Sep 12 17:50:33.658 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Sep 12 17:50:33.678900 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 12 17:50:33.695920 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 12 17:50:33.697923 jq[1884]: false
Sep 12 17:50:33.704835 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 12 17:50:33.705449 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 12 17:50:33.710196 extend-filesystems[1885]: Found /dev/sda6
Sep 12 17:50:33.733926 kernel: bond0: (slave enp2s0f0np0): link status definitely up, 10000 Mbps full duplex
Sep 12 17:50:33.733952 kernel: bond0: active interface up!
Sep 12 17:50:33.733970 kernel: EXT4-fs (sda9): resizing filesystem from 553472 to 116605649 blocks
Sep 12 17:50:33.733983 extend-filesystems[1885]: Found /dev/sda9
Sep 12 17:50:33.733983 extend-filesystems[1885]: Checking size of /dev/sda9
Sep 12 17:50:33.733983 extend-filesystems[1885]: Resized partition /dev/sda9
Sep 12 17:50:33.767916 extend-filesystems[1896]: resize2fs 1.47.2 (1-Jan-2025)
Sep 12 17:50:33.799975 kernel: i915 0000:00:02.0: [drm] [ENCODER:98:DDI A/PHY A] failed to retrieve link info, disabling eDP
Sep 12 17:50:33.800143 kernel: [drm] Initialized i915 1.6.0 for 0000:00:02.0 on minor 0
Sep 12 17:50:33.753585 oslogin_cache_refresh[1886]: Refreshing passwd entry cache
Sep 12 17:50:33.734634 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 12 17:50:33.800430 google_oslogin_nss_cache[1886]: oslogin_cache_refresh[1886]: Refreshing passwd entry cache
Sep 12 17:50:33.749904 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 12 17:50:33.754540 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 12 17:50:33.768545 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 12 17:50:33.816728 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 12 17:50:33.827223 systemd[1]: Starting tcsd.service - TCG Core Services Daemon...
Sep 12 17:50:33.834800 kernel: bond0: (slave enp2s0f1np1): link status definitely up, 10000 Mbps full duplex
Sep 12 17:50:33.834855 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 12 17:50:33.835227 systemd[1]: Starting update-engine.service - Update Engine...
Sep 12 17:50:33.842569 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 12 17:50:33.850901 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 12 17:50:33.859824 update_engine[1916]: I20250912 17:50:33.859781 1916 main.cc:92] Flatcar Update Engine starting
Sep 12 17:50:33.861108 systemd-logind[1911]: Watching system buttons on /dev/input/event3 (Power Button)
Sep 12 17:50:33.861525 systemd-logind[1911]: Watching system buttons on /dev/input/event2 (Sleep Button)
Sep 12 17:50:33.861545 systemd-logind[1911]: Watching system buttons on /dev/input/event0 (HID 0557:2419)
Sep 12 17:50:33.861716 systemd-logind[1911]: New seat seat0.
Sep 12 17:50:33.862167 jq[1917]: true
Sep 12 17:50:33.870308 sshd_keygen[1914]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 12 17:50:33.871038 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 12 17:50:33.881428 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 17:50:33.891988 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 12 17:50:33.892101 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 12 17:50:33.892262 systemd[1]: motdgen.service: Deactivated successfully.
Sep 12 17:50:33.900923 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 12 17:50:33.911409 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 12 17:50:33.911521 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 12 17:50:33.922988 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 12 17:50:33.935997 (ntainerd)[1930]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 12 17:50:33.937196 jq[1929]: true
Sep 12 17:50:33.946417 tar[1928]: linux-amd64/helm
Sep 12 17:50:33.950204 systemd[1]: tcsd.service: Skipped due to 'exec-condition'.
Sep 12 17:50:33.950329 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped.
Sep 12 17:50:33.957568 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 12 17:50:33.970958 systemd[1]: issuegen.service: Deactivated successfully.
Sep 12 17:50:33.971084 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 12 17:50:33.981028 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 12 17:50:33.986036 dbus-daemon[1879]: [system] SELinux support is enabled
Sep 12 17:50:33.987943 update_engine[1916]: I20250912 17:50:33.987917 1916 update_check_scheduler.cc:74] Next update check in 6m18s
Sep 12 17:50:33.990889 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 12 17:50:33.993156 dbus-daemon[1879]: [system] Successfully activated service 'org.freedesktop.systemd1'
Sep 12 17:50:33.992546 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 12 17:50:33.992561 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 12 17:50:34.006349 bash[1960]: Updated "/home/core/.ssh/authorized_keys"
Sep 12 17:50:34.008913 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 12 17:50:34.008926 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 12 17:50:34.019159 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 12 17:50:34.030087 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 12 17:50:34.040756 systemd[1]: Started update-engine.service - Update Engine.
Sep 12 17:50:34.051025 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 12 17:50:34.061646 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1.
Sep 12 17:50:34.070955 systemd[1]: Reached target getty.target - Login Prompts.
Sep 12 17:50:34.079876 systemd[1]: Starting sshkeys.service...
Sep 12 17:50:34.097686 containerd[1930]: time="2025-09-12T17:50:34Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 12 17:50:34.098011 containerd[1930]: time="2025-09-12T17:50:34.097993624Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 12 17:50:34.102087 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 12 17:50:34.103407 containerd[1930]: time="2025-09-12T17:50:34.103364859Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.667µs"
Sep 12 17:50:34.103677 containerd[1930]: time="2025-09-12T17:50:34.103669420Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 12 17:50:34.103696 containerd[1930]: time="2025-09-12T17:50:34.103682355Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 12 17:50:34.103771 containerd[1930]: time="2025-09-12T17:50:34.103765336Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 12 17:50:34.103788 containerd[1930]: time="2025-09-12T17:50:34.103774969Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 12 17:50:34.103852 containerd[1930]: time="2025-09-12T17:50:34.103797990Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 17:50:34.103852 containerd[1930]: time="2025-09-12T17:50:34.103832222Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 17:50:34.103852 containerd[1930]: time="2025-09-12T17:50:34.103839340Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 17:50:34.103974 containerd[1930]: time="2025-09-12T17:50:34.103963374Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 17:50:34.103974 containerd[1930]: time="2025-09-12T17:50:34.103971741Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 17:50:34.104016 containerd[1930]: time="2025-09-12T17:50:34.103977812Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 17:50:34.104016 containerd[1930]: time="2025-09-12T17:50:34.103982503Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 12 17:50:34.104052 containerd[1930]: time="2025-09-12T17:50:34.104021708Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 12 17:50:34.104142 containerd[1930]: time="2025-09-12T17:50:34.104128566Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 17:50:34.104159 containerd[1930]: time="2025-09-12T17:50:34.104144397Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 17:50:34.104159 containerd[1930]: time="2025-09-12T17:50:34.104150363Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 12 17:50:34.104185 containerd[1930]: time="2025-09-12T17:50:34.104165639Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 12 17:50:34.104315 containerd[1930]: time="2025-09-12T17:50:34.104280429Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 12 17:50:34.104315 containerd[1930]: time="2025-09-12T17:50:34.104309633Z" level=info msg="metadata content store policy set" policy=shared
Sep 12 17:50:34.118814 containerd[1930]: time="2025-09-12T17:50:34.118794762Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 12 17:50:34.118853 containerd[1930]: time="2025-09-12T17:50:34.118823024Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 12 17:50:34.118853 containerd[1930]: time="2025-09-12T17:50:34.118836311Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 12 17:50:34.118853 containerd[1930]: time="2025-09-12T17:50:34.118844749Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 12 17:50:34.118907 containerd[1930]: time="2025-09-12T17:50:34.118852992Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 12 17:50:34.118907 containerd[1930]: time="2025-09-12T17:50:34.118859857Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 12 17:50:34.118907 containerd[1930]: time="2025-09-12T17:50:34.118867328Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 12 17:50:34.118907 containerd[1930]: time="2025-09-12T17:50:34.118873634Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 12 17:50:34.118907 containerd[1930]: time="2025-09-12T17:50:34.118879530Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 12 17:50:34.118907 containerd[1930]: time="2025-09-12T17:50:34.118886960Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 12 17:50:34.118907 containerd[1930]: time="2025-09-12T17:50:34.118892230Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 12 17:50:34.118907 containerd[1930]: time="2025-09-12T17:50:34.118899438Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 12 17:50:34.119012 containerd[1930]: time="2025-09-12T17:50:34.118960586Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 12 17:50:34.119012 containerd[1930]: time="2025-09-12T17:50:34.118972255Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 12 17:50:34.119012 containerd[1930]: time="2025-09-12T17:50:34.118980647Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 12 17:50:34.119012 containerd[1930]: time="2025-09-12T17:50:34.118986753Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 12 17:50:34.119012 containerd[1930]: time="2025-09-12T17:50:34.118992669Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 12 17:50:34.119012 containerd[1930]: time="2025-09-12T17:50:34.118998231Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 12 17:50:34.119012 containerd[1930]:
time="2025-09-12T17:50:34.119003870Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 17:50:34.119012 containerd[1930]: time="2025-09-12T17:50:34.119009028Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 17:50:34.119185 containerd[1930]: time="2025-09-12T17:50:34.119015051Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 17:50:34.119185 containerd[1930]: time="2025-09-12T17:50:34.119020829Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 17:50:34.119185 containerd[1930]: time="2025-09-12T17:50:34.119026329Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 17:50:34.119185 containerd[1930]: time="2025-09-12T17:50:34.119063849Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 17:50:34.119185 containerd[1930]: time="2025-09-12T17:50:34.119072141Z" level=info msg="Start snapshots syncer" Sep 12 17:50:34.119185 containerd[1930]: time="2025-09-12T17:50:34.119085836Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 17:50:34.119329 containerd[1930]: time="2025-09-12T17:50:34.119229162Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 17:50:34.119329 containerd[1930]: time="2025-09-12T17:50:34.119261372Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119295418Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119343555Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119355732Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119361512Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119367794Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119374697Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119381758Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119391318Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119403832Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119410327Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119416702Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119435516Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119444001Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:50:34.119454 containerd[1930]: time="2025-09-12T17:50:34.119448854Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119454079Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119458463Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119463666Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119474564Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119484737Z" level=info msg="runtime interface created" Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119487884Z" level=info msg="created NRI interface" Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119492821Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119498329Z" level=info msg="Connect containerd service" Sep 12 17:50:34.119698 containerd[1930]: time="2025-09-12T17:50:34.119512477Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:50:34.119882 
containerd[1930]: time="2025-09-12T17:50:34.119870118Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:50:34.122151 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 17:50:34.125584 tar[1928]: linux-amd64/LICENSE Sep 12 17:50:34.125584 tar[1928]: linux-amd64/README.md Sep 12 17:50:34.133587 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 17:50:34.157668 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:50:34.159168 locksmithd[1980]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:50:34.164486 coreos-metadata[1990]: Sep 12 17:50:34.164 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 12 17:50:34.178798 kernel: i915 0000:00:02.0: [drm] Cannot find any crtc or sizes Sep 12 17:50:34.219156 containerd[1930]: time="2025-09-12T17:50:34.219128455Z" level=info msg="Start subscribing containerd event" Sep 12 17:50:34.219227 containerd[1930]: time="2025-09-12T17:50:34.219176235Z" level=info msg="Start recovering state" Sep 12 17:50:34.219255 containerd[1930]: time="2025-09-12T17:50:34.219239956Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:50:34.219272 containerd[1930]: time="2025-09-12T17:50:34.219256574Z" level=info msg="Start event monitor" Sep 12 17:50:34.219272 containerd[1930]: time="2025-09-12T17:50:34.219269008Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:50:34.219303 containerd[1930]: time="2025-09-12T17:50:34.219276433Z" level=info msg="Start streaming server" Sep 12 17:50:34.219303 containerd[1930]: time="2025-09-12T17:50:34.219283633Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 12 17:50:34.219329 containerd[1930]: time="2025-09-12T17:50:34.219285246Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 17:50:34.219329 containerd[1930]: time="2025-09-12T17:50:34.219316558Z" level=info msg="runtime interface starting up..." Sep 12 17:50:34.219329 containerd[1930]: time="2025-09-12T17:50:34.219321582Z" level=info msg="starting plugins..." Sep 12 17:50:34.219366 containerd[1930]: time="2025-09-12T17:50:34.219332041Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 17:50:34.219417 containerd[1930]: time="2025-09-12T17:50:34.219409318Z" level=info msg="containerd successfully booted in 0.122006s" Sep 12 17:50:34.219467 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:50:34.284822 kernel: EXT4-fs (sda9): resized filesystem to 116605649 Sep 12 17:50:34.310892 extend-filesystems[1896]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 17:50:34.310892 extend-filesystems[1896]: old_desc_blocks = 1, new_desc_blocks = 56 Sep 12 17:50:34.310892 extend-filesystems[1896]: The filesystem on /dev/sda9 is now 116605649 (4k) blocks long. Sep 12 17:50:34.348938 extend-filesystems[1885]: Resized filesystem in /dev/sda9 Sep 12 17:50:34.311665 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:50:34.311817 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 17:50:34.377859 kernel: i915 0000:00:02.0: [drm] Cannot find any crtc or sizes Sep 12 17:50:35.032052 systemd-networkd[1838]: bond0: Gained IPv6LL Sep 12 17:50:35.033029 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection. Sep 12 17:50:35.096838 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection. Sep 12 17:50:35.097403 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection. 
Sep 12 17:50:35.100722 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 12 17:50:35.114519 systemd[1]: Reached target network-online.target - Network is Online.
Sep 12 17:50:35.128135 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:50:35.150712 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 12 17:50:35.170894 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 12 17:50:35.892781 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:50:35.903476 (kubelet)[2035]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:50:36.343206 kubelet[2035]: E0912 17:50:36.343068 2035 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:50:36.344185 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:50:36.344269 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:50:36.344445 systemd[1]: kubelet.service: Consumed 595ms CPU time, 269.6M memory peak.
Sep 12 17:50:36.723076 kernel: mlx5_core 0000:02:00.0: lag map: port 1:1 port 2:2
Sep 12 17:50:36.723422 kernel: mlx5_core 0000:02:00.0: shared_fdb:0 mode:queue_affinity
Sep 12 17:50:37.584304 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 12 17:50:37.594757 systemd[1]: Started sshd@0-139.178.94.149:22-139.178.89.65:47636.service - OpenSSH per-connection server daemon (139.178.89.65:47636).
Sep 12 17:50:37.673169 sshd[2056]: Accepted publickey for core from 139.178.89.65 port 47636 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU
Sep 12 17:50:37.674208 sshd-session[2056]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:50:37.681785 systemd-logind[1911]: New session 1 of user core.
Sep 12 17:50:37.682685 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 12 17:50:37.691724 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 12 17:50:37.696329 coreos-metadata[1990]: Sep 12 17:50:37.696 INFO Fetch successful
Sep 12 17:50:37.722809 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 12 17:50:37.734714 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 12 17:50:37.746335 unknown[1990]: wrote ssh authorized keys file for user: core
Sep 12 17:50:37.755521 (systemd)[2061]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 12 17:50:37.757229 systemd-logind[1911]: New session c1 of user core.
Sep 12 17:50:37.757856 google_oslogin_nss_cache[1886]: oslogin_cache_refresh[1886]: Failure getting users, quitting
Sep 12 17:50:37.757856 google_oslogin_nss_cache[1886]: oslogin_cache_refresh[1886]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 12 17:50:37.757798 oslogin_cache_refresh[1886]: Failure getting users, quitting
Sep 12 17:50:37.758084 google_oslogin_nss_cache[1886]: oslogin_cache_refresh[1886]: Refreshing group entry cache
Sep 12 17:50:37.757828 oslogin_cache_refresh[1886]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 12 17:50:37.757869 oslogin_cache_refresh[1886]: Refreshing group entry cache
Sep 12 17:50:37.758460 google_oslogin_nss_cache[1886]: oslogin_cache_refresh[1886]: Failure getting groups, quitting
Sep 12 17:50:37.758460 google_oslogin_nss_cache[1886]: oslogin_cache_refresh[1886]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 12 17:50:37.758433 oslogin_cache_refresh[1886]: Failure getting groups, quitting
Sep 12 17:50:37.758440 oslogin_cache_refresh[1886]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 12 17:50:37.759074 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 12 17:50:37.759217 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 12 17:50:37.766439 update-ssh-keys[2062]: Updated "/home/core/.ssh/authorized_keys"
Sep 12 17:50:37.768398 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Sep 12 17:50:37.779780 systemd[1]: Finished sshkeys.service.
Sep 12 17:50:37.860667 systemd[2061]: Queued start job for default target default.target.
Sep 12 17:50:37.878433 systemd[2061]: Created slice app.slice - User Application Slice.
Sep 12 17:50:37.878479 systemd[2061]: Reached target paths.target - Paths.
Sep 12 17:50:37.878514 systemd[2061]: Reached target timers.target - Timers.
Sep 12 17:50:37.879214 systemd[2061]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 12 17:50:37.885100 systemd[2061]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 12 17:50:37.885138 systemd[2061]: Reached target sockets.target - Sockets.
Sep 12 17:50:37.885170 systemd[2061]: Reached target basic.target - Basic System.
Sep 12 17:50:37.885208 systemd[2061]: Reached target default.target - Main User Target.
Sep 12 17:50:37.885232 systemd[2061]: Startup finished in 124ms.
Sep 12 17:50:37.885237 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 12 17:50:37.894705 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 12 17:50:37.948421 coreos-metadata[1878]: Sep 12 17:50:37.948 INFO Fetch successful
Sep 12 17:50:37.966771 systemd[1]: Started sshd@1-139.178.94.149:22-139.178.89.65:47646.service - OpenSSH per-connection server daemon (139.178.89.65:47646).
Sep 12 17:50:37.999212 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 12 17:50:38.009491 systemd[1]: Starting packet-phone-home.service - Report Success to Packet...
Sep 12 17:50:38.021162 sshd[2077]: Accepted publickey for core from 139.178.89.65 port 47646 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU
Sep 12 17:50:38.021907 sshd-session[2077]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:50:38.024634 systemd-logind[1911]: New session 2 of user core.
Sep 12 17:50:38.025563 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 12 17:50:38.084938 sshd[2086]: Connection closed by 139.178.89.65 port 47646
Sep 12 17:50:38.085089 sshd-session[2077]: pam_unix(sshd:session): session closed for user core
Sep 12 17:50:38.102084 systemd[1]: sshd@1-139.178.94.149:22-139.178.89.65:47646.service: Deactivated successfully.
Sep 12 17:50:38.103022 systemd[1]: session-2.scope: Deactivated successfully.
Sep 12 17:50:38.103604 systemd-logind[1911]: Session 2 logged out. Waiting for processes to exit.
Sep 12 17:50:38.104742 systemd[1]: Started sshd@2-139.178.94.149:22-139.178.89.65:47660.service - OpenSSH per-connection server daemon (139.178.89.65:47660).
Sep 12 17:50:38.116003 systemd-logind[1911]: Removed session 2.
Sep 12 17:50:38.189074 sshd[2092]: Accepted publickey for core from 139.178.89.65 port 47660 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU
Sep 12 17:50:38.190477 sshd-session[2092]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:50:38.195981 systemd-logind[1911]: New session 3 of user core.
Sep 12 17:50:38.209024 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 12 17:50:38.283365 sshd[2096]: Connection closed by 139.178.89.65 port 47660
Sep 12 17:50:38.284206 sshd-session[2092]: pam_unix(sshd:session): session closed for user core
Sep 12 17:50:38.292984 systemd[1]: sshd@2-139.178.94.149:22-139.178.89.65:47660.service: Deactivated successfully.
Sep 12 17:50:38.297103 systemd[1]: session-3.scope: Deactivated successfully.
Sep 12 17:50:38.299548 systemd-logind[1911]: Session 3 logged out. Waiting for processes to exit.
Sep 12 17:50:38.302475 systemd-logind[1911]: Removed session 3.
Sep 12 17:50:38.508809 systemd[1]: Finished packet-phone-home.service - Report Success to Packet.
Sep 12 17:50:38.521741 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 12 17:50:38.531586 systemd[1]: Startup finished in 4.456s (kernel) + 25.334s (initrd) + 8.987s (userspace) = 38.778s.
Sep 12 17:50:38.552062 login[1978]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Sep 12 17:50:38.557003 systemd-logind[1911]: New session 4 of user core.
Sep 12 17:50:38.557911 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 12 17:50:38.562618 login[1977]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Sep 12 17:50:38.565199 systemd-logind[1911]: New session 5 of user core.
Sep 12 17:50:38.566048 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 12 17:50:40.566491 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection.
Sep 12 17:50:46.462761 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 12 17:50:46.466195 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:50:46.726073 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:50:46.728282 (kubelet)[2138]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:50:46.754749 kubelet[2138]: E0912 17:50:46.754695 2138 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:50:46.756783 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:50:46.756882 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:50:46.757061 systemd[1]: kubelet.service: Consumed 151ms CPU time, 116.5M memory peak.
Sep 12 17:50:48.310334 systemd[1]: Started sshd@3-139.178.94.149:22-139.178.89.65:56462.service - OpenSSH per-connection server daemon (139.178.89.65:56462).
Sep 12 17:50:48.352881 sshd[2158]: Accepted publickey for core from 139.178.89.65 port 56462 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU
Sep 12 17:50:48.353526 sshd-session[2158]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:50:48.356608 systemd-logind[1911]: New session 6 of user core.
Sep 12 17:50:48.378990 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 12 17:50:48.430936 sshd[2161]: Connection closed by 139.178.89.65 port 56462
Sep 12 17:50:48.431094 sshd-session[2158]: pam_unix(sshd:session): session closed for user core
Sep 12 17:50:48.450092 systemd[1]: sshd@3-139.178.94.149:22-139.178.89.65:56462.service: Deactivated successfully.
Sep 12 17:50:48.451011 systemd[1]: session-6.scope: Deactivated successfully.
Sep 12 17:50:48.451577 systemd-logind[1911]: Session 6 logged out. Waiting for processes to exit.
Sep 12 17:50:48.452708 systemd[1]: Started sshd@4-139.178.94.149:22-139.178.89.65:56476.service - OpenSSH per-connection server daemon (139.178.89.65:56476).
Sep 12 17:50:48.453447 systemd-logind[1911]: Removed session 6.
Sep 12 17:50:48.500971 sshd[2167]: Accepted publickey for core from 139.178.89.65 port 56476 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU
Sep 12 17:50:48.501868 sshd-session[2167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:50:48.505777 systemd-logind[1911]: New session 7 of user core.
Sep 12 17:50:48.518030 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 12 17:50:48.569565 sshd[2170]: Connection closed by 139.178.89.65 port 56476
Sep 12 17:50:48.570227 sshd-session[2167]: pam_unix(sshd:session): session closed for user core
Sep 12 17:50:48.594774 systemd[1]: sshd@4-139.178.94.149:22-139.178.89.65:56476.service: Deactivated successfully.
Sep 12 17:50:48.595547 systemd[1]: session-7.scope: Deactivated successfully.
Sep 12 17:50:48.596060 systemd-logind[1911]: Session 7 logged out. Waiting for processes to exit.
Sep 12 17:50:48.596974 systemd[1]: Started sshd@5-139.178.94.149:22-139.178.89.65:56480.service - OpenSSH per-connection server daemon (139.178.89.65:56480).
Sep 12 17:50:48.597572 systemd-logind[1911]: Removed session 7.
Sep 12 17:50:48.652518 sshd[2176]: Accepted publickey for core from 139.178.89.65 port 56480 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU
Sep 12 17:50:48.653262 sshd-session[2176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:50:48.656773 systemd-logind[1911]: New session 8 of user core.
Sep 12 17:50:48.672196 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 12 17:50:48.739665 sshd[2179]: Connection closed by 139.178.89.65 port 56480
Sep 12 17:50:48.740516 sshd-session[2176]: pam_unix(sshd:session): session closed for user core
Sep 12 17:50:48.762548 systemd[1]: sshd@5-139.178.94.149:22-139.178.89.65:56480.service: Deactivated successfully.
Sep 12 17:50:48.766220 systemd[1]: session-8.scope: Deactivated successfully.
Sep 12 17:50:48.768302 systemd-logind[1911]: Session 8 logged out. Waiting for processes to exit.
Sep 12 17:50:48.773763 systemd[1]: Started sshd@6-139.178.94.149:22-139.178.89.65:56494.service - OpenSSH per-connection server daemon (139.178.89.65:56494).
Sep 12 17:50:48.775591 systemd-logind[1911]: Removed session 8.
Sep 12 17:50:48.824404 sshd[2185]: Accepted publickey for core from 139.178.89.65 port 56494 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU
Sep 12 17:50:48.825033 sshd-session[2185]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:50:48.828224 systemd-logind[1911]: New session 9 of user core.
Sep 12 17:50:48.837045 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 12 17:50:48.896492 sudo[2189]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 12 17:50:48.896648 sudo[2189]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:50:48.914308 sudo[2189]: pam_unix(sudo:session): session closed for user root
Sep 12 17:50:48.915230 sshd[2188]: Connection closed by 139.178.89.65 port 56494
Sep 12 17:50:48.915460 sshd-session[2185]: pam_unix(sshd:session): session closed for user core
Sep 12 17:50:48.931980 systemd[1]: sshd@6-139.178.94.149:22-139.178.89.65:56494.service: Deactivated successfully.
Sep 12 17:50:48.933234 systemd[1]: session-9.scope: Deactivated successfully.
Sep 12 17:50:48.933949 systemd-logind[1911]: Session 9 logged out. Waiting for processes to exit.
Sep 12 17:50:48.935994 systemd[1]: Started sshd@7-139.178.94.149:22-139.178.89.65:56506.service - OpenSSH per-connection server daemon (139.178.89.65:56506).
Sep 12 17:50:48.936656 systemd-logind[1911]: Removed session 9.
Sep 12 17:50:48.983605 sshd[2195]: Accepted publickey for core from 139.178.89.65 port 56506 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU
Sep 12 17:50:48.984225 sshd-session[2195]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:50:48.987176 systemd-logind[1911]: New session 10 of user core.
Sep 12 17:50:48.997064 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 12 17:50:49.046553 sudo[2201]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 17:50:49.046904 sudo[2201]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:50:49.058252 sudo[2201]: pam_unix(sudo:session): session closed for user root
Sep 12 17:50:49.064336 sudo[2200]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 17:50:49.064675 sudo[2200]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:50:49.078082 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 17:50:49.106582 augenrules[2223]: No rules
Sep 12 17:50:49.107109 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 17:50:49.107289 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 17:50:49.107973 sudo[2200]: pam_unix(sudo:session): session closed for user root
Sep 12 17:50:49.108870 sshd[2199]: Connection closed by 139.178.89.65 port 56506
Sep 12 17:50:49.109130 sshd-session[2195]: pam_unix(sshd:session): session closed for user core
Sep 12 17:50:49.124727 systemd[1]: sshd@7-139.178.94.149:22-139.178.89.65:56506.service: Deactivated successfully.
Sep 12 17:50:49.126521 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 17:50:49.127622 systemd-logind[1911]: Session 10 logged out. Waiting for processes to exit. Sep 12 17:50:49.130508 systemd[1]: Started sshd@8-139.178.94.149:22-139.178.89.65:56508.service - OpenSSH per-connection server daemon (139.178.89.65:56508). Sep 12 17:50:49.131567 systemd-logind[1911]: Removed session 10. Sep 12 17:50:49.196944 sshd[2232]: Accepted publickey for core from 139.178.89.65 port 56508 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:50:49.197570 sshd-session[2232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:50:49.200298 systemd-logind[1911]: New session 11 of user core. Sep 12 17:50:49.215031 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 17:50:49.264533 sudo[2236]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 17:50:49.264918 sudo[2236]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:50:49.568703 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 17:50:49.581119 (dockerd)[2262]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 17:50:49.796133 dockerd[2262]: time="2025-09-12T17:50:49.796100366Z" level=info msg="Starting up" Sep 12 17:50:49.796545 dockerd[2262]: time="2025-09-12T17:50:49.796532754Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 17:50:49.802735 dockerd[2262]: time="2025-09-12T17:50:49.802685790Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 17:50:49.839418 dockerd[2262]: time="2025-09-12T17:50:49.839345653Z" level=info msg="Loading containers: start." 
Sep 12 17:50:49.849800 kernel: Initializing XFRM netlink socket Sep 12 17:50:49.974256 systemd-timesyncd[1840]: Network configuration changed, trying to establish connection. Sep 12 17:50:49.995503 systemd-networkd[1838]: docker0: Link UP Sep 12 17:50:49.997166 dockerd[2262]: time="2025-09-12T17:50:49.997122065Z" level=info msg="Loading containers: done." Sep 12 17:50:50.004993 dockerd[2262]: time="2025-09-12T17:50:50.004947976Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 17:50:50.004993 dockerd[2262]: time="2025-09-12T17:50:50.004988708Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 17:50:50.005082 dockerd[2262]: time="2025-09-12T17:50:50.005026699Z" level=info msg="Initializing buildkit" Sep 12 17:50:50.026654 dockerd[2262]: time="2025-09-12T17:50:50.026604834Z" level=info msg="Completed buildkit initialization" Sep 12 17:50:50.029782 dockerd[2262]: time="2025-09-12T17:50:50.029765768Z" level=info msg="Daemon has completed initialization" Sep 12 17:50:50.029828 dockerd[2262]: time="2025-09-12T17:50:50.029800915Z" level=info msg="API listen on /run/docker.sock" Sep 12 17:50:50.029883 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 17:50:20.347244 systemd-resolved[1839]: Clock change detected. Flushing caches. Sep 12 17:50:20.356487 systemd-journald[1450]: Time jumped backwards, rotating. Sep 12 17:50:20.347335 systemd-timesyncd[1840]: Contacted time server [2604:9a00:1:106:1c00:84ff:fe00:349]:123 (2.flatcar.pool.ntp.org). Sep 12 17:50:20.347363 systemd-timesyncd[1840]: Initial clock synchronization to Fri 2025-09-12 17:50:20.347177 UTC. 
Sep 12 17:50:21.064717 containerd[1930]: time="2025-09-12T17:50:21.064687884Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\""
Sep 12 17:50:21.618311 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3547681172.mount: Deactivated successfully.
Sep 12 17:50:22.377696 containerd[1930]: time="2025-09-12T17:50:22.377668671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:22.377907 containerd[1930]: time="2025-09-12T17:50:22.377808621Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117124"
Sep 12 17:50:22.378227 containerd[1930]: time="2025-09-12T17:50:22.378214354Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:22.379552 containerd[1930]: time="2025-09-12T17:50:22.379538000Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:22.380169 containerd[1930]: time="2025-09-12T17:50:22.380095848Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 1.315387448s"
Sep 12 17:50:22.380169 containerd[1930]: time="2025-09-12T17:50:22.380116276Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\""
Sep 12 17:50:22.380515 containerd[1930]: time="2025-09-12T17:50:22.380478430Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 12 17:50:23.382505 containerd[1930]: time="2025-09-12T17:50:23.382454762Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:23.382719 containerd[1930]: time="2025-09-12T17:50:23.382677792Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716632"
Sep 12 17:50:23.382957 containerd[1930]: time="2025-09-12T17:50:23.382946879Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:23.384543 containerd[1930]: time="2025-09-12T17:50:23.384505889Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:23.384975 containerd[1930]: time="2025-09-12T17:50:23.384933799Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 1.00441496s"
Sep 12 17:50:23.384975 containerd[1930]: time="2025-09-12T17:50:23.384954562Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\""
Sep 12 17:50:23.385194 containerd[1930]: time="2025-09-12T17:50:23.385181984Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 12 17:50:24.183873 containerd[1930]: time="2025-09-12T17:50:24.183821392Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:24.184048 containerd[1930]: time="2025-09-12T17:50:24.184032042Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787698"
Sep 12 17:50:24.184418 containerd[1930]: time="2025-09-12T17:50:24.184383305Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:24.185732 containerd[1930]: time="2025-09-12T17:50:24.185690544Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:50:24.186299 containerd[1930]: time="2025-09-12T17:50:24.186257409Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 801.060445ms"
Sep 12 17:50:24.186299 containerd[1930]: time="2025-09-12T17:50:24.186275274Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\""
Sep 12 17:50:24.186526 containerd[1930]: time="2025-09-12T17:50:24.186505304Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 12 17:50:25.052986 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount501670236.mount: Deactivated successfully.
Sep 12 17:50:25.247269 containerd[1930]: time="2025-09-12T17:50:25.247216436Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:25.247471 containerd[1930]: time="2025-09-12T17:50:25.247350971Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410252" Sep 12 17:50:25.247816 containerd[1930]: time="2025-09-12T17:50:25.247776243Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:25.248711 containerd[1930]: time="2025-09-12T17:50:25.248671432Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:25.248910 containerd[1930]: time="2025-09-12T17:50:25.248867830Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 1.062343912s" Sep 12 17:50:25.248910 containerd[1930]: time="2025-09-12T17:50:25.248885092Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\"" Sep 12 17:50:25.249176 containerd[1930]: time="2025-09-12T17:50:25.249144464Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 17:50:25.665403 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2437189908.mount: Deactivated successfully. 
Sep 12 17:50:26.205518 containerd[1930]: time="2025-09-12T17:50:26.205462331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:26.205669 containerd[1930]: time="2025-09-12T17:50:26.205634854Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 12 17:50:26.206116 containerd[1930]: time="2025-09-12T17:50:26.206069822Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:26.207515 containerd[1930]: time="2025-09-12T17:50:26.207475164Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:26.208507 containerd[1930]: time="2025-09-12T17:50:26.208473812Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 959.311122ms" Sep 12 17:50:26.208507 containerd[1930]: time="2025-09-12T17:50:26.208488071Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 12 17:50:26.208738 containerd[1930]: time="2025-09-12T17:50:26.208725794Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 17:50:26.721569 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2271854754.mount: Deactivated successfully. 
Sep 12 17:50:26.722881 containerd[1930]: time="2025-09-12T17:50:26.722865098Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:50:26.723078 containerd[1930]: time="2025-09-12T17:50:26.723067899Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 12 17:50:26.723463 containerd[1930]: time="2025-09-12T17:50:26.723428329Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:50:26.724406 containerd[1930]: time="2025-09-12T17:50:26.724366016Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:50:26.724779 containerd[1930]: time="2025-09-12T17:50:26.724743794Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 516.003369ms" Sep 12 17:50:26.724779 containerd[1930]: time="2025-09-12T17:50:26.724756744Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 17:50:26.725039 containerd[1930]: time="2025-09-12T17:50:26.725028586Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 12 17:50:27.048613 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Sep 12 17:50:27.049490 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:50:27.323092 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:50:27.325405 (kubelet)[2638]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:50:27.335999 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1030780086.mount: Deactivated successfully. Sep 12 17:50:27.347426 kubelet[2638]: E0912 17:50:27.347372 2638 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:50:27.348871 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:50:27.348960 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:50:27.349182 systemd[1]: kubelet.service: Consumed 111ms CPU time, 115M memory peak. 
Sep 12 17:50:28.416810 containerd[1930]: time="2025-09-12T17:50:28.416752318Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:28.417018 containerd[1930]: time="2025-09-12T17:50:28.416957574Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709" Sep 12 17:50:28.417332 containerd[1930]: time="2025-09-12T17:50:28.417291479Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:28.418774 containerd[1930]: time="2025-09-12T17:50:28.418734238Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:28.419349 containerd[1930]: time="2025-09-12T17:50:28.419316167Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 1.694272934s" Sep 12 17:50:28.419349 containerd[1930]: time="2025-09-12T17:50:28.419332070Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 12 17:50:30.190031 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:50:30.190177 systemd[1]: kubelet.service: Consumed 111ms CPU time, 115M memory peak. Sep 12 17:50:30.191450 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:50:30.204470 systemd[1]: Reload requested from client PID 2760 ('systemctl') (unit session-11.scope)... 
Sep 12 17:50:30.204478 systemd[1]: Reloading... Sep 12 17:50:30.250113 zram_generator::config[2806]: No configuration found. Sep 12 17:50:30.406616 systemd[1]: Reloading finished in 201 ms. Sep 12 17:50:30.433685 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 17:50:30.433733 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 12 17:50:30.433865 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:50:30.433891 systemd[1]: kubelet.service: Consumed 53ms CPU time, 97.3M memory peak. Sep 12 17:50:30.435134 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:50:30.722546 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:50:30.724948 (kubelet)[2871]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:50:30.746611 kubelet[2871]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:50:30.746611 kubelet[2871]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 17:50:30.746611 kubelet[2871]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 17:50:30.746816 kubelet[2871]: I0912 17:50:30.746624    2871 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 17:50:30.925745 kubelet[2871]: I0912 17:50:30.925685    2871 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 12 17:50:30.925745 kubelet[2871]: I0912 17:50:30.925712    2871 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 17:50:30.926022 kubelet[2871]: I0912 17:50:30.925976    2871 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 12 17:50:30.942992 kubelet[2871]: E0912 17:50:30.942946    2871 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.94.149:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.94.149:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:50:30.947346 kubelet[2871]: I0912 17:50:30.947302    2871 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 17:50:30.953342 kubelet[2871]: I0912 17:50:30.953306    2871 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 17:50:30.963368 kubelet[2871]: I0912 17:50:30.963332    2871 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 17:50:30.963893 kubelet[2871]: I0912 17:50:30.963857    2871 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 12 17:50:30.963974 kubelet[2871]: I0912 17:50:30.963933    2871 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 17:50:30.964034 kubelet[2871]: I0912 17:50:30.963944    2871 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.1.0-a-b1d4eb1a76","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 17:50:30.964093 kubelet[2871]: I0912 17:50:30.964040    2871 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 17:50:30.964093 kubelet[2871]: I0912 17:50:30.964045    2871 container_manager_linux.go:300] "Creating device plugin manager"
Sep 12 17:50:30.964126 kubelet[2871]: I0912 17:50:30.964103    2871 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:50:30.966396 kubelet[2871]: I0912 17:50:30.966359    2871 kubelet.go:408] "Attempting to sync node with API server"
Sep 12 17:50:30.966396 kubelet[2871]: I0912 17:50:30.966372    2871 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 17:50:30.966396 kubelet[2871]: I0912 17:50:30.966391    2871 kubelet.go:314] "Adding apiserver pod source"
Sep 12 17:50:30.966460 kubelet[2871]: I0912 17:50:30.966416    2871 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 17:50:30.968979 kubelet[2871]: I0912 17:50:30.968966    2871 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 17:50:30.969390 kubelet[2871]: I0912 17:50:30.969381    2871 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 17:50:30.969448 kubelet[2871]: W0912 17:50:30.969398    2871 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.94.149:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-b1d4eb1a76&limit=500&resourceVersion=0": dial tcp 139.178.94.149:6443: connect: connection refused
Sep 12 17:50:30.969493 kubelet[2871]: E0912 17:50:30.969481    2871 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.94.149:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-b1d4eb1a76&limit=500&resourceVersion=0\": dial tcp 139.178.94.149:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:50:30.969842 kubelet[2871]: W0912 17:50:30.969828    2871 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 17:50:30.970148 kubelet[2871]: W0912 17:50:30.970094    2871 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.94.149:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.94.149:6443: connect: connection refused
Sep 12 17:50:30.970193 kubelet[2871]: E0912 17:50:30.970152    2871 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.94.149:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.94.149:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:50:30.971652 kubelet[2871]: I0912 17:50:30.971644    2871 server.go:1274] "Started kubelet"
Sep 12 17:50:30.971770 kubelet[2871]: I0912 17:50:30.971753    2871 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 17:50:30.971808 kubelet[2871]: I0912 17:50:30.971745    2871 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 17:50:30.971962 kubelet[2871]: I0912 17:50:30.971947    2871 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 17:50:30.972420 kubelet[2871]: I0912 17:50:30.972408    2871 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 17:50:30.972456 kubelet[2871]: I0912 17:50:30.972420    2871 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 17:50:30.972483 kubelet[2871]: I0912 17:50:30.972459    2871 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 12 17:50:30.972483 kubelet[2871]: E0912 17:50:30.972472    2871 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-b1d4eb1a76\" not found"
Sep 12 17:50:30.972536 kubelet[2871]: I0912 17:50:30.972491    2871 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 12 17:50:30.972536 kubelet[2871]: I0912 17:50:30.972520    2871 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 17:50:30.972673 kubelet[2871]: E0912 17:50:30.972617    2871 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.149:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-b1d4eb1a76?timeout=10s\": dial tcp 139.178.94.149:6443: connect: connection refused" interval="200ms"
Sep 12 17:50:30.972673 kubelet[2871]: I0912 17:50:30.972648    2871 server.go:449] "Adding debug handlers to kubelet server"
Sep 12 17:50:30.972741 kubelet[2871]: W0912 17:50:30.972713    2871 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.94.149:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.94.149:6443: connect: connection refused
Sep 12 17:50:30.972772 kubelet[2871]: E0912 17:50:30.972741    2871 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.94.149:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.94.149:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:50:30.972772 kubelet[2871]: I0912 17:50:30.972767    2871 factory.go:221] Registration of the systemd container factory successfully
Sep 12 17:50:30.972830 kubelet[2871]: I0912 17:50:30.972816    2871 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 17:50:30.973304 kubelet[2871]: I0912 17:50:30.973294    2871 factory.go:221] Registration of the containerd container factory successfully
Sep 12 17:50:30.973345 kubelet[2871]: E0912 17:50:30.973315    2871 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 17:50:30.979357 kubelet[2871]: E0912 17:50:30.978333    2871 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.94.149:6443/api/v1/namespaces/default/events\": dial tcp 139.178.94.149:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.1.0-a-b1d4eb1a76.18649a51dcf8653f  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.1.0-a-b1d4eb1a76,UID:ci-4426.1.0-a-b1d4eb1a76,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.1.0-a-b1d4eb1a76,},FirstTimestamp:2025-09-12 17:50:30.971630911 +0000 UTC m=+0.244729091,LastTimestamp:2025-09-12 17:50:30.971630911 +0000 UTC m=+0.244729091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.1.0-a-b1d4eb1a76,}"
Sep 12 17:50:30.983049 kubelet[2871]: I0912 17:50:30.983038    2871 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 12 17:50:30.983049 kubelet[2871]: I0912 17:50:30.983047    2871 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 12 17:50:30.983124 kubelet[2871]: I0912 17:50:30.983058    2871 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:50:30.983265 kubelet[2871]: I0912 17:50:30.983249    2871 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 12 17:50:30.983840 kubelet[2871]: I0912 17:50:30.983833    2871 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 12 17:50:30.983862 kubelet[2871]: I0912 17:50:30.983844    2871 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 12 17:50:30.983862 kubelet[2871]: I0912 17:50:30.983854    2871 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 12 17:50:30.983892 kubelet[2871]: E0912 17:50:30.983884    2871 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 17:50:30.983976 kubelet[2871]: I0912 17:50:30.983969    2871 policy_none.go:49] "None policy: Start"
Sep 12 17:50:30.984091 kubelet[2871]: W0912 17:50:30.984078    2871 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.94.149:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.94.149:6443: connect: connection refused
Sep 12 17:50:30.984122 kubelet[2871]: E0912 17:50:30.984103    2871 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.94.149:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.94.149:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:50:30.984164 kubelet[2871]: I0912 17:50:30.984155    2871 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 12 17:50:30.984183 kubelet[2871]: I0912 17:50:30.984167    2871 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 17:50:30.987824 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 12 17:50:31.008022 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 12 17:50:31.010273 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 17:50:31.024981 kubelet[2871]: I0912 17:50:31.024898 2871 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:50:31.025332 kubelet[2871]: I0912 17:50:31.025262 2871 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:50:31.025467 kubelet[2871]: I0912 17:50:31.025291 2871 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:50:31.025844 kubelet[2871]: I0912 17:50:31.025763 2871 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:50:31.027332 kubelet[2871]: E0912 17:50:31.027279 2871 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426.1.0-a-b1d4eb1a76\" not found" Sep 12 17:50:31.107838 systemd[1]: Created slice kubepods-burstable-pod939647c5b053d3e85d1b85ab2b9446b0.slice - libcontainer container kubepods-burstable-pod939647c5b053d3e85d1b85ab2b9446b0.slice. Sep 12 17:50:31.128588 kubelet[2871]: I0912 17:50:31.128531 2871 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.129454 kubelet[2871]: E0912 17:50:31.129393 2871 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.94.149:6443/api/v1/nodes\": dial tcp 139.178.94.149:6443: connect: connection refused" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.131472 systemd[1]: Created slice kubepods-burstable-podae7a75ecf34eca2d7915effbddda25b7.slice - libcontainer container kubepods-burstable-podae7a75ecf34eca2d7915effbddda25b7.slice. Sep 12 17:50:31.158093 systemd[1]: Created slice kubepods-burstable-pod0828915ff7ea6e1d985afe08673e6721.slice - libcontainer container kubepods-burstable-pod0828915ff7ea6e1d985afe08673e6721.slice. 
Sep 12 17:50:31.174222 kubelet[2871]: E0912 17:50:31.174084 2871 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.149:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-b1d4eb1a76?timeout=10s\": dial tcp 139.178.94.149:6443: connect: connection refused" interval="400ms" Sep 12 17:50:31.274472 kubelet[2871]: I0912 17:50:31.274331 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-k8s-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.274472 kubelet[2871]: I0912 17:50:31.274421 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.274718 kubelet[2871]: I0912 17:50:31.274489 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0828915ff7ea6e1d985afe08673e6721-kubeconfig\") pod \"kube-scheduler-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"0828915ff7ea6e1d985afe08673e6721\") " pod="kube-system/kube-scheduler-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.274718 kubelet[2871]: I0912 17:50:31.274570 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/939647c5b053d3e85d1b85ab2b9446b0-k8s-certs\") pod \"kube-apiserver-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"939647c5b053d3e85d1b85ab2b9446b0\") 
" pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.274718 kubelet[2871]: I0912 17:50:31.274643 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-ca-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.274937 kubelet[2871]: I0912 17:50:31.274718 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.274937 kubelet[2871]: I0912 17:50:31.274792 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/939647c5b053d3e85d1b85ab2b9446b0-ca-certs\") pod \"kube-apiserver-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"939647c5b053d3e85d1b85ab2b9446b0\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.274937 kubelet[2871]: I0912 17:50:31.274897 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/939647c5b053d3e85d1b85ab2b9446b0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"939647c5b053d3e85d1b85ab2b9446b0\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.275283 kubelet[2871]: I0912 17:50:31.274977 2871 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-kubeconfig\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.333881 kubelet[2871]: I0912 17:50:31.333834 2871 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.334508 kubelet[2871]: E0912 17:50:31.334445 2871 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.94.149:6443/api/v1/nodes\": dial tcp 139.178.94.149:6443: connect: connection refused" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:31.427136 containerd[1930]: time="2025-09-12T17:50:31.427031205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.1.0-a-b1d4eb1a76,Uid:939647c5b053d3e85d1b85ab2b9446b0,Namespace:kube-system,Attempt:0,}" Sep 12 17:50:31.435946 containerd[1930]: time="2025-09-12T17:50:31.435926465Z" level=info msg="connecting to shim 0e4e2ccec013f657d1b7bc8aa23f2149b0a7c7e5e7ab7d6884d561423ec38f07" address="unix:///run/containerd/s/b5f4d78db4f9aca6d79f30e6c6b09519a6b9cbe54ec088b2069b461d4a04c983" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:50:31.451503 containerd[1930]: time="2025-09-12T17:50:31.451481099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76,Uid:ae7a75ecf34eca2d7915effbddda25b7,Namespace:kube-system,Attempt:0,}" Sep 12 17:50:31.456253 systemd[1]: Started cri-containerd-0e4e2ccec013f657d1b7bc8aa23f2149b0a7c7e5e7ab7d6884d561423ec38f07.scope - libcontainer container 0e4e2ccec013f657d1b7bc8aa23f2149b0a7c7e5e7ab7d6884d561423ec38f07. 
Sep 12 17:50:31.459652 containerd[1930]: time="2025-09-12T17:50:31.459627197Z" level=info msg="connecting to shim 107c807b9c7e3682213e53d6afb892b7e07f7d9413813864f2a41ba57757718b" address="unix:///run/containerd/s/0cdaaa695965a3557972924d031c51d33d836b48482c9ab0b8f274750eace052" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:50:31.465157 containerd[1930]: time="2025-09-12T17:50:31.465134990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.1.0-a-b1d4eb1a76,Uid:0828915ff7ea6e1d985afe08673e6721,Namespace:kube-system,Attempt:0,}" Sep 12 17:50:31.467664 systemd[1]: Started cri-containerd-107c807b9c7e3682213e53d6afb892b7e07f7d9413813864f2a41ba57757718b.scope - libcontainer container 107c807b9c7e3682213e53d6afb892b7e07f7d9413813864f2a41ba57757718b. Sep 12 17:50:31.471795 containerd[1930]: time="2025-09-12T17:50:31.471770670Z" level=info msg="connecting to shim 62609e6cf098b31c5ef0c68cb3e7df041b552464f0a891a41398c5289a1a5f58" address="unix:///run/containerd/s/870c0ac2157fcc56553297e81950b77d87cfd7cf3e43ef06048d46c362f37d45" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:50:31.480758 systemd[1]: Started cri-containerd-62609e6cf098b31c5ef0c68cb3e7df041b552464f0a891a41398c5289a1a5f58.scope - libcontainer container 62609e6cf098b31c5ef0c68cb3e7df041b552464f0a891a41398c5289a1a5f58. 
Sep 12 17:50:31.496816 containerd[1930]: time="2025-09-12T17:50:31.496773678Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.1.0-a-b1d4eb1a76,Uid:939647c5b053d3e85d1b85ab2b9446b0,Namespace:kube-system,Attempt:0,} returns sandbox id \"0e4e2ccec013f657d1b7bc8aa23f2149b0a7c7e5e7ab7d6884d561423ec38f07\"" Sep 12 17:50:31.497854 containerd[1930]: time="2025-09-12T17:50:31.497841643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76,Uid:ae7a75ecf34eca2d7915effbddda25b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"107c807b9c7e3682213e53d6afb892b7e07f7d9413813864f2a41ba57757718b\"" Sep 12 17:50:31.498978 containerd[1930]: time="2025-09-12T17:50:31.498958026Z" level=info msg="CreateContainer within sandbox \"0e4e2ccec013f657d1b7bc8aa23f2149b0a7c7e5e7ab7d6884d561423ec38f07\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:50:31.499290 containerd[1930]: time="2025-09-12T17:50:31.499275589Z" level=info msg="CreateContainer within sandbox \"107c807b9c7e3682213e53d6afb892b7e07f7d9413813864f2a41ba57757718b\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:50:31.502159 containerd[1930]: time="2025-09-12T17:50:31.502128192Z" level=info msg="Container 1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:50:31.502865 containerd[1930]: time="2025-09-12T17:50:31.502851505Z" level=info msg="Container 87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:50:31.506364 containerd[1930]: time="2025-09-12T17:50:31.506343206Z" level=info msg="CreateContainer within sandbox \"0e4e2ccec013f657d1b7bc8aa23f2149b0a7c7e5e7ab7d6884d561423ec38f07\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f\"" Sep 12 
17:50:31.506800 containerd[1930]: time="2025-09-12T17:50:31.506783548Z" level=info msg="StartContainer for \"1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f\"" Sep 12 17:50:31.506838 containerd[1930]: time="2025-09-12T17:50:31.506799226Z" level=info msg="CreateContainer within sandbox \"107c807b9c7e3682213e53d6afb892b7e07f7d9413813864f2a41ba57757718b\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449\"" Sep 12 17:50:31.507013 containerd[1930]: time="2025-09-12T17:50:31.507003039Z" level=info msg="StartContainer for \"87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449\"" Sep 12 17:50:31.507435 containerd[1930]: time="2025-09-12T17:50:31.507424315Z" level=info msg="connecting to shim 1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f" address="unix:///run/containerd/s/b5f4d78db4f9aca6d79f30e6c6b09519a6b9cbe54ec088b2069b461d4a04c983" protocol=ttrpc version=3 Sep 12 17:50:31.507638 containerd[1930]: time="2025-09-12T17:50:31.507625637Z" level=info msg="connecting to shim 87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449" address="unix:///run/containerd/s/0cdaaa695965a3557972924d031c51d33d836b48482c9ab0b8f274750eace052" protocol=ttrpc version=3 Sep 12 17:50:31.507856 containerd[1930]: time="2025-09-12T17:50:31.507840916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.1.0-a-b1d4eb1a76,Uid:0828915ff7ea6e1d985afe08673e6721,Namespace:kube-system,Attempt:0,} returns sandbox id \"62609e6cf098b31c5ef0c68cb3e7df041b552464f0a891a41398c5289a1a5f58\"" Sep 12 17:50:31.508734 containerd[1930]: time="2025-09-12T17:50:31.508721962Z" level=info msg="CreateContainer within sandbox \"62609e6cf098b31c5ef0c68cb3e7df041b552464f0a891a41398c5289a1a5f58\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:50:31.511625 containerd[1930]: time="2025-09-12T17:50:31.511607802Z" 
level=info msg="Container c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:50:31.514224 containerd[1930]: time="2025-09-12T17:50:31.514185516Z" level=info msg="CreateContainer within sandbox \"62609e6cf098b31c5ef0c68cb3e7df041b552464f0a891a41398c5289a1a5f58\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1\"" Sep 12 17:50:31.514617 containerd[1930]: time="2025-09-12T17:50:31.514582078Z" level=info msg="StartContainer for \"c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1\"" Sep 12 17:50:31.515147 containerd[1930]: time="2025-09-12T17:50:31.515109708Z" level=info msg="connecting to shim c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1" address="unix:///run/containerd/s/870c0ac2157fcc56553297e81950b77d87cfd7cf3e43ef06048d46c362f37d45" protocol=ttrpc version=3 Sep 12 17:50:31.529426 systemd[1]: Started cri-containerd-1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f.scope - libcontainer container 1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f. Sep 12 17:50:31.530129 systemd[1]: Started cri-containerd-87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449.scope - libcontainer container 87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449. Sep 12 17:50:31.532117 systemd[1]: Started cri-containerd-c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1.scope - libcontainer container c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1. 
Sep 12 17:50:31.558400 containerd[1930]: time="2025-09-12T17:50:31.558378731Z" level=info msg="StartContainer for \"87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449\" returns successfully" Sep 12 17:50:31.558476 containerd[1930]: time="2025-09-12T17:50:31.558458397Z" level=info msg="StartContainer for \"1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f\" returns successfully" Sep 12 17:50:31.560687 containerd[1930]: time="2025-09-12T17:50:31.560666441Z" level=info msg="StartContainer for \"c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1\" returns successfully" Sep 12 17:50:31.575388 kubelet[2871]: E0912 17:50:31.575357 2871 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.149:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-b1d4eb1a76?timeout=10s\": dial tcp 139.178.94.149:6443: connect: connection refused" interval="800ms" Sep 12 17:50:31.736218 kubelet[2871]: I0912 17:50:31.736173 2871 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:32.381641 kubelet[2871]: E0912 17:50:32.381578 2871 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426.1.0-a-b1d4eb1a76\" not found" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:32.391314 kubelet[2871]: I0912 17:50:32.391249 2871 kubelet_node_status.go:75] "Successfully registered node" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:32.968811 kubelet[2871]: I0912 17:50:32.968726 2871 apiserver.go:52] "Watching apiserver" Sep 12 17:50:32.972700 kubelet[2871]: I0912 17:50:32.972643 2871 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 17:50:32.999658 kubelet[2871]: E0912 17:50:32.999569 2871 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4426.1.0-a-b1d4eb1a76\" is forbidden: no PriorityClass with name system-node-critical 
was found" pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:34.678025 systemd[1]: Reload requested from client PID 3185 ('systemctl') (unit session-11.scope)... Sep 12 17:50:34.678033 systemd[1]: Reloading... Sep 12 17:50:34.732199 zram_generator::config[3230]: No configuration found. Sep 12 17:50:34.904639 systemd[1]: Reloading finished in 226 ms. Sep 12 17:50:34.921412 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:50:34.932781 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:50:34.932908 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:50:34.932933 systemd[1]: kubelet.service: Consumed 697ms CPU time, 141.3M memory peak. Sep 12 17:50:34.934220 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:50:35.240916 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:50:35.249503 (kubelet)[3295]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:50:35.290618 kubelet[3295]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:50:35.290618 kubelet[3295]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 17:50:35.290618 kubelet[3295]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 17:50:35.291015 kubelet[3295]: I0912 17:50:35.290691 3295 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:50:35.297351 kubelet[3295]: I0912 17:50:35.297300 3295 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 17:50:35.297351 kubelet[3295]: I0912 17:50:35.297321 3295 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:50:35.297625 kubelet[3295]: I0912 17:50:35.297583 3295 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 17:50:35.299124 kubelet[3295]: I0912 17:50:35.299072 3295 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 17:50:35.302343 kubelet[3295]: I0912 17:50:35.302322 3295 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:50:35.305659 kubelet[3295]: I0912 17:50:35.305637 3295 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:50:35.317517 kubelet[3295]: I0912 17:50:35.317492 3295 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:50:35.317646 kubelet[3295]: I0912 17:50:35.317625 3295 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 17:50:35.317827 kubelet[3295]: I0912 17:50:35.317793 3295 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:50:35.318059 kubelet[3295]: I0912 17:50:35.317827 3295 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.1.0-a-b1d4eb1a76","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:50:35.318246 kubelet[3295]: I0912 17:50:35.318072 3295 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:50:35.318246 kubelet[3295]: I0912 17:50:35.318093 3295 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 17:50:35.318246 kubelet[3295]: I0912 17:50:35.318141 3295 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:50:35.318407 kubelet[3295]: I0912 17:50:35.318289 3295 kubelet.go:408] "Attempting to sync node with API server" Sep 12 17:50:35.318407 kubelet[3295]: I0912 17:50:35.318310 3295 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:50:35.318407 kubelet[3295]: I0912 17:50:35.318393 3295 kubelet.go:314] "Adding apiserver pod source" Sep 12 17:50:35.318575 kubelet[3295]: I0912 17:50:35.318414 3295 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:50:35.320486 kubelet[3295]: I0912 17:50:35.320423 3295 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:50:35.321606 kubelet[3295]: I0912 17:50:35.321555 3295 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:50:35.322124 kubelet[3295]: I0912 17:50:35.322075 3295 server.go:1274] "Started kubelet" Sep 12 17:50:35.322188 kubelet[3295]: I0912 17:50:35.322127 3295 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:50:35.322304 kubelet[3295]: I0912 17:50:35.322198 3295 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:50:35.322540 kubelet[3295]: I0912 17:50:35.322518 3295 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:50:35.323792 kubelet[3295]: I0912 17:50:35.323769 3295 server.go:449] "Adding debug handlers to kubelet server" Sep 12 
17:50:35.323792 kubelet[3295]: E0912 17:50:35.323773 3295 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:50:35.324018 kubelet[3295]: I0912 17:50:35.323999 3295 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:50:35.324082 kubelet[3295]: I0912 17:50:35.324012 3295 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:50:35.324155 kubelet[3295]: I0912 17:50:35.324096 3295 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 17:50:35.324233 kubelet[3295]: E0912 17:50:35.324125 3295 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-b1d4eb1a76\" not found" Sep 12 17:50:35.324565 kubelet[3295]: I0912 17:50:35.324539 3295 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:50:35.324901 kubelet[3295]: I0912 17:50:35.324870 3295 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 17:50:35.325988 kubelet[3295]: I0912 17:50:35.325961 3295 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:50:35.326183 kubelet[3295]: I0912 17:50:35.326152 3295 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:50:35.328373 kubelet[3295]: I0912 17:50:35.328345 3295 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:50:35.336123 kubelet[3295]: I0912 17:50:35.336062 3295 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:50:35.337384 kubelet[3295]: I0912 17:50:35.337357 3295 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 17:50:35.337503 kubelet[3295]: I0912 17:50:35.337389 3295 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 17:50:35.337503 kubelet[3295]: I0912 17:50:35.337417 3295 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 17:50:35.337503 kubelet[3295]: E0912 17:50:35.337488 3295 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:50:35.357776 kubelet[3295]: I0912 17:50:35.357721 3295 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 17:50:35.357776 kubelet[3295]: I0912 17:50:35.357743 3295 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 17:50:35.357776 kubelet[3295]: I0912 17:50:35.357760 3295 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:50:35.357936 kubelet[3295]: I0912 17:50:35.357896 3295 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 17:50:35.357936 kubelet[3295]: I0912 17:50:35.357907 3295 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 17:50:35.357936 kubelet[3295]: I0912 17:50:35.357927 3295 policy_none.go:49] "None policy: Start" Sep 12 17:50:35.358447 kubelet[3295]: I0912 17:50:35.358435 3295 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 17:50:35.358488 kubelet[3295]: I0912 17:50:35.358466 3295 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:50:35.358600 kubelet[3295]: I0912 17:50:35.358591 3295 state_mem.go:75] "Updated machine memory state" Sep 12 17:50:35.362241 kubelet[3295]: I0912 17:50:35.362194 3295 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:50:35.362408 kubelet[3295]: I0912 17:50:35.362390 3295 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:50:35.362479 kubelet[3295]: I0912 17:50:35.362406 3295 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:50:35.362597 kubelet[3295]: I0912 17:50:35.362578 3295 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:50:35.446343 kubelet[3295]: W0912 17:50:35.446259 3295 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 17:50:35.446815 kubelet[3295]: W0912 17:50:35.446768 3295 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 17:50:35.447056 kubelet[3295]: W0912 17:50:35.446918 3295 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 17:50:35.469204 kubelet[3295]: I0912 17:50:35.469137 3295 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.477679 kubelet[3295]: I0912 17:50:35.477626 3295 kubelet_node_status.go:111] "Node was previously registered" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.477868 kubelet[3295]: I0912 17:50:35.477793 3295 kubelet_node_status.go:75] "Successfully registered node" node="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626158 kubelet[3295]: I0912 17:50:35.626016 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-ca-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626367 kubelet[3295]: I0912 17:50:35.626194 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-kubeconfig\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626367 kubelet[3295]: I0912 17:50:35.626272 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0828915ff7ea6e1d985afe08673e6721-kubeconfig\") pod \"kube-scheduler-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"0828915ff7ea6e1d985afe08673e6721\") " pod="kube-system/kube-scheduler-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626367 kubelet[3295]: I0912 17:50:35.626322 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/939647c5b053d3e85d1b85ab2b9446b0-k8s-certs\") pod \"kube-apiserver-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"939647c5b053d3e85d1b85ab2b9446b0\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626643 kubelet[3295]: I0912 17:50:35.626378 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/939647c5b053d3e85d1b85ab2b9446b0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"939647c5b053d3e85d1b85ab2b9446b0\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626643 kubelet[3295]: I0912 17:50:35.626427 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626643 kubelet[3295]: I0912 
17:50:35.626469 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-k8s-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626643 kubelet[3295]: I0912 17:50:35.626530 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ae7a75ecf34eca2d7915effbddda25b7-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"ae7a75ecf34eca2d7915effbddda25b7\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:35.626643 kubelet[3295]: I0912 17:50:35.626573 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/939647c5b053d3e85d1b85ab2b9446b0-ca-certs\") pod \"kube-apiserver-ci-4426.1.0-a-b1d4eb1a76\" (UID: \"939647c5b053d3e85d1b85ab2b9446b0\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:50:36.319122 kubelet[3295]: I0912 17:50:36.319095 3295 apiserver.go:52] "Watching apiserver" Sep 12 17:50:36.325763 kubelet[3295]: I0912 17:50:36.325711 3295 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 17:50:36.348071 kubelet[3295]: W0912 17:50:36.348053 3295 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 17:50:36.348170 kubelet[3295]: E0912 17:50:36.348094 3295 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4426.1.0-a-b1d4eb1a76\" already exists" pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" Sep 12 
17:50:36.355655 kubelet[3295]: I0912 17:50:36.355614 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426.1.0-a-b1d4eb1a76" podStartSLOduration=1.3555892840000001 podStartE2EDuration="1.355589284s" podCreationTimestamp="2025-09-12 17:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:50:36.355575951 +0000 UTC m=+1.101428337" watchObservedRunningTime="2025-09-12 17:50:36.355589284 +0000 UTC m=+1.101441670" Sep 12 17:50:36.359513 kubelet[3295]: I0912 17:50:36.359481 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-b1d4eb1a76" podStartSLOduration=1.359468528 podStartE2EDuration="1.359468528s" podCreationTimestamp="2025-09-12 17:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:50:36.35934164 +0000 UTC m=+1.105194027" watchObservedRunningTime="2025-09-12 17:50:36.359468528 +0000 UTC m=+1.105320912" Sep 12 17:50:36.362795 kubelet[3295]: I0912 17:50:36.362772 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426.1.0-a-b1d4eb1a76" podStartSLOduration=1.36276302 podStartE2EDuration="1.36276302s" podCreationTimestamp="2025-09-12 17:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:50:36.362703857 +0000 UTC m=+1.108556243" watchObservedRunningTime="2025-09-12 17:50:36.36276302 +0000 UTC m=+1.108615407" Sep 12 17:50:41.040525 kubelet[3295]: I0912 17:50:41.040414 3295 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 17:50:41.041448 containerd[1930]: time="2025-09-12T17:50:41.041199840Z" level=info msg="No cni 
config template is specified, wait for other system components to drop the config." Sep 12 17:50:41.042187 kubelet[3295]: I0912 17:50:41.041615 3295 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 17:50:41.941550 systemd[1]: Created slice kubepods-besteffort-podb2135000_0f39_4ed7_9551_3067220961b8.slice - libcontainer container kubepods-besteffort-podb2135000_0f39_4ed7_9551_3067220961b8.slice. Sep 12 17:50:41.973391 kubelet[3295]: I0912 17:50:41.973302 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b2135000-0f39-4ed7-9551-3067220961b8-xtables-lock\") pod \"kube-proxy-jb7tl\" (UID: \"b2135000-0f39-4ed7-9551-3067220961b8\") " pod="kube-system/kube-proxy-jb7tl" Sep 12 17:50:41.973693 kubelet[3295]: I0912 17:50:41.973422 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzrgx\" (UniqueName: \"kubernetes.io/projected/b2135000-0f39-4ed7-9551-3067220961b8-kube-api-access-lzrgx\") pod \"kube-proxy-jb7tl\" (UID: \"b2135000-0f39-4ed7-9551-3067220961b8\") " pod="kube-system/kube-proxy-jb7tl" Sep 12 17:50:41.973693 kubelet[3295]: I0912 17:50:41.973515 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b2135000-0f39-4ed7-9551-3067220961b8-kube-proxy\") pod \"kube-proxy-jb7tl\" (UID: \"b2135000-0f39-4ed7-9551-3067220961b8\") " pod="kube-system/kube-proxy-jb7tl" Sep 12 17:50:41.973693 kubelet[3295]: I0912 17:50:41.973592 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2135000-0f39-4ed7-9551-3067220961b8-lib-modules\") pod \"kube-proxy-jb7tl\" (UID: \"b2135000-0f39-4ed7-9551-3067220961b8\") " pod="kube-system/kube-proxy-jb7tl" Sep 12 17:50:42.005367 
systemd[1]: Created slice kubepods-besteffort-pod82514f38_88a1_43e0_b818_33da62eb17ab.slice - libcontainer container kubepods-besteffort-pod82514f38_88a1_43e0_b818_33da62eb17ab.slice. Sep 12 17:50:42.074293 kubelet[3295]: I0912 17:50:42.074195 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b8c4\" (UniqueName: \"kubernetes.io/projected/82514f38-88a1-43e0-b818-33da62eb17ab-kube-api-access-9b8c4\") pod \"tigera-operator-58fc44c59b-qqqd8\" (UID: \"82514f38-88a1-43e0-b818-33da62eb17ab\") " pod="tigera-operator/tigera-operator-58fc44c59b-qqqd8" Sep 12 17:50:42.075010 kubelet[3295]: I0912 17:50:42.074309 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/82514f38-88a1-43e0-b818-33da62eb17ab-var-lib-calico\") pod \"tigera-operator-58fc44c59b-qqqd8\" (UID: \"82514f38-88a1-43e0-b818-33da62eb17ab\") " pod="tigera-operator/tigera-operator-58fc44c59b-qqqd8" Sep 12 17:50:42.259981 containerd[1930]: time="2025-09-12T17:50:42.259768826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jb7tl,Uid:b2135000-0f39-4ed7-9551-3067220961b8,Namespace:kube-system,Attempt:0,}" Sep 12 17:50:42.309559 containerd[1930]: time="2025-09-12T17:50:42.309508782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-qqqd8,Uid:82514f38-88a1-43e0-b818-33da62eb17ab,Namespace:tigera-operator,Attempt:0,}" Sep 12 17:50:42.331712 containerd[1930]: time="2025-09-12T17:50:42.331678488Z" level=info msg="connecting to shim 41ef0386d9aa1f1f6531ed4ba469108793d2709afd5a19709fd040b66d71a615" address="unix:///run/containerd/s/be505910d627c19cfa3fcfd71c0110ede9d8f0e9eaaba8872c4434ecb94a3fee" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:50:42.367605 systemd[1]: Started cri-containerd-41ef0386d9aa1f1f6531ed4ba469108793d2709afd5a19709fd040b66d71a615.scope - libcontainer container 
41ef0386d9aa1f1f6531ed4ba469108793d2709afd5a19709fd040b66d71a615. Sep 12 17:50:42.398334 containerd[1930]: time="2025-09-12T17:50:42.398298996Z" level=info msg="connecting to shim 98fd4102631751c4195df424c7ca03f7e514bbd7922d4e3a33bedd2fd6388890" address="unix:///run/containerd/s/cc5d45811999d00a619f12cda482e37630a6f75cce1c1ea73445c957172c66bd" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:50:42.404528 containerd[1930]: time="2025-09-12T17:50:42.404502702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jb7tl,Uid:b2135000-0f39-4ed7-9551-3067220961b8,Namespace:kube-system,Attempt:0,} returns sandbox id \"41ef0386d9aa1f1f6531ed4ba469108793d2709afd5a19709fd040b66d71a615\"" Sep 12 17:50:42.405727 containerd[1930]: time="2025-09-12T17:50:42.405712794Z" level=info msg="CreateContainer within sandbox \"41ef0386d9aa1f1f6531ed4ba469108793d2709afd5a19709fd040b66d71a615\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 17:50:42.421278 systemd[1]: Started cri-containerd-98fd4102631751c4195df424c7ca03f7e514bbd7922d4e3a33bedd2fd6388890.scope - libcontainer container 98fd4102631751c4195df424c7ca03f7e514bbd7922d4e3a33bedd2fd6388890. 
Sep 12 17:50:42.544177 containerd[1930]: time="2025-09-12T17:50:42.544059359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-qqqd8,Uid:82514f38-88a1-43e0-b818-33da62eb17ab,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"98fd4102631751c4195df424c7ca03f7e514bbd7922d4e3a33bedd2fd6388890\"" Sep 12 17:50:42.545307 containerd[1930]: time="2025-09-12T17:50:42.545250217Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 17:50:42.752196 containerd[1930]: time="2025-09-12T17:50:42.752136768Z" level=info msg="Container be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:50:42.839633 containerd[1930]: time="2025-09-12T17:50:42.839609449Z" level=info msg="CreateContainer within sandbox \"41ef0386d9aa1f1f6531ed4ba469108793d2709afd5a19709fd040b66d71a615\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b\"" Sep 12 17:50:42.839905 containerd[1930]: time="2025-09-12T17:50:42.839876785Z" level=info msg="StartContainer for \"be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b\"" Sep 12 17:50:42.840937 containerd[1930]: time="2025-09-12T17:50:42.840917974Z" level=info msg="connecting to shim be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b" address="unix:///run/containerd/s/be505910d627c19cfa3fcfd71c0110ede9d8f0e9eaaba8872c4434ecb94a3fee" protocol=ttrpc version=3 Sep 12 17:50:42.867419 systemd[1]: Started cri-containerd-be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b.scope - libcontainer container be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b. 
Sep 12 17:50:42.894321 containerd[1930]: time="2025-09-12T17:50:42.894270703Z" level=info msg="StartContainer for \"be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b\" returns successfully" Sep 12 17:50:43.894443 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3697798504.mount: Deactivated successfully. Sep 12 17:50:44.225948 containerd[1930]: time="2025-09-12T17:50:44.225889579Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:44.226161 containerd[1930]: time="2025-09-12T17:50:44.226109398Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 17:50:44.226480 containerd[1930]: time="2025-09-12T17:50:44.226466740Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:44.227391 containerd[1930]: time="2025-09-12T17:50:44.227380699Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:44.227772 containerd[1930]: time="2025-09-12T17:50:44.227760059Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.682482194s" Sep 12 17:50:44.227793 containerd[1930]: time="2025-09-12T17:50:44.227777457Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 17:50:44.228684 containerd[1930]: 
time="2025-09-12T17:50:44.228670525Z" level=info msg="CreateContainer within sandbox \"98fd4102631751c4195df424c7ca03f7e514bbd7922d4e3a33bedd2fd6388890\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 17:50:44.231449 containerd[1930]: time="2025-09-12T17:50:44.231437306Z" level=info msg="Container 3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:50:44.233729 containerd[1930]: time="2025-09-12T17:50:44.233716288Z" level=info msg="CreateContainer within sandbox \"98fd4102631751c4195df424c7ca03f7e514bbd7922d4e3a33bedd2fd6388890\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1\"" Sep 12 17:50:44.233918 containerd[1930]: time="2025-09-12T17:50:44.233906570Z" level=info msg="StartContainer for \"3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1\"" Sep 12 17:50:44.234301 containerd[1930]: time="2025-09-12T17:50:44.234287412Z" level=info msg="connecting to shim 3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1" address="unix:///run/containerd/s/cc5d45811999d00a619f12cda482e37630a6f75cce1c1ea73445c957172c66bd" protocol=ttrpc version=3 Sep 12 17:50:44.256247 systemd[1]: Started cri-containerd-3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1.scope - libcontainer container 3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1. 
Sep 12 17:50:44.269389 containerd[1930]: time="2025-09-12T17:50:44.269366825Z" level=info msg="StartContainer for \"3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1\" returns successfully" Sep 12 17:50:44.377500 kubelet[3295]: I0912 17:50:44.377449 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-qqqd8" podStartSLOduration=1.69416661 podStartE2EDuration="3.377433484s" podCreationTimestamp="2025-09-12 17:50:41 +0000 UTC" firstStartedPulling="2025-09-12 17:50:42.544892498 +0000 UTC m=+7.290744900" lastFinishedPulling="2025-09-12 17:50:44.228159388 +0000 UTC m=+8.974011774" observedRunningTime="2025-09-12 17:50:44.377423298 +0000 UTC m=+9.123275694" watchObservedRunningTime="2025-09-12 17:50:44.377433484 +0000 UTC m=+9.123285884" Sep 12 17:50:44.377871 kubelet[3295]: I0912 17:50:44.377633 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jb7tl" podStartSLOduration=3.377623514 podStartE2EDuration="3.377623514s" podCreationTimestamp="2025-09-12 17:50:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:50:43.38541097 +0000 UTC m=+8.131263417" watchObservedRunningTime="2025-09-12 17:50:44.377623514 +0000 UTC m=+9.123475908" Sep 12 17:50:48.681820 sudo[2236]: pam_unix(sudo:session): session closed for user root Sep 12 17:50:48.682699 sshd[2235]: Connection closed by 139.178.89.65 port 56508 Sep 12 17:50:48.682907 sshd-session[2232]: pam_unix(sshd:session): session closed for user core Sep 12 17:50:48.684905 systemd[1]: sshd@8-139.178.94.149:22-139.178.89.65:56508.service: Deactivated successfully. Sep 12 17:50:48.686142 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 17:50:48.686251 systemd[1]: session-11.scope: Consumed 3.148s CPU time, 228.2M memory peak. 
Sep 12 17:50:48.687818 systemd-logind[1911]: Session 11 logged out. Waiting for processes to exit. Sep 12 17:50:48.688542 systemd-logind[1911]: Removed session 11. Sep 12 17:50:48.840235 update_engine[1916]: I20250912 17:50:48.840188 1916 update_attempter.cc:509] Updating boot flags... Sep 12 17:50:50.946945 systemd[1]: Created slice kubepods-besteffort-pod5ec15cc5_4b88_4235_b999_4e6ae7a27e01.slice - libcontainer container kubepods-besteffort-pod5ec15cc5_4b88_4235_b999_4e6ae7a27e01.slice. Sep 12 17:50:51.033507 kubelet[3295]: I0912 17:50:51.033447 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqn7\" (UniqueName: \"kubernetes.io/projected/5ec15cc5-4b88-4235-b999-4e6ae7a27e01-kube-api-access-8wqn7\") pod \"calico-typha-55b64d7ccd-d6qpg\" (UID: \"5ec15cc5-4b88-4235-b999-4e6ae7a27e01\") " pod="calico-system/calico-typha-55b64d7ccd-d6qpg" Sep 12 17:50:51.034181 kubelet[3295]: I0912 17:50:51.033538 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5ec15cc5-4b88-4235-b999-4e6ae7a27e01-typha-certs\") pod \"calico-typha-55b64d7ccd-d6qpg\" (UID: \"5ec15cc5-4b88-4235-b999-4e6ae7a27e01\") " pod="calico-system/calico-typha-55b64d7ccd-d6qpg" Sep 12 17:50:51.034181 kubelet[3295]: I0912 17:50:51.033653 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ec15cc5-4b88-4235-b999-4e6ae7a27e01-tigera-ca-bundle\") pod \"calico-typha-55b64d7ccd-d6qpg\" (UID: \"5ec15cc5-4b88-4235-b999-4e6ae7a27e01\") " pod="calico-system/calico-typha-55b64d7ccd-d6qpg" Sep 12 17:50:51.251852 containerd[1930]: time="2025-09-12T17:50:51.251656877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55b64d7ccd-d6qpg,Uid:5ec15cc5-4b88-4235-b999-4e6ae7a27e01,Namespace:calico-system,Attempt:0,}" Sep 12 
17:50:51.259720 containerd[1930]: time="2025-09-12T17:50:51.259690853Z" level=info msg="connecting to shim 74e3efc4fd17c59edbf02c4a18dcefc12e018e96c1ad1f8e781c600d931feda7" address="unix:///run/containerd/s/0da8e918c768c3ca93a692ba931c772dbbbb6f6e7e21e587b6623390fb5782a9" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:50:51.278286 systemd[1]: Started cri-containerd-74e3efc4fd17c59edbf02c4a18dcefc12e018e96c1ad1f8e781c600d931feda7.scope - libcontainer container 74e3efc4fd17c59edbf02c4a18dcefc12e018e96c1ad1f8e781c600d931feda7. Sep 12 17:50:51.299184 systemd[1]: Created slice kubepods-besteffort-pod3c8b40e6_947f_4856_95d8_8287ff7bc293.slice - libcontainer container kubepods-besteffort-pod3c8b40e6_947f_4856_95d8_8287ff7bc293.slice. Sep 12 17:50:51.305092 containerd[1930]: time="2025-09-12T17:50:51.305069397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55b64d7ccd-d6qpg,Uid:5ec15cc5-4b88-4235-b999-4e6ae7a27e01,Namespace:calico-system,Attempt:0,} returns sandbox id \"74e3efc4fd17c59edbf02c4a18dcefc12e018e96c1ad1f8e781c600d931feda7\"" Sep 12 17:50:51.305681 containerd[1930]: time="2025-09-12T17:50:51.305670137Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 17:50:51.335778 kubelet[3295]: I0912 17:50:51.335741 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfdqn\" (UniqueName: \"kubernetes.io/projected/3c8b40e6-947f-4856-95d8-8287ff7bc293-kube-api-access-sfdqn\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.335892 kubelet[3295]: I0912 17:50:51.335791 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-lib-modules\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " 
pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.335892 kubelet[3295]: I0912 17:50:51.335816 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-cni-bin-dir\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.335892 kubelet[3295]: I0912 17:50:51.335835 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-flexvol-driver-host\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.335892 kubelet[3295]: I0912 17:50:51.335857 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3c8b40e6-947f-4856-95d8-8287ff7bc293-node-certs\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.335892 kubelet[3295]: I0912 17:50:51.335878 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-cni-net-dir\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.336126 kubelet[3295]: I0912 17:50:51.335897 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-xtables-lock\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.336126 kubelet[3295]: I0912 
17:50:51.335915 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-var-lib-calico\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.336126 kubelet[3295]: I0912 17:50:51.335932 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-var-run-calico\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.336126 kubelet[3295]: I0912 17:50:51.335950 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-policysync\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.336126 kubelet[3295]: I0912 17:50:51.336041 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c8b40e6-947f-4856-95d8-8287ff7bc293-tigera-ca-bundle\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.336325 kubelet[3295]: I0912 17:50:51.336132 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3c8b40e6-947f-4856-95d8-8287ff7bc293-cni-log-dir\") pod \"calico-node-sd9ss\" (UID: \"3c8b40e6-947f-4856-95d8-8287ff7bc293\") " pod="calico-system/calico-node-sd9ss" Sep 12 17:50:51.439981 kubelet[3295]: E0912 17:50:51.439926 3295 driver-call.go:262] Failed to unmarshal output for 
command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.439981 kubelet[3295]: W0912 17:50:51.439971 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.440352 kubelet[3295]: E0912 17:50:51.440014 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.445421 kubelet[3295]: E0912 17:50:51.445332 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.445421 kubelet[3295]: W0912 17:50:51.445372 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.445421 kubelet[3295]: E0912 17:50:51.445412 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.456637 kubelet[3295]: E0912 17:50:51.456586 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.456637 kubelet[3295]: W0912 17:50:51.456623 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.456926 kubelet[3295]: E0912 17:50:51.456661 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.602302 containerd[1930]: time="2025-09-12T17:50:51.602215173Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-sd9ss,Uid:3c8b40e6-947f-4856-95d8-8287ff7bc293,Namespace:calico-system,Attempt:0,}" Sep 12 17:50:51.608656 kubelet[3295]: E0912 17:50:51.608629 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vfbh" podUID="86dbef57-8f65-4cc3-ae59-c02f1dffe403" Sep 12 17:50:51.613200 containerd[1930]: time="2025-09-12T17:50:51.613147877Z" level=info msg="connecting to shim 509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d" address="unix:///run/containerd/s/db1125c6d0ef3ce4baaae9e0498a00bb4cbc17e67b36ff50c2dad96660e534fd" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:50:51.634230 kubelet[3295]: E0912 17:50:51.634196 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.634230 kubelet[3295]: W0912 17:50:51.634206 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.634230 kubelet[3295]: E0912 17:50:51.634218 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.634399 kubelet[3295]: E0912 17:50:51.634367 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.634399 kubelet[3295]: W0912 17:50:51.634374 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.634399 kubelet[3295]: E0912 17:50:51.634380 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.634551 kubelet[3295]: E0912 17:50:51.634542 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.634551 kubelet[3295]: W0912 17:50:51.634549 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.634589 kubelet[3295]: E0912 17:50:51.634556 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.634717 kubelet[3295]: E0912 17:50:51.634685 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.634717 kubelet[3295]: W0912 17:50:51.634691 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.634717 kubelet[3295]: E0912 17:50:51.634698 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.634849 kubelet[3295]: E0912 17:50:51.634818 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.634849 kubelet[3295]: W0912 17:50:51.634824 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.634849 kubelet[3295]: E0912 17:50:51.634830 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.634917 kubelet[3295]: E0912 17:50:51.634912 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.634935 kubelet[3295]: W0912 17:50:51.634917 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.634935 kubelet[3295]: E0912 17:50:51.634922 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.635004 kubelet[3295]: E0912 17:50:51.634999 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635022 kubelet[3295]: W0912 17:50:51.635004 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635022 kubelet[3295]: E0912 17:50:51.635009 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.635093 kubelet[3295]: E0912 17:50:51.635088 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635130 kubelet[3295]: W0912 17:50:51.635094 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635130 kubelet[3295]: E0912 17:50:51.635106 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.635228 kubelet[3295]: E0912 17:50:51.635222 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635228 kubelet[3295]: W0912 17:50:51.635227 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635261 kubelet[3295]: E0912 17:50:51.635232 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.635269 systemd[1]: Started cri-containerd-509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d.scope - libcontainer container 509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d. 
Sep 12 17:50:51.635314 kubelet[3295]: E0912 17:50:51.635305 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635314 kubelet[3295]: W0912 17:50:51.635309 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635348 kubelet[3295]: E0912 17:50:51.635314 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.635394 kubelet[3295]: E0912 17:50:51.635389 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635394 kubelet[3295]: W0912 17:50:51.635393 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635430 kubelet[3295]: E0912 17:50:51.635398 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.635478 kubelet[3295]: E0912 17:50:51.635473 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635478 kubelet[3295]: W0912 17:50:51.635478 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635516 kubelet[3295]: E0912 17:50:51.635483 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.635562 kubelet[3295]: E0912 17:50:51.635558 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635583 kubelet[3295]: W0912 17:50:51.635562 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635583 kubelet[3295]: E0912 17:50:51.635567 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.635688 kubelet[3295]: E0912 17:50:51.635682 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635706 kubelet[3295]: W0912 17:50:51.635689 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635706 kubelet[3295]: E0912 17:50:51.635693 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.635774 kubelet[3295]: E0912 17:50:51.635769 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635795 kubelet[3295]: W0912 17:50:51.635774 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635795 kubelet[3295]: E0912 17:50:51.635778 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.635857 kubelet[3295]: E0912 17:50:51.635852 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635875 kubelet[3295]: W0912 17:50:51.635857 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635875 kubelet[3295]: E0912 17:50:51.635862 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.635947 kubelet[3295]: E0912 17:50:51.635942 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.635947 kubelet[3295]: W0912 17:50:51.635946 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.635981 kubelet[3295]: E0912 17:50:51.635951 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.636034 kubelet[3295]: E0912 17:50:51.636023 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.636034 kubelet[3295]: W0912 17:50:51.636030 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.636128 kubelet[3295]: E0912 17:50:51.636037 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.636171 kubelet[3295]: E0912 17:50:51.636136 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.636171 kubelet[3295]: W0912 17:50:51.636143 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.636171 kubelet[3295]: E0912 17:50:51.636151 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.636252 kubelet[3295]: E0912 17:50:51.636247 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.636282 kubelet[3295]: W0912 17:50:51.636254 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.636282 kubelet[3295]: E0912 17:50:51.636261 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.639447 kubelet[3295]: E0912 17:50:51.639438 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.639447 kubelet[3295]: W0912 17:50:51.639445 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.639507 kubelet[3295]: E0912 17:50:51.639453 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.639507 kubelet[3295]: I0912 17:50:51.639469 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86dbef57-8f65-4cc3-ae59-c02f1dffe403-socket-dir\") pod \"csi-node-driver-8vfbh\" (UID: \"86dbef57-8f65-4cc3-ae59-c02f1dffe403\") " pod="calico-system/csi-node-driver-8vfbh" Sep 12 17:50:51.639607 kubelet[3295]: E0912 17:50:51.639599 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.639607 kubelet[3295]: W0912 17:50:51.639606 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.639647 kubelet[3295]: E0912 17:50:51.639614 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.639647 kubelet[3295]: I0912 17:50:51.639623 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86dbef57-8f65-4cc3-ae59-c02f1dffe403-registration-dir\") pod \"csi-node-driver-8vfbh\" (UID: \"86dbef57-8f65-4cc3-ae59-c02f1dffe403\") " pod="calico-system/csi-node-driver-8vfbh" Sep 12 17:50:51.639762 kubelet[3295]: E0912 17:50:51.639753 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.639784 kubelet[3295]: W0912 17:50:51.639763 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.639784 kubelet[3295]: E0912 17:50:51.639774 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.639821 kubelet[3295]: I0912 17:50:51.639789 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/86dbef57-8f65-4cc3-ae59-c02f1dffe403-varrun\") pod \"csi-node-driver-8vfbh\" (UID: \"86dbef57-8f65-4cc3-ae59-c02f1dffe403\") " pod="calico-system/csi-node-driver-8vfbh" Sep 12 17:50:51.639905 kubelet[3295]: E0912 17:50:51.639898 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.639905 kubelet[3295]: W0912 17:50:51.639904 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.639943 kubelet[3295]: E0912 17:50:51.639912 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.639943 kubelet[3295]: I0912 17:50:51.639921 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phbqf\" (UniqueName: \"kubernetes.io/projected/86dbef57-8f65-4cc3-ae59-c02f1dffe403-kube-api-access-phbqf\") pod \"csi-node-driver-8vfbh\" (UID: \"86dbef57-8f65-4cc3-ae59-c02f1dffe403\") " pod="calico-system/csi-node-driver-8vfbh" Sep 12 17:50:51.640049 kubelet[3295]: E0912 17:50:51.640042 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.640049 kubelet[3295]: W0912 17:50:51.640048 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.640086 kubelet[3295]: E0912 17:50:51.640055 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.640086 kubelet[3295]: I0912 17:50:51.640065 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86dbef57-8f65-4cc3-ae59-c02f1dffe403-kubelet-dir\") pod \"csi-node-driver-8vfbh\" (UID: \"86dbef57-8f65-4cc3-ae59-c02f1dffe403\") " pod="calico-system/csi-node-driver-8vfbh" Sep 12 17:50:51.640221 kubelet[3295]: E0912 17:50:51.640213 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.640221 kubelet[3295]: W0912 17:50:51.640220 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.640267 kubelet[3295]: E0912 17:50:51.640228 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.640319 kubelet[3295]: E0912 17:50:51.640313 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.640340 kubelet[3295]: W0912 17:50:51.640318 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.640340 kubelet[3295]: E0912 17:50:51.640325 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.640432 kubelet[3295]: E0912 17:50:51.640426 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.640454 kubelet[3295]: W0912 17:50:51.640432 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.640454 kubelet[3295]: E0912 17:50:51.640438 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.640531 kubelet[3295]: E0912 17:50:51.640526 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.640550 kubelet[3295]: W0912 17:50:51.640531 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.640550 kubelet[3295]: E0912 17:50:51.640538 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.640636 kubelet[3295]: E0912 17:50:51.640631 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.640658 kubelet[3295]: W0912 17:50:51.640636 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.640658 kubelet[3295]: E0912 17:50:51.640643 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.640734 kubelet[3295]: E0912 17:50:51.640728 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.640734 kubelet[3295]: W0912 17:50:51.640733 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.640795 kubelet[3295]: E0912 17:50:51.640740 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.640887 kubelet[3295]: E0912 17:50:51.640876 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.640930 kubelet[3295]: W0912 17:50:51.640886 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.640930 kubelet[3295]: E0912 17:50:51.640900 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.641022 kubelet[3295]: E0912 17:50:51.641014 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.641022 kubelet[3295]: W0912 17:50:51.641021 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.641087 kubelet[3295]: E0912 17:50:51.641032 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.641167 kubelet[3295]: E0912 17:50:51.641158 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.641167 kubelet[3295]: W0912 17:50:51.641166 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.641244 kubelet[3295]: E0912 17:50:51.641175 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.641424 kubelet[3295]: E0912 17:50:51.641416 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.641424 kubelet[3295]: W0912 17:50:51.641422 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.641483 kubelet[3295]: E0912 17:50:51.641431 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.647395 containerd[1930]: time="2025-09-12T17:50:51.647347062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-sd9ss,Uid:3c8b40e6-947f-4856-95d8-8287ff7bc293,Namespace:calico-system,Attempt:0,} returns sandbox id \"509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d\"" Sep 12 17:50:51.740792 kubelet[3295]: E0912 17:50:51.740772 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.740792 kubelet[3295]: W0912 17:50:51.740787 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.740792 kubelet[3295]: E0912 17:50:51.740800 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.740973 kubelet[3295]: E0912 17:50:51.740966 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.740973 kubelet[3295]: W0912 17:50:51.740972 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741028 kubelet[3295]: E0912 17:50:51.740980 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.741082 kubelet[3295]: E0912 17:50:51.741076 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741082 kubelet[3295]: W0912 17:50:51.741081 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741141 kubelet[3295]: E0912 17:50:51.741086 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.741223 kubelet[3295]: E0912 17:50:51.741215 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741223 kubelet[3295]: W0912 17:50:51.741222 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741274 kubelet[3295]: E0912 17:50:51.741230 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.741346 kubelet[3295]: E0912 17:50:51.741340 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741346 kubelet[3295]: W0912 17:50:51.741345 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741377 kubelet[3295]: E0912 17:50:51.741351 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.741436 kubelet[3295]: E0912 17:50:51.741430 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741452 kubelet[3295]: W0912 17:50:51.741435 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741452 kubelet[3295]: E0912 17:50:51.741441 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.741548 kubelet[3295]: E0912 17:50:51.741543 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741548 kubelet[3295]: W0912 17:50:51.741548 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741579 kubelet[3295]: E0912 17:50:51.741554 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.741642 kubelet[3295]: E0912 17:50:51.741637 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741658 kubelet[3295]: W0912 17:50:51.741642 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741658 kubelet[3295]: E0912 17:50:51.741648 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.741744 kubelet[3295]: E0912 17:50:51.741737 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741762 kubelet[3295]: W0912 17:50:51.741746 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741762 kubelet[3295]: E0912 17:50:51.741755 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.741852 kubelet[3295]: E0912 17:50:51.741847 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741852 kubelet[3295]: W0912 17:50:51.741852 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741886 kubelet[3295]: E0912 17:50:51.741859 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.741968 kubelet[3295]: E0912 17:50:51.741963 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.741988 kubelet[3295]: W0912 17:50:51.741968 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.741988 kubelet[3295]: E0912 17:50:51.741974 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.742058 kubelet[3295]: E0912 17:50:51.742053 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742074 kubelet[3295]: W0912 17:50:51.742058 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742074 kubelet[3295]: E0912 17:50:51.742064 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.742161 kubelet[3295]: E0912 17:50:51.742155 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742179 kubelet[3295]: W0912 17:50:51.742161 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742179 kubelet[3295]: E0912 17:50:51.742167 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.742257 kubelet[3295]: E0912 17:50:51.742251 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742296 kubelet[3295]: W0912 17:50:51.742257 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742296 kubelet[3295]: E0912 17:50:51.742264 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.742345 kubelet[3295]: E0912 17:50:51.742340 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742367 kubelet[3295]: W0912 17:50:51.742346 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742367 kubelet[3295]: E0912 17:50:51.742359 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.742432 kubelet[3295]: E0912 17:50:51.742427 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742432 kubelet[3295]: W0912 17:50:51.742431 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742468 kubelet[3295]: E0912 17:50:51.742442 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.742515 kubelet[3295]: E0912 17:50:51.742509 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742515 kubelet[3295]: W0912 17:50:51.742514 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742550 kubelet[3295]: E0912 17:50:51.742521 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.742606 kubelet[3295]: E0912 17:50:51.742599 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742606 kubelet[3295]: W0912 17:50:51.742604 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742667 kubelet[3295]: E0912 17:50:51.742610 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.742701 kubelet[3295]: E0912 17:50:51.742693 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742701 kubelet[3295]: W0912 17:50:51.742699 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742749 kubelet[3295]: E0912 17:50:51.742706 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.742822 kubelet[3295]: E0912 17:50:51.742812 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742822 kubelet[3295]: W0912 17:50:51.742819 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742903 kubelet[3295]: E0912 17:50:51.742829 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.742957 kubelet[3295]: E0912 17:50:51.742950 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.742990 kubelet[3295]: W0912 17:50:51.742957 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.742990 kubelet[3295]: E0912 17:50:51.742966 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.743071 kubelet[3295]: E0912 17:50:51.743064 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.743071 kubelet[3295]: W0912 17:50:51.743070 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.743143 kubelet[3295]: E0912 17:50:51.743082 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.743185 kubelet[3295]: E0912 17:50:51.743178 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.743185 kubelet[3295]: W0912 17:50:51.743184 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.743242 kubelet[3295]: E0912 17:50:51.743194 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.743312 kubelet[3295]: E0912 17:50:51.743305 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.743312 kubelet[3295]: W0912 17:50:51.743311 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.743368 kubelet[3295]: E0912 17:50:51.743321 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:51.743474 kubelet[3295]: E0912 17:50:51.743467 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.743474 kubelet[3295]: W0912 17:50:51.743473 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.743511 kubelet[3295]: E0912 17:50:51.743479 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:51.747668 kubelet[3295]: E0912 17:50:51.747630 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:51.747668 kubelet[3295]: W0912 17:50:51.747638 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:51.747668 kubelet[3295]: E0912 17:50:51.747644 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:52.814211 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3563960093.mount: Deactivated successfully. 
Sep 12 17:50:53.338457 kubelet[3295]: E0912 17:50:53.338432 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vfbh" podUID="86dbef57-8f65-4cc3-ae59-c02f1dffe403" Sep 12 17:50:53.386883 containerd[1930]: time="2025-09-12T17:50:53.386834885Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:53.387112 containerd[1930]: time="2025-09-12T17:50:53.386973040Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 17:50:53.387419 containerd[1930]: time="2025-09-12T17:50:53.387367993Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:53.388187 containerd[1930]: time="2025-09-12T17:50:53.388175618Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:53.388576 containerd[1930]: time="2025-09-12T17:50:53.388561190Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.08287472s" Sep 12 17:50:53.388612 containerd[1930]: time="2025-09-12T17:50:53.388580245Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference 
\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 17:50:53.389062 containerd[1930]: time="2025-09-12T17:50:53.389051622Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 17:50:53.392209 containerd[1930]: time="2025-09-12T17:50:53.392167344Z" level=info msg="CreateContainer within sandbox \"74e3efc4fd17c59edbf02c4a18dcefc12e018e96c1ad1f8e781c600d931feda7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 17:50:53.394857 containerd[1930]: time="2025-09-12T17:50:53.394843738Z" level=info msg="Container 9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:50:53.397413 containerd[1930]: time="2025-09-12T17:50:53.397398059Z" level=info msg="CreateContainer within sandbox \"74e3efc4fd17c59edbf02c4a18dcefc12e018e96c1ad1f8e781c600d931feda7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78\"" Sep 12 17:50:53.397658 containerd[1930]: time="2025-09-12T17:50:53.397605028Z" level=info msg="StartContainer for \"9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78\"" Sep 12 17:50:53.398149 containerd[1930]: time="2025-09-12T17:50:53.398132390Z" level=info msg="connecting to shim 9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78" address="unix:///run/containerd/s/0da8e918c768c3ca93a692ba931c772dbbbb6f6e7e21e587b6623390fb5782a9" protocol=ttrpc version=3 Sep 12 17:50:53.412281 systemd[1]: Started cri-containerd-9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78.scope - libcontainer container 9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78. 
Sep 12 17:50:53.440063 containerd[1930]: time="2025-09-12T17:50:53.440036707Z" level=info msg="StartContainer for \"9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78\" returns successfully" Sep 12 17:50:54.416142 kubelet[3295]: I0912 17:50:54.415991 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-55b64d7ccd-d6qpg" podStartSLOduration=2.33249679 podStartE2EDuration="4.415949599s" podCreationTimestamp="2025-09-12 17:50:50 +0000 UTC" firstStartedPulling="2025-09-12 17:50:51.305549032 +0000 UTC m=+16.051401418" lastFinishedPulling="2025-09-12 17:50:53.38900184 +0000 UTC m=+18.134854227" observedRunningTime="2025-09-12 17:50:54.415483094 +0000 UTC m=+19.161335547" watchObservedRunningTime="2025-09-12 17:50:54.415949599 +0000 UTC m=+19.161802026" Sep 12 17:50:54.459203 kubelet[3295]: E0912 17:50:54.459092 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.459203 kubelet[3295]: W0912 17:50:54.459162 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.459203 kubelet[3295]: E0912 17:50:54.459202 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.459842 kubelet[3295]: E0912 17:50:54.459765 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.459842 kubelet[3295]: W0912 17:50:54.459800 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.459842 kubelet[3295]: E0912 17:50:54.459837 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.460484 kubelet[3295]: E0912 17:50:54.460404 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.460484 kubelet[3295]: W0912 17:50:54.460433 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.460484 kubelet[3295]: E0912 17:50:54.460462 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.460994 kubelet[3295]: E0912 17:50:54.460915 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.460994 kubelet[3295]: W0912 17:50:54.460943 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.460994 kubelet[3295]: E0912 17:50:54.460971 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.461527 kubelet[3295]: E0912 17:50:54.461496 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.461622 kubelet[3295]: W0912 17:50:54.461525 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.461622 kubelet[3295]: E0912 17:50:54.461553 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.461998 kubelet[3295]: E0912 17:50:54.461969 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.462093 kubelet[3295]: W0912 17:50:54.461996 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.462093 kubelet[3295]: E0912 17:50:54.462023 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.462503 kubelet[3295]: E0912 17:50:54.462476 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.462503 kubelet[3295]: W0912 17:50:54.462499 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.462674 kubelet[3295]: E0912 17:50:54.462523 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.462991 kubelet[3295]: E0912 17:50:54.462947 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.462991 kubelet[3295]: W0912 17:50:54.462969 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.462991 kubelet[3295]: E0912 17:50:54.462992 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.463530 kubelet[3295]: E0912 17:50:54.463476 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.463530 kubelet[3295]: W0912 17:50:54.463502 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.463702 kubelet[3295]: E0912 17:50:54.463544 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.464030 kubelet[3295]: E0912 17:50:54.463975 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.464030 kubelet[3295]: W0912 17:50:54.464002 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.464030 kubelet[3295]: E0912 17:50:54.464026 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.464549 kubelet[3295]: E0912 17:50:54.464489 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.464549 kubelet[3295]: W0912 17:50:54.464523 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.464718 kubelet[3295]: E0912 17:50:54.464560 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.465059 kubelet[3295]: E0912 17:50:54.465004 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.465059 kubelet[3295]: W0912 17:50:54.465033 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.465059 kubelet[3295]: E0912 17:50:54.465057 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.465622 kubelet[3295]: E0912 17:50:54.465563 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.465622 kubelet[3295]: W0912 17:50:54.465591 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.465622 kubelet[3295]: E0912 17:50:54.465615 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.466121 kubelet[3295]: E0912 17:50:54.466060 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.466121 kubelet[3295]: W0912 17:50:54.466083 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.466297 kubelet[3295]: E0912 17:50:54.466133 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.466632 kubelet[3295]: E0912 17:50:54.466572 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.466632 kubelet[3295]: W0912 17:50:54.466594 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.466632 kubelet[3295]: E0912 17:50:54.466615 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.467211 kubelet[3295]: E0912 17:50:54.467157 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.467211 kubelet[3295]: W0912 17:50:54.467182 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.467211 kubelet[3295]: E0912 17:50:54.467206 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.467756 kubelet[3295]: E0912 17:50:54.467702 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.467756 kubelet[3295]: W0912 17:50:54.467728 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.467964 kubelet[3295]: E0912 17:50:54.467765 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.468288 kubelet[3295]: E0912 17:50:54.468216 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.468288 kubelet[3295]: W0912 17:50:54.468238 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.468288 kubelet[3295]: E0912 17:50:54.468269 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.468867 kubelet[3295]: E0912 17:50:54.468787 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.468867 kubelet[3295]: W0912 17:50:54.468814 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.468867 kubelet[3295]: E0912 17:50:54.468846 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.469340 kubelet[3295]: E0912 17:50:54.469307 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.469340 kubelet[3295]: W0912 17:50:54.469333 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.469507 kubelet[3295]: E0912 17:50:54.469449 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.469827 kubelet[3295]: E0912 17:50:54.469760 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.469827 kubelet[3295]: W0912 17:50:54.469791 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.470018 kubelet[3295]: E0912 17:50:54.469891 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.470277 kubelet[3295]: E0912 17:50:54.470225 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.470277 kubelet[3295]: W0912 17:50:54.470249 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.470450 kubelet[3295]: E0912 17:50:54.470327 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.470719 kubelet[3295]: E0912 17:50:54.470669 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.470719 kubelet[3295]: W0912 17:50:54.470691 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.470913 kubelet[3295]: E0912 17:50:54.470722 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.471194 kubelet[3295]: E0912 17:50:54.471140 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.471194 kubelet[3295]: W0912 17:50:54.471165 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.471194 kubelet[3295]: E0912 17:50:54.471193 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.471693 kubelet[3295]: E0912 17:50:54.471647 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.471693 kubelet[3295]: W0912 17:50:54.471672 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.471881 kubelet[3295]: E0912 17:50:54.471702 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.472278 kubelet[3295]: E0912 17:50:54.472216 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.472278 kubelet[3295]: W0912 17:50:54.472247 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.472464 kubelet[3295]: E0912 17:50:54.472313 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.472704 kubelet[3295]: E0912 17:50:54.472650 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.472704 kubelet[3295]: W0912 17:50:54.472677 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.472908 kubelet[3295]: E0912 17:50:54.472760 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.473190 kubelet[3295]: E0912 17:50:54.473136 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.473190 kubelet[3295]: W0912 17:50:54.473162 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.473363 kubelet[3295]: E0912 17:50:54.473193 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.473886 kubelet[3295]: E0912 17:50:54.473824 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.473886 kubelet[3295]: W0912 17:50:54.473857 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.474081 kubelet[3295]: E0912 17:50:54.473900 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.474389 kubelet[3295]: E0912 17:50:54.474311 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.474389 kubelet[3295]: W0912 17:50:54.474340 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.474389 kubelet[3295]: E0912 17:50:54.474376 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.474973 kubelet[3295]: E0912 17:50:54.474910 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.474973 kubelet[3295]: W0912 17:50:54.474936 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.474973 kubelet[3295]: E0912 17:50:54.474969 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:54.475599 kubelet[3295]: E0912 17:50:54.475548 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.475599 kubelet[3295]: W0912 17:50:54.475575 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.475599 kubelet[3295]: E0912 17:50:54.475601 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:50:54.476145 kubelet[3295]: E0912 17:50:54.476115 3295 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:50:54.476145 kubelet[3295]: W0912 17:50:54.476126 3295 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:50:54.476145 kubelet[3295]: E0912 17:50:54.476137 3295 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:50:55.183164 containerd[1930]: time="2025-09-12T17:50:55.183124976Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:55.183438 containerd[1930]: time="2025-09-12T17:50:55.183360081Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 17:50:55.183605 containerd[1930]: time="2025-09-12T17:50:55.183592527Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:55.201339 containerd[1930]: time="2025-09-12T17:50:55.201262130Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:55.202151 containerd[1930]: time="2025-09-12T17:50:55.202105174Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.813024259s" Sep 12 17:50:55.202151 containerd[1930]: time="2025-09-12T17:50:55.202144659Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 17:50:55.203558 containerd[1930]: time="2025-09-12T17:50:55.203545172Z" level=info msg="CreateContainer within sandbox \"509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:50:55.208017 containerd[1930]: time="2025-09-12T17:50:55.207997499Z" level=info msg="Container ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:50:55.211796 containerd[1930]: time="2025-09-12T17:50:55.211782678Z" level=info msg="CreateContainer within sandbox \"509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a\"" Sep 12 17:50:55.212112 containerd[1930]: time="2025-09-12T17:50:55.212096304Z" level=info msg="StartContainer for \"ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a\"" Sep 12 17:50:55.212912 containerd[1930]: time="2025-09-12T17:50:55.212876532Z" level=info msg="connecting to shim ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a" address="unix:///run/containerd/s/db1125c6d0ef3ce4baaae9e0498a00bb4cbc17e67b36ff50c2dad96660e534fd" protocol=ttrpc version=3 Sep 12 17:50:55.233235 systemd[1]: Started cri-containerd-ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a.scope - libcontainer container ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a. Sep 12 17:50:55.256395 containerd[1930]: time="2025-09-12T17:50:55.256338364Z" level=info msg="StartContainer for \"ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a\" returns successfully" Sep 12 17:50:55.261500 systemd[1]: cri-containerd-ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a.scope: Deactivated successfully. 
Sep 12 17:50:55.262717 containerd[1930]: time="2025-09-12T17:50:55.262695388Z" level=info msg="received exit event container_id:\"ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a\" id:\"ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a\" pid:4114 exited_at:{seconds:1757699455 nanos:262417316}" Sep 12 17:50:55.262852 containerd[1930]: time="2025-09-12T17:50:55.262810382Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a\" id:\"ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a\" pid:4114 exited_at:{seconds:1757699455 nanos:262417316}" Sep 12 17:50:55.277185 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a-rootfs.mount: Deactivated successfully. Sep 12 17:50:55.339455 kubelet[3295]: E0912 17:50:55.339339 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vfbh" podUID="86dbef57-8f65-4cc3-ae59-c02f1dffe403" Sep 12 17:50:55.404483 kubelet[3295]: I0912 17:50:55.404430 3295 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:50:56.413025 containerd[1930]: time="2025-09-12T17:50:56.412950363Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:50:57.339069 kubelet[3295]: E0912 17:50:57.338968 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vfbh" podUID="86dbef57-8f65-4cc3-ae59-c02f1dffe403" Sep 12 17:50:58.715588 containerd[1930]: time="2025-09-12T17:50:58.715564894Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:58.715998 containerd[1930]: time="2025-09-12T17:50:58.715979524Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 17:50:58.716696 containerd[1930]: time="2025-09-12T17:50:58.716679264Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:58.717649 containerd[1930]: time="2025-09-12T17:50:58.717636456Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:50:58.718058 containerd[1930]: time="2025-09-12T17:50:58.718045660Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 2.305027894s" Sep 12 17:50:58.718083 containerd[1930]: time="2025-09-12T17:50:58.718062209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 17:50:58.718944 containerd[1930]: time="2025-09-12T17:50:58.718930274Z" level=info msg="CreateContainer within sandbox \"509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 17:50:58.722340 containerd[1930]: time="2025-09-12T17:50:58.722328089Z" level=info msg="Container a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6: CDI devices from CRI 
Config.CDIDevices: []" Sep 12 17:50:58.725992 containerd[1930]: time="2025-09-12T17:50:58.725949207Z" level=info msg="CreateContainer within sandbox \"509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6\"" Sep 12 17:50:58.726229 containerd[1930]: time="2025-09-12T17:50:58.726187949Z" level=info msg="StartContainer for \"a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6\"" Sep 12 17:50:58.726958 containerd[1930]: time="2025-09-12T17:50:58.726907034Z" level=info msg="connecting to shim a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6" address="unix:///run/containerd/s/db1125c6d0ef3ce4baaae9e0498a00bb4cbc17e67b36ff50c2dad96660e534fd" protocol=ttrpc version=3 Sep 12 17:50:58.746319 systemd[1]: Started cri-containerd-a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6.scope - libcontainer container a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6. Sep 12 17:50:58.766612 containerd[1930]: time="2025-09-12T17:50:58.766592044Z" level=info msg="StartContainer for \"a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6\" returns successfully" Sep 12 17:50:59.336151 systemd[1]: cri-containerd-a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6.scope: Deactivated successfully. Sep 12 17:50:59.336390 systemd[1]: cri-containerd-a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6.scope: Consumed 484ms CPU time, 195.7M memory peak, 171.3M written to disk. 
Sep 12 17:50:59.336859 containerd[1930]: time="2025-09-12T17:50:59.336838669Z" level=info msg="received exit event container_id:\"a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6\" id:\"a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6\" pid:4173 exited_at:{seconds:1757699459 nanos:336723036}" Sep 12 17:50:59.336945 containerd[1930]: time="2025-09-12T17:50:59.336925639Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6\" id:\"a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6\" pid:4173 exited_at:{seconds:1757699459 nanos:336723036}" Sep 12 17:50:59.338034 kubelet[3295]: E0912 17:50:59.338010 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vfbh" podUID="86dbef57-8f65-4cc3-ae59-c02f1dffe403" Sep 12 17:50:59.350933 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6-rootfs.mount: Deactivated successfully. Sep 12 17:50:59.361772 kubelet[3295]: I0912 17:50:59.361755 3295 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 12 17:50:59.382045 systemd[1]: Created slice kubepods-besteffort-pode36e7f2e_9196_4fee_b58b_1dd28a04759a.slice - libcontainer container kubepods-besteffort-pode36e7f2e_9196_4fee_b58b_1dd28a04759a.slice. Sep 12 17:50:59.387942 systemd[1]: Created slice kubepods-besteffort-pod184347ef_ae58_4681_aaab_683007b08cb9.slice - libcontainer container kubepods-besteffort-pod184347ef_ae58_4681_aaab_683007b08cb9.slice. 
Sep 12 17:50:59.392492 systemd[1]: Created slice kubepods-besteffort-pod95881986_b9ee_4892_8fba_1273c4688890.slice - libcontainer container kubepods-besteffort-pod95881986_b9ee_4892_8fba_1273c4688890.slice. Sep 12 17:50:59.397935 systemd[1]: Created slice kubepods-burstable-podce1692a5_2537_46fd_a646_e7a446075a37.slice - libcontainer container kubepods-burstable-podce1692a5_2537_46fd_a646_e7a446075a37.slice. Sep 12 17:50:59.402984 systemd[1]: Created slice kubepods-besteffort-poda38ef781_aaef_47e2_89f1_6f4af63895e3.slice - libcontainer container kubepods-besteffort-poda38ef781_aaef_47e2_89f1_6f4af63895e3.slice. Sep 12 17:50:59.407407 systemd[1]: Created slice kubepods-besteffort-podeb7395a3_5fae_4a36_a604_8f98178712b4.slice - libcontainer container kubepods-besteffort-podeb7395a3_5fae_4a36_a604_8f98178712b4.slice. Sep 12 17:50:59.411691 systemd[1]: Created slice kubepods-burstable-pod0983e9df_c3e1_467a_baa8_004bbe511d69.slice - libcontainer container kubepods-burstable-pod0983e9df_c3e1_467a_baa8_004bbe511d69.slice. 
Sep 12 17:50:59.506962 kubelet[3295]: I0912 17:50:59.506848 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a38ef781-aaef-47e2-89f1-6f4af63895e3-goldmane-ca-bundle\") pod \"goldmane-7988f88666-bmktx\" (UID: \"a38ef781-aaef-47e2-89f1-6f4af63895e3\") " pod="calico-system/goldmane-7988f88666-bmktx" Sep 12 17:50:59.506962 kubelet[3295]: I0912 17:50:59.506952 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vktz5\" (UniqueName: \"kubernetes.io/projected/eb7395a3-5fae-4a36-a604-8f98178712b4-kube-api-access-vktz5\") pod \"calico-apiserver-7c89f6dbbd-kbbhh\" (UID: \"eb7395a3-5fae-4a36-a604-8f98178712b4\") " pod="calico-apiserver/calico-apiserver-7c89f6dbbd-kbbhh" Sep 12 17:50:59.507389 kubelet[3295]: I0912 17:50:59.507039 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgdpb\" (UniqueName: \"kubernetes.io/projected/e36e7f2e-9196-4fee-b58b-1dd28a04759a-kube-api-access-kgdpb\") pod \"calico-apiserver-7c89f6dbbd-fcrhd\" (UID: \"e36e7f2e-9196-4fee-b58b-1dd28a04759a\") " pod="calico-apiserver/calico-apiserver-7c89f6dbbd-fcrhd" Sep 12 17:50:59.507389 kubelet[3295]: I0912 17:50:59.507156 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gncs2\" (UniqueName: \"kubernetes.io/projected/0983e9df-c3e1-467a-baa8-004bbe511d69-kube-api-access-gncs2\") pod \"coredns-7c65d6cfc9-44h8j\" (UID: \"0983e9df-c3e1-467a-baa8-004bbe511d69\") " pod="kube-system/coredns-7c65d6cfc9-44h8j" Sep 12 17:50:59.507389 kubelet[3295]: I0912 17:50:59.507235 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0983e9df-c3e1-467a-baa8-004bbe511d69-config-volume\") pod \"coredns-7c65d6cfc9-44h8j\" 
(UID: \"0983e9df-c3e1-467a-baa8-004bbe511d69\") " pod="kube-system/coredns-7c65d6cfc9-44h8j" Sep 12 17:50:59.507652 kubelet[3295]: I0912 17:50:59.507407 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd68r\" (UniqueName: \"kubernetes.io/projected/ce1692a5-2537-46fd-a646-e7a446075a37-kube-api-access-pd68r\") pod \"coredns-7c65d6cfc9-n6k4b\" (UID: \"ce1692a5-2537-46fd-a646-e7a446075a37\") " pod="kube-system/coredns-7c65d6cfc9-n6k4b" Sep 12 17:50:59.507652 kubelet[3295]: I0912 17:50:59.507493 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/eb7395a3-5fae-4a36-a604-8f98178712b4-calico-apiserver-certs\") pod \"calico-apiserver-7c89f6dbbd-kbbhh\" (UID: \"eb7395a3-5fae-4a36-a604-8f98178712b4\") " pod="calico-apiserver/calico-apiserver-7c89f6dbbd-kbbhh" Sep 12 17:50:59.507652 kubelet[3295]: I0912 17:50:59.507543 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99dqj\" (UniqueName: \"kubernetes.io/projected/95881986-b9ee-4892-8fba-1273c4688890-kube-api-access-99dqj\") pod \"calico-kube-controllers-5bd8b64d9c-m7tcm\" (UID: \"95881986-b9ee-4892-8fba-1273c4688890\") " pod="calico-system/calico-kube-controllers-5bd8b64d9c-m7tcm" Sep 12 17:50:59.507919 kubelet[3295]: I0912 17:50:59.507643 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a38ef781-aaef-47e2-89f1-6f4af63895e3-config\") pod \"goldmane-7988f88666-bmktx\" (UID: \"a38ef781-aaef-47e2-89f1-6f4af63895e3\") " pod="calico-system/goldmane-7988f88666-bmktx" Sep 12 17:50:59.507919 kubelet[3295]: I0912 17:50:59.507710 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p8kh\" (UniqueName: 
\"kubernetes.io/projected/a38ef781-aaef-47e2-89f1-6f4af63895e3-kube-api-access-4p8kh\") pod \"goldmane-7988f88666-bmktx\" (UID: \"a38ef781-aaef-47e2-89f1-6f4af63895e3\") " pod="calico-system/goldmane-7988f88666-bmktx" Sep 12 17:50:59.507919 kubelet[3295]: I0912 17:50:59.507776 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kflm2\" (UniqueName: \"kubernetes.io/projected/184347ef-ae58-4681-aaab-683007b08cb9-kube-api-access-kflm2\") pod \"whisker-555ccc88ff-plwcl\" (UID: \"184347ef-ae58-4681-aaab-683007b08cb9\") " pod="calico-system/whisker-555ccc88ff-plwcl" Sep 12 17:50:59.507919 kubelet[3295]: I0912 17:50:59.507842 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/a38ef781-aaef-47e2-89f1-6f4af63895e3-goldmane-key-pair\") pod \"goldmane-7988f88666-bmktx\" (UID: \"a38ef781-aaef-47e2-89f1-6f4af63895e3\") " pod="calico-system/goldmane-7988f88666-bmktx" Sep 12 17:50:59.507919 kubelet[3295]: I0912 17:50:59.507914 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/184347ef-ae58-4681-aaab-683007b08cb9-whisker-backend-key-pair\") pod \"whisker-555ccc88ff-plwcl\" (UID: \"184347ef-ae58-4681-aaab-683007b08cb9\") " pod="calico-system/whisker-555ccc88ff-plwcl" Sep 12 17:50:59.508370 kubelet[3295]: I0912 17:50:59.507976 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95881986-b9ee-4892-8fba-1273c4688890-tigera-ca-bundle\") pod \"calico-kube-controllers-5bd8b64d9c-m7tcm\" (UID: \"95881986-b9ee-4892-8fba-1273c4688890\") " pod="calico-system/calico-kube-controllers-5bd8b64d9c-m7tcm" Sep 12 17:50:59.508370 kubelet[3295]: I0912 17:50:59.508033 3295 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/184347ef-ae58-4681-aaab-683007b08cb9-whisker-ca-bundle\") pod \"whisker-555ccc88ff-plwcl\" (UID: \"184347ef-ae58-4681-aaab-683007b08cb9\") " pod="calico-system/whisker-555ccc88ff-plwcl" Sep 12 17:50:59.508370 kubelet[3295]: I0912 17:50:59.508262 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce1692a5-2537-46fd-a646-e7a446075a37-config-volume\") pod \"coredns-7c65d6cfc9-n6k4b\" (UID: \"ce1692a5-2537-46fd-a646-e7a446075a37\") " pod="kube-system/coredns-7c65d6cfc9-n6k4b" Sep 12 17:50:59.508370 kubelet[3295]: I0912 17:50:59.508346 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e36e7f2e-9196-4fee-b58b-1dd28a04759a-calico-apiserver-certs\") pod \"calico-apiserver-7c89f6dbbd-fcrhd\" (UID: \"e36e7f2e-9196-4fee-b58b-1dd28a04759a\") " pod="calico-apiserver/calico-apiserver-7c89f6dbbd-fcrhd" Sep 12 17:50:59.710941 containerd[1930]: time="2025-09-12T17:50:59.710890688Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c89f6dbbd-kbbhh,Uid:eb7395a3-5fae-4a36-a604-8f98178712b4,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:50:59.714444 containerd[1930]: time="2025-09-12T17:50:59.714423071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-44h8j,Uid:0983e9df-c3e1-467a-baa8-004bbe511d69,Namespace:kube-system,Attempt:0,}" Sep 12 17:50:59.736800 containerd[1930]: time="2025-09-12T17:50:59.736770863Z" level=error msg="Failed to destroy network for sandbox \"1a9d88c1100e254b0f01935f0f43e27aed574f20fa2e05cf22232470494241a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Sep 12 17:50:59.737247 containerd[1930]: time="2025-09-12T17:50:59.737227453Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c89f6dbbd-kbbhh,Uid:eb7395a3-5fae-4a36-a604-8f98178712b4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a9d88c1100e254b0f01935f0f43e27aed574f20fa2e05cf22232470494241a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:50:59.737411 kubelet[3295]: E0912 17:50:59.737379 3295 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a9d88c1100e254b0f01935f0f43e27aed574f20fa2e05cf22232470494241a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:50:59.737474 kubelet[3295]: E0912 17:50:59.737437 3295 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a9d88c1100e254b0f01935f0f43e27aed574f20fa2e05cf22232470494241a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c89f6dbbd-kbbhh" Sep 12 17:50:59.737474 kubelet[3295]: E0912 17:50:59.737456 3295 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a9d88c1100e254b0f01935f0f43e27aed574f20fa2e05cf22232470494241a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c89f6dbbd-kbbhh" Sep 12 17:50:59.737536 kubelet[3295]: E0912 17:50:59.737495 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7c89f6dbbd-kbbhh_calico-apiserver(eb7395a3-5fae-4a36-a604-8f98178712b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7c89f6dbbd-kbbhh_calico-apiserver(eb7395a3-5fae-4a36-a604-8f98178712b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1a9d88c1100e254b0f01935f0f43e27aed574f20fa2e05cf22232470494241a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7c89f6dbbd-kbbhh" podUID="eb7395a3-5fae-4a36-a604-8f98178712b4" Sep 12 17:50:59.738250 systemd[1]: run-netns-cni\x2df7338f4b\x2d390b\x2d5799\x2da97a\x2dd0eac196f790.mount: Deactivated successfully. 
Sep 12 17:50:59.740211 containerd[1930]: time="2025-09-12T17:50:59.740181368Z" level=error msg="Failed to destroy network for sandbox \"7f435873c41caf74b7f5546804ed4430e13d02279ec75c0ca40c1f81bbd61b0b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:50:59.740657 containerd[1930]: time="2025-09-12T17:50:59.740640654Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-44h8j,Uid:0983e9df-c3e1-467a-baa8-004bbe511d69,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f435873c41caf74b7f5546804ed4430e13d02279ec75c0ca40c1f81bbd61b0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:50:59.740765 kubelet[3295]: E0912 17:50:59.740744 3295 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f435873c41caf74b7f5546804ed4430e13d02279ec75c0ca40c1f81bbd61b0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:50:59.740812 kubelet[3295]: E0912 17:50:59.740781 3295 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f435873c41caf74b7f5546804ed4430e13d02279ec75c0ca40c1f81bbd61b0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-44h8j" Sep 12 17:50:59.740812 kubelet[3295]: E0912 17:50:59.740794 3295 kuberuntime_manager.go:1170] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f435873c41caf74b7f5546804ed4430e13d02279ec75c0ca40c1f81bbd61b0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-44h8j" Sep 12 17:50:59.740852 kubelet[3295]: E0912 17:50:59.740817 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-44h8j_kube-system(0983e9df-c3e1-467a-baa8-004bbe511d69)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-44h8j_kube-system(0983e9df-c3e1-467a-baa8-004bbe511d69)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7f435873c41caf74b7f5546804ed4430e13d02279ec75c0ca40c1f81bbd61b0b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-44h8j" podUID="0983e9df-c3e1-467a-baa8-004bbe511d69" Sep 12 17:50:59.741633 systemd[1]: run-netns-cni\x2d040ab9c1\x2d1716\x2d82c2\x2d0ffe\x2dabf436a484f1.mount: Deactivated successfully. 
Sep 12 17:50:59.985938 containerd[1930]: time="2025-09-12T17:50:59.985742977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c89f6dbbd-fcrhd,Uid:e36e7f2e-9196-4fee-b58b-1dd28a04759a,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:50:59.991370 containerd[1930]: time="2025-09-12T17:50:59.991350493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-555ccc88ff-plwcl,Uid:184347ef-ae58-4681-aaab-683007b08cb9,Namespace:calico-system,Attempt:0,}" Sep 12 17:50:59.995874 containerd[1930]: time="2025-09-12T17:50:59.995853452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bd8b64d9c-m7tcm,Uid:95881986-b9ee-4892-8fba-1273c4688890,Namespace:calico-system,Attempt:0,}" Sep 12 17:51:00.001421 containerd[1930]: time="2025-09-12T17:51:00.001396839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-n6k4b,Uid:ce1692a5-2537-46fd-a646-e7a446075a37,Namespace:kube-system,Attempt:0,}" Sep 12 17:51:00.005857 containerd[1930]: time="2025-09-12T17:51:00.005833351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-bmktx,Uid:a38ef781-aaef-47e2-89f1-6f4af63895e3,Namespace:calico-system,Attempt:0,}" Sep 12 17:51:00.014541 containerd[1930]: time="2025-09-12T17:51:00.014512316Z" level=error msg="Failed to destroy network for sandbox \"6a9b4cf0016cb21b78ac1221c86d40061accc360ff7612b031cec557c0e795d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.018695 containerd[1930]: time="2025-09-12T17:51:00.018644192Z" level=error msg="Failed to destroy network for sandbox \"030010283647e6d4473943f5d8c7a184df6ee8b9686fcb4f26642e914c72a8d7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 12 17:51:00.022583 containerd[1930]: time="2025-09-12T17:51:00.022550307Z" level=error msg="Failed to destroy network for sandbox \"02b294ef50661a34097bf5be5323b84bdc57c6cbdd89a25aa12fef3e0fb3dc77\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.025408 containerd[1930]: time="2025-09-12T17:51:00.025377251Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c89f6dbbd-fcrhd,Uid:e36e7f2e-9196-4fee-b58b-1dd28a04759a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a9b4cf0016cb21b78ac1221c86d40061accc360ff7612b031cec557c0e795d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.025565 kubelet[3295]: E0912 17:51:00.025544 3295 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a9b4cf0016cb21b78ac1221c86d40061accc360ff7612b031cec557c0e795d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.025601 kubelet[3295]: E0912 17:51:00.025581 3295 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a9b4cf0016cb21b78ac1221c86d40061accc360ff7612b031cec557c0e795d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c89f6dbbd-fcrhd" Sep 12 17:51:00.025601 kubelet[3295]: E0912 
17:51:00.025595 3295 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a9b4cf0016cb21b78ac1221c86d40061accc360ff7612b031cec557c0e795d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c89f6dbbd-fcrhd" Sep 12 17:51:00.025644 kubelet[3295]: E0912 17:51:00.025622 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7c89f6dbbd-fcrhd_calico-apiserver(e36e7f2e-9196-4fee-b58b-1dd28a04759a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7c89f6dbbd-fcrhd_calico-apiserver(e36e7f2e-9196-4fee-b58b-1dd28a04759a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a9b4cf0016cb21b78ac1221c86d40061accc360ff7612b031cec557c0e795d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7c89f6dbbd-fcrhd" podUID="e36e7f2e-9196-4fee-b58b-1dd28a04759a" Sep 12 17:51:00.025814 containerd[1930]: time="2025-09-12T17:51:00.025794965Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-555ccc88ff-plwcl,Uid:184347ef-ae58-4681-aaab-683007b08cb9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"030010283647e6d4473943f5d8c7a184df6ee8b9686fcb4f26642e914c72a8d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.025882 kubelet[3295]: E0912 17:51:00.025870 3295 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"030010283647e6d4473943f5d8c7a184df6ee8b9686fcb4f26642e914c72a8d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.025914 kubelet[3295]: E0912 17:51:00.025888 3295 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"030010283647e6d4473943f5d8c7a184df6ee8b9686fcb4f26642e914c72a8d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-555ccc88ff-plwcl" Sep 12 17:51:00.025914 kubelet[3295]: E0912 17:51:00.025898 3295 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"030010283647e6d4473943f5d8c7a184df6ee8b9686fcb4f26642e914c72a8d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-555ccc88ff-plwcl" Sep 12 17:51:00.025952 kubelet[3295]: E0912 17:51:00.025929 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-555ccc88ff-plwcl_calico-system(184347ef-ae58-4681-aaab-683007b08cb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-555ccc88ff-plwcl_calico-system(184347ef-ae58-4681-aaab-683007b08cb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"030010283647e6d4473943f5d8c7a184df6ee8b9686fcb4f26642e914c72a8d7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/whisker-555ccc88ff-plwcl" podUID="184347ef-ae58-4681-aaab-683007b08cb9" Sep 12 17:51:00.026122 containerd[1930]: time="2025-09-12T17:51:00.026093345Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bd8b64d9c-m7tcm,Uid:95881986-b9ee-4892-8fba-1273c4688890,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"02b294ef50661a34097bf5be5323b84bdc57c6cbdd89a25aa12fef3e0fb3dc77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.026189 kubelet[3295]: E0912 17:51:00.026176 3295 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02b294ef50661a34097bf5be5323b84bdc57c6cbdd89a25aa12fef3e0fb3dc77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.026224 kubelet[3295]: E0912 17:51:00.026193 3295 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02b294ef50661a34097bf5be5323b84bdc57c6cbdd89a25aa12fef3e0fb3dc77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5bd8b64d9c-m7tcm" Sep 12 17:51:00.026224 kubelet[3295]: E0912 17:51:00.026202 3295 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02b294ef50661a34097bf5be5323b84bdc57c6cbdd89a25aa12fef3e0fb3dc77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5bd8b64d9c-m7tcm" Sep 12 17:51:00.026224 kubelet[3295]: E0912 17:51:00.026217 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5bd8b64d9c-m7tcm_calico-system(95881986-b9ee-4892-8fba-1273c4688890)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5bd8b64d9c-m7tcm_calico-system(95881986-b9ee-4892-8fba-1273c4688890)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02b294ef50661a34097bf5be5323b84bdc57c6cbdd89a25aa12fef3e0fb3dc77\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5bd8b64d9c-m7tcm" podUID="95881986-b9ee-4892-8fba-1273c4688890" Sep 12 17:51:00.032515 containerd[1930]: time="2025-09-12T17:51:00.032484744Z" level=error msg="Failed to destroy network for sandbox \"d784b0c13af093958da9e2b1293d7f99e6e068a6de5f161d624ce95241ab33e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.032979 containerd[1930]: time="2025-09-12T17:51:00.032963193Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-n6k4b,Uid:ce1692a5-2537-46fd-a646-e7a446075a37,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d784b0c13af093958da9e2b1293d7f99e6e068a6de5f161d624ce95241ab33e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 
17:51:00.033161 kubelet[3295]: E0912 17:51:00.033104 3295 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d784b0c13af093958da9e2b1293d7f99e6e068a6de5f161d624ce95241ab33e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.033161 kubelet[3295]: E0912 17:51:00.033150 3295 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d784b0c13af093958da9e2b1293d7f99e6e068a6de5f161d624ce95241ab33e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-n6k4b" Sep 12 17:51:00.033212 kubelet[3295]: E0912 17:51:00.033162 3295 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d784b0c13af093958da9e2b1293d7f99e6e068a6de5f161d624ce95241ab33e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-n6k4b" Sep 12 17:51:00.033212 kubelet[3295]: E0912 17:51:00.033187 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-n6k4b_kube-system(ce1692a5-2537-46fd-a646-e7a446075a37)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-n6k4b_kube-system(ce1692a5-2537-46fd-a646-e7a446075a37)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d784b0c13af093958da9e2b1293d7f99e6e068a6de5f161d624ce95241ab33e5\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-n6k4b" podUID="ce1692a5-2537-46fd-a646-e7a446075a37" Sep 12 17:51:00.033391 containerd[1930]: time="2025-09-12T17:51:00.033333958Z" level=error msg="Failed to destroy network for sandbox \"d00f82fbb0a40fb36df27c166591ffd58dcb36a76277554e1698ab6b9b1b9e1d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.033717 containerd[1930]: time="2025-09-12T17:51:00.033673524Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-bmktx,Uid:a38ef781-aaef-47e2-89f1-6f4af63895e3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d00f82fbb0a40fb36df27c166591ffd58dcb36a76277554e1698ab6b9b1b9e1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.033805 kubelet[3295]: E0912 17:51:00.033765 3295 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d00f82fbb0a40fb36df27c166591ffd58dcb36a76277554e1698ab6b9b1b9e1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:00.033805 kubelet[3295]: E0912 17:51:00.033792 3295 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d00f82fbb0a40fb36df27c166591ffd58dcb36a76277554e1698ab6b9b1b9e1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-bmktx" Sep 12 17:51:00.033876 kubelet[3295]: E0912 17:51:00.033802 3295 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d00f82fbb0a40fb36df27c166591ffd58dcb36a76277554e1698ab6b9b1b9e1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-bmktx" Sep 12 17:51:00.033876 kubelet[3295]: E0912 17:51:00.033824 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-bmktx_calico-system(a38ef781-aaef-47e2-89f1-6f4af63895e3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-bmktx_calico-system(a38ef781-aaef-47e2-89f1-6f4af63895e3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d00f82fbb0a40fb36df27c166591ffd58dcb36a76277554e1698ab6b9b1b9e1d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-bmktx" podUID="a38ef781-aaef-47e2-89f1-6f4af63895e3" Sep 12 17:51:00.429631 containerd[1930]: time="2025-09-12T17:51:00.429552251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 17:51:00.728016 systemd[1]: run-netns-cni\x2d8f095966\x2ddb10\x2dae30\x2dbf4f\x2d39812bc8c7e9.mount: Deactivated successfully. Sep 12 17:51:00.728069 systemd[1]: run-netns-cni\x2d336b26b1\x2decb5\x2d8967\x2d6929\x2d5b3b165297ff.mount: Deactivated successfully. 
Sep 12 17:51:01.341464 systemd[1]: Created slice kubepods-besteffort-pod86dbef57_8f65_4cc3_ae59_c02f1dffe403.slice - libcontainer container kubepods-besteffort-pod86dbef57_8f65_4cc3_ae59_c02f1dffe403.slice. Sep 12 17:51:01.343005 containerd[1930]: time="2025-09-12T17:51:01.342940457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8vfbh,Uid:86dbef57-8f65-4cc3-ae59-c02f1dffe403,Namespace:calico-system,Attempt:0,}" Sep 12 17:51:01.368100 containerd[1930]: time="2025-09-12T17:51:01.368045985Z" level=error msg="Failed to destroy network for sandbox \"85b92cf5c2f7543920c26eb823589f0c930f1726767910e8ccb59fef37ffb273\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:01.368825 containerd[1930]: time="2025-09-12T17:51:01.368779216Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8vfbh,Uid:86dbef57-8f65-4cc3-ae59-c02f1dffe403,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"85b92cf5c2f7543920c26eb823589f0c930f1726767910e8ccb59fef37ffb273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:01.369011 kubelet[3295]: E0912 17:51:01.368990 3295 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85b92cf5c2f7543920c26eb823589f0c930f1726767910e8ccb59fef37ffb273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:51:01.369217 kubelet[3295]: E0912 17:51:01.369027 3295 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"85b92cf5c2f7543920c26eb823589f0c930f1726767910e8ccb59fef37ffb273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8vfbh" Sep 12 17:51:01.369217 kubelet[3295]: E0912 17:51:01.369043 3295 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85b92cf5c2f7543920c26eb823589f0c930f1726767910e8ccb59fef37ffb273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8vfbh" Sep 12 17:51:01.369217 kubelet[3295]: E0912 17:51:01.369071 3295 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8vfbh_calico-system(86dbef57-8f65-4cc3-ae59-c02f1dffe403)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8vfbh_calico-system(86dbef57-8f65-4cc3-ae59-c02f1dffe403)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"85b92cf5c2f7543920c26eb823589f0c930f1726767910e8ccb59fef37ffb273\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8vfbh" podUID="86dbef57-8f65-4cc3-ae59-c02f1dffe403" Sep 12 17:51:01.369774 systemd[1]: run-netns-cni\x2d561731d7\x2dc9e7\x2d3359\x2d78e2\x2df0b626a51138.mount: Deactivated successfully. 
Sep 12 17:51:02.795613 kubelet[3295]: I0912 17:51:02.795577 3295 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:51:05.556423 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1380214504.mount: Deactivated successfully. Sep 12 17:51:05.575337 containerd[1930]: time="2025-09-12T17:51:05.575290807Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:05.575592 containerd[1930]: time="2025-09-12T17:51:05.575539857Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 17:51:05.575880 containerd[1930]: time="2025-09-12T17:51:05.575828005Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:05.576610 containerd[1930]: time="2025-09-12T17:51:05.576570558Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:05.576961 containerd[1930]: time="2025-09-12T17:51:05.576920556Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.14731048s" Sep 12 17:51:05.576961 containerd[1930]: time="2025-09-12T17:51:05.576936020Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 17:51:05.580507 containerd[1930]: time="2025-09-12T17:51:05.580487765Z" level=info 
msg="CreateContainer within sandbox \"509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:51:05.585004 containerd[1930]: time="2025-09-12T17:51:05.584988048Z" level=info msg="Container 6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:05.589514 containerd[1930]: time="2025-09-12T17:51:05.589468414Z" level=info msg="CreateContainer within sandbox \"509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\"" Sep 12 17:51:05.589701 containerd[1930]: time="2025-09-12T17:51:05.589690871Z" level=info msg="StartContainer for \"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\"" Sep 12 17:51:05.590438 containerd[1930]: time="2025-09-12T17:51:05.590417559Z" level=info msg="connecting to shim 6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e" address="unix:///run/containerd/s/db1125c6d0ef3ce4baaae9e0498a00bb4cbc17e67b36ff50c2dad96660e534fd" protocol=ttrpc version=3 Sep 12 17:51:05.613327 systemd[1]: Started cri-containerd-6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e.scope - libcontainer container 6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e. Sep 12 17:51:05.640724 containerd[1930]: time="2025-09-12T17:51:05.640666634Z" level=info msg="StartContainer for \"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" returns successfully" Sep 12 17:51:05.704324 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 17:51:05.704593 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 12 17:51:05.750881 kubelet[3295]: I0912 17:51:05.750851 3295 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kflm2\" (UniqueName: \"kubernetes.io/projected/184347ef-ae58-4681-aaab-683007b08cb9-kube-api-access-kflm2\") pod \"184347ef-ae58-4681-aaab-683007b08cb9\" (UID: \"184347ef-ae58-4681-aaab-683007b08cb9\") " Sep 12 17:51:05.751160 kubelet[3295]: I0912 17:51:05.750892 3295 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/184347ef-ae58-4681-aaab-683007b08cb9-whisker-ca-bundle\") pod \"184347ef-ae58-4681-aaab-683007b08cb9\" (UID: \"184347ef-ae58-4681-aaab-683007b08cb9\") " Sep 12 17:51:05.751160 kubelet[3295]: I0912 17:51:05.750919 3295 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/184347ef-ae58-4681-aaab-683007b08cb9-whisker-backend-key-pair\") pod \"184347ef-ae58-4681-aaab-683007b08cb9\" (UID: \"184347ef-ae58-4681-aaab-683007b08cb9\") " Sep 12 17:51:05.751216 kubelet[3295]: I0912 17:51:05.751200 3295 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184347ef-ae58-4681-aaab-683007b08cb9-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "184347ef-ae58-4681-aaab-683007b08cb9" (UID: "184347ef-ae58-4681-aaab-683007b08cb9"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 17:51:05.752457 kubelet[3295]: I0912 17:51:05.752413 3295 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184347ef-ae58-4681-aaab-683007b08cb9-kube-api-access-kflm2" (OuterVolumeSpecName: "kube-api-access-kflm2") pod "184347ef-ae58-4681-aaab-683007b08cb9" (UID: "184347ef-ae58-4681-aaab-683007b08cb9"). InnerVolumeSpecName "kube-api-access-kflm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 17:51:05.752457 kubelet[3295]: I0912 17:51:05.752411 3295 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184347ef-ae58-4681-aaab-683007b08cb9-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "184347ef-ae58-4681-aaab-683007b08cb9" (UID: "184347ef-ae58-4681-aaab-683007b08cb9"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 17:51:05.851522 kubelet[3295]: I0912 17:51:05.851320 3295 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kflm2\" (UniqueName: \"kubernetes.io/projected/184347ef-ae58-4681-aaab-683007b08cb9-kube-api-access-kflm2\") on node \"ci-4426.1.0-a-b1d4eb1a76\" DevicePath \"\"" Sep 12 17:51:05.851522 kubelet[3295]: I0912 17:51:05.851387 3295 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/184347ef-ae58-4681-aaab-683007b08cb9-whisker-ca-bundle\") on node \"ci-4426.1.0-a-b1d4eb1a76\" DevicePath \"\"" Sep 12 17:51:05.851522 kubelet[3295]: I0912 17:51:05.851417 3295 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/184347ef-ae58-4681-aaab-683007b08cb9-whisker-backend-key-pair\") on node \"ci-4426.1.0-a-b1d4eb1a76\" DevicePath \"\"" Sep 12 17:51:06.458401 systemd[1]: Removed slice kubepods-besteffort-pod184347ef_ae58_4681_aaab_683007b08cb9.slice - libcontainer container kubepods-besteffort-pod184347ef_ae58_4681_aaab_683007b08cb9.slice. 
Sep 12 17:51:06.481139 kubelet[3295]: I0912 17:51:06.480704 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-sd9ss" podStartSLOduration=1.551174654 podStartE2EDuration="15.480665917s" podCreationTimestamp="2025-09-12 17:50:51 +0000 UTC" firstStartedPulling="2025-09-12 17:50:51.647812455 +0000 UTC m=+16.393664846" lastFinishedPulling="2025-09-12 17:51:05.577303722 +0000 UTC m=+30.323156109" observedRunningTime="2025-09-12 17:51:06.479699674 +0000 UTC m=+31.225552116" watchObservedRunningTime="2025-09-12 17:51:06.480665917 +0000 UTC m=+31.226518344" Sep 12 17:51:06.502696 systemd[1]: Created slice kubepods-besteffort-podf8074f58_08c0_43e5_ba04_73ad74d7b2df.slice - libcontainer container kubepods-besteffort-podf8074f58_08c0_43e5_ba04_73ad74d7b2df.slice. Sep 12 17:51:06.556792 kubelet[3295]: I0912 17:51:06.556702 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkw4m\" (UniqueName: \"kubernetes.io/projected/f8074f58-08c0-43e5-ba04-73ad74d7b2df-kube-api-access-fkw4m\") pod \"whisker-7bd5fbbb6-c5gkf\" (UID: \"f8074f58-08c0-43e5-ba04-73ad74d7b2df\") " pod="calico-system/whisker-7bd5fbbb6-c5gkf" Sep 12 17:51:06.557020 kubelet[3295]: I0912 17:51:06.556874 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8074f58-08c0-43e5-ba04-73ad74d7b2df-whisker-ca-bundle\") pod \"whisker-7bd5fbbb6-c5gkf\" (UID: \"f8074f58-08c0-43e5-ba04-73ad74d7b2df\") " pod="calico-system/whisker-7bd5fbbb6-c5gkf" Sep 12 17:51:06.557020 kubelet[3295]: I0912 17:51:06.556968 3295 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f8074f58-08c0-43e5-ba04-73ad74d7b2df-whisker-backend-key-pair\") pod \"whisker-7bd5fbbb6-c5gkf\" (UID: 
\"f8074f58-08c0-43e5-ba04-73ad74d7b2df\") " pod="calico-system/whisker-7bd5fbbb6-c5gkf" Sep 12 17:51:06.564091 systemd[1]: var-lib-kubelet-pods-184347ef\x2dae58\x2d4681\x2daaab\x2d683007b08cb9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dkflm2.mount: Deactivated successfully. Sep 12 17:51:06.564373 systemd[1]: var-lib-kubelet-pods-184347ef\x2dae58\x2d4681\x2daaab\x2d683007b08cb9-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 17:51:06.805643 containerd[1930]: time="2025-09-12T17:51:06.805547947Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bd5fbbb6-c5gkf,Uid:f8074f58-08c0-43e5-ba04-73ad74d7b2df,Namespace:calico-system,Attempt:0,}" Sep 12 17:51:06.876867 systemd-networkd[1838]: calib93f617429c: Link UP Sep 12 17:51:06.877127 systemd-networkd[1838]: calib93f617429c: Gained carrier Sep 12 17:51:06.883394 containerd[1930]: 2025-09-12 17:51:06.826 [INFO][4693] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:51:06.883394 containerd[1930]: 2025-09-12 17:51:06.835 [INFO][4693] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0 whisker-7bd5fbbb6- calico-system f8074f58-08c0-43e5-ba04-73ad74d7b2df 856 0 2025-09-12 17:51:06 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7bd5fbbb6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426.1.0-a-b1d4eb1a76 whisker-7bd5fbbb6-c5gkf eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib93f617429c [] [] }} ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Namespace="calico-system" Pod="whisker-7bd5fbbb6-c5gkf" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-" Sep 12 17:51:06.883394 containerd[1930]: 
2025-09-12 17:51:06.835 [INFO][4693] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Namespace="calico-system" Pod="whisker-7bd5fbbb6-c5gkf" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" Sep 12 17:51:06.883394 containerd[1930]: 2025-09-12 17:51:06.850 [INFO][4828] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" HandleID="k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.850 [INFO][4828] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" HandleID="k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a56a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-b1d4eb1a76", "pod":"whisker-7bd5fbbb6-c5gkf", "timestamp":"2025-09-12 17:51:06.850092653 +0000 UTC"}, Hostname:"ci-4426.1.0-a-b1d4eb1a76", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.850 [INFO][4828] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.850 [INFO][4828] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.850 [INFO][4828] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-b1d4eb1a76' Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.855 [INFO][4828] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.859 [INFO][4828] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.861 [INFO][4828] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.862 [INFO][4828] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883566 containerd[1930]: 2025-09-12 17:51:06.863 [INFO][4828] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883702 containerd[1930]: 2025-09-12 17:51:06.863 [INFO][4828] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883702 containerd[1930]: 2025-09-12 17:51:06.864 [INFO][4828] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1 Sep 12 17:51:06.883702 containerd[1930]: 2025-09-12 17:51:06.866 [INFO][4828] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883702 containerd[1930]: 2025-09-12 17:51:06.870 [INFO][4828] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.193/26] block=192.168.61.192/26 handle="k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883702 containerd[1930]: 2025-09-12 17:51:06.870 [INFO][4828] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.193/26] handle="k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:06.883702 containerd[1930]: 2025-09-12 17:51:06.870 [INFO][4828] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:51:06.883702 containerd[1930]: 2025-09-12 17:51:06.870 [INFO][4828] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.193/26] IPv6=[] ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" HandleID="k8s-pod-network.de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" Sep 12 17:51:06.883798 containerd[1930]: 2025-09-12 17:51:06.872 [INFO][4693] cni-plugin/k8s.go 418: Populated endpoint ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Namespace="calico-system" Pod="whisker-7bd5fbbb6-c5gkf" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0", GenerateName:"whisker-7bd5fbbb6-", Namespace:"calico-system", SelfLink:"", UID:"f8074f58-08c0-43e5-ba04-73ad74d7b2df", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 51, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bd5fbbb6", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"", Pod:"whisker-7bd5fbbb6-c5gkf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib93f617429c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:06.883798 containerd[1930]: 2025-09-12 17:51:06.872 [INFO][4693] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.193/32] ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Namespace="calico-system" Pod="whisker-7bd5fbbb6-c5gkf" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" Sep 12 17:51:06.883850 containerd[1930]: 2025-09-12 17:51:06.872 [INFO][4693] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib93f617429c ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Namespace="calico-system" Pod="whisker-7bd5fbbb6-c5gkf" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" Sep 12 17:51:06.883850 containerd[1930]: 2025-09-12 17:51:06.877 [INFO][4693] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Namespace="calico-system" Pod="whisker-7bd5fbbb6-c5gkf" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" Sep 12 17:51:06.883883 containerd[1930]: 2025-09-12 17:51:06.877 [INFO][4693] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" Namespace="calico-system" Pod="whisker-7bd5fbbb6-c5gkf" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0", GenerateName:"whisker-7bd5fbbb6-", Namespace:"calico-system", SelfLink:"", UID:"f8074f58-08c0-43e5-ba04-73ad74d7b2df", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 51, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bd5fbbb6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1", Pod:"whisker-7bd5fbbb6-c5gkf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib93f617429c", MAC:"62:f2:0d:e0:b0:76", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:06.883920 containerd[1930]: 2025-09-12 17:51:06.882 [INFO][4693] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" 
Namespace="calico-system" Pod="whisker-7bd5fbbb6-c5gkf" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-whisker--7bd5fbbb6--c5gkf-eth0" Sep 12 17:51:06.892675 containerd[1930]: time="2025-09-12T17:51:06.892651582Z" level=info msg="connecting to shim de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1" address="unix:///run/containerd/s/1721f654edc6288e405c72f3773d9043934d52e75234d79544661a6ee58522c7" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:06.914561 systemd[1]: Started cri-containerd-de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1.scope - libcontainer container de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1. Sep 12 17:51:07.003190 containerd[1930]: time="2025-09-12T17:51:07.003169016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bd5fbbb6-c5gkf,Uid:f8074f58-08c0-43e5-ba04-73ad74d7b2df,Namespace:calico-system,Attempt:0,} returns sandbox id \"de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1\"" Sep 12 17:51:07.003923 containerd[1930]: time="2025-09-12T17:51:07.003912035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 17:51:07.167864 systemd-networkd[1838]: vxlan.calico: Link UP Sep 12 17:51:07.167867 systemd-networkd[1838]: vxlan.calico: Gained carrier Sep 12 17:51:07.339729 kubelet[3295]: I0912 17:51:07.339708 3295 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184347ef-ae58-4681-aaab-683007b08cb9" path="/var/lib/kubelet/pods/184347ef-ae58-4681-aaab-683007b08cb9/volumes" Sep 12 17:51:07.500793 containerd[1930]: time="2025-09-12T17:51:07.500725385Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"99dcbffb2ab178c9fab643ac921d9731404b5764670657f149963b715026a315\" pid:5042 exit_status:1 exited_at:{seconds:1757699467 nanos:500535277}" Sep 12 17:51:08.415275 systemd-networkd[1838]: calib93f617429c: Gained IPv6LL Sep 12 
17:51:08.511096 containerd[1930]: time="2025-09-12T17:51:08.511068677Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"908b3ee0500aebe8194fbd6856548209f4e82d73279a360a5c5b43c6556d03f0\" pid:5087 exit_status:1 exited_at:{seconds:1757699468 nanos:510909516}" Sep 12 17:51:08.786161 containerd[1930]: time="2025-09-12T17:51:08.786055727Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:08.786611 containerd[1930]: time="2025-09-12T17:51:08.786568196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 17:51:08.787251 containerd[1930]: time="2025-09-12T17:51:08.787230140Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:08.788454 containerd[1930]: time="2025-09-12T17:51:08.788411776Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:08.788854 containerd[1930]: time="2025-09-12T17:51:08.788804631Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.784873971s" Sep 12 17:51:08.788854 containerd[1930]: time="2025-09-12T17:51:08.788820379Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference 
\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 17:51:08.790036 containerd[1930]: time="2025-09-12T17:51:08.790025203Z" level=info msg="CreateContainer within sandbox \"de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 17:51:08.792469 containerd[1930]: time="2025-09-12T17:51:08.792452882Z" level=info msg="Container 7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:08.795318 containerd[1930]: time="2025-09-12T17:51:08.795302556Z" level=info msg="CreateContainer within sandbox \"de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8\"" Sep 12 17:51:08.795595 containerd[1930]: time="2025-09-12T17:51:08.795548675Z" level=info msg="StartContainer for \"7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8\"" Sep 12 17:51:08.796279 containerd[1930]: time="2025-09-12T17:51:08.796262183Z" level=info msg="connecting to shim 7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8" address="unix:///run/containerd/s/1721f654edc6288e405c72f3773d9043934d52e75234d79544661a6ee58522c7" protocol=ttrpc version=3 Sep 12 17:51:08.799269 systemd-networkd[1838]: vxlan.calico: Gained IPv6LL Sep 12 17:51:08.812281 systemd[1]: Started cri-containerd-7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8.scope - libcontainer container 7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8. 
Sep 12 17:51:08.840865 containerd[1930]: time="2025-09-12T17:51:08.840841820Z" level=info msg="StartContainer for \"7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8\" returns successfully" Sep 12 17:51:08.841365 containerd[1930]: time="2025-09-12T17:51:08.841352435Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 17:51:10.338479 containerd[1930]: time="2025-09-12T17:51:10.338423669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c89f6dbbd-kbbhh,Uid:eb7395a3-5fae-4a36-a604-8f98178712b4,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:51:10.395912 systemd-networkd[1838]: cali8424db918f2: Link UP Sep 12 17:51:10.396034 systemd-networkd[1838]: cali8424db918f2: Gained carrier Sep 12 17:51:10.401852 containerd[1930]: 2025-09-12 17:51:10.361 [INFO][5164] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0 calico-apiserver-7c89f6dbbd- calico-apiserver eb7395a3-5fae-4a36-a604-8f98178712b4 785 0 2025-09-12 17:50:49 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7c89f6dbbd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.1.0-a-b1d4eb1a76 calico-apiserver-7c89f6dbbd-kbbhh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8424db918f2 [] [] }} ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-kbbhh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-" Sep 12 17:51:10.401852 containerd[1930]: 2025-09-12 17:51:10.361 [INFO][5164] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-kbbhh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" Sep 12 17:51:10.401852 containerd[1930]: 2025-09-12 17:51:10.374 [INFO][5184] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" HandleID="k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.375 [INFO][5184] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" HandleID="k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00042a700), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.1.0-a-b1d4eb1a76", "pod":"calico-apiserver-7c89f6dbbd-kbbhh", "timestamp":"2025-09-12 17:51:10.374958899 +0000 UTC"}, Hostname:"ci-4426.1.0-a-b1d4eb1a76", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.375 [INFO][5184] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.375 [INFO][5184] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.375 [INFO][5184] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-b1d4eb1a76' Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.379 [INFO][5184] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.383 [INFO][5184] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.385 [INFO][5184] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.386 [INFO][5184] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.401979 containerd[1930]: 2025-09-12 17:51:10.387 [INFO][5184] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.402121 containerd[1930]: 2025-09-12 17:51:10.387 [INFO][5184] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.402121 containerd[1930]: 2025-09-12 17:51:10.388 [INFO][5184] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb Sep 12 17:51:10.402121 containerd[1930]: 2025-09-12 17:51:10.390 [INFO][5184] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.402121 containerd[1930]: 2025-09-12 17:51:10.393 [INFO][5184] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.194/26] block=192.168.61.192/26 handle="k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.402121 containerd[1930]: 2025-09-12 17:51:10.393 [INFO][5184] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.194/26] handle="k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:10.402121 containerd[1930]: 2025-09-12 17:51:10.393 [INFO][5184] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:51:10.402121 containerd[1930]: 2025-09-12 17:51:10.393 [INFO][5184] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.194/26] IPv6=[] ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" HandleID="k8s-pod-network.213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" Sep 12 17:51:10.402231 containerd[1930]: 2025-09-12 17:51:10.394 [INFO][5164] cni-plugin/k8s.go 418: Populated endpoint ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-kbbhh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0", GenerateName:"calico-apiserver-7c89f6dbbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"eb7395a3-5fae-4a36-a604-8f98178712b4", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c89f6dbbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"", Pod:"calico-apiserver-7c89f6dbbd-kbbhh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8424db918f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:10.402272 containerd[1930]: 2025-09-12 17:51:10.394 [INFO][5164] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.194/32] ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-kbbhh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" Sep 12 17:51:10.402272 containerd[1930]: 2025-09-12 17:51:10.395 [INFO][5164] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8424db918f2 ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-kbbhh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" Sep 12 17:51:10.402272 containerd[1930]: 2025-09-12 17:51:10.396 [INFO][5164] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Namespace="calico-apiserver" 
Pod="calico-apiserver-7c89f6dbbd-kbbhh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" Sep 12 17:51:10.402314 containerd[1930]: 2025-09-12 17:51:10.396 [INFO][5164] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-kbbhh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0", GenerateName:"calico-apiserver-7c89f6dbbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"eb7395a3-5fae-4a36-a604-8f98178712b4", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c89f6dbbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb", Pod:"calico-apiserver-7c89f6dbbd-kbbhh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali8424db918f2", MAC:"0a:eb:53:3d:5a:7b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:10.402352 containerd[1930]: 2025-09-12 17:51:10.400 [INFO][5164] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-kbbhh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--kbbhh-eth0" Sep 12 17:51:10.480745 containerd[1930]: time="2025-09-12T17:51:10.480636040Z" level=info msg="connecting to shim 213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb" address="unix:///run/containerd/s/37e97b9f8b2a0cf224eecbc39f3ea0bc47af8216f4cb4898b01ec3f2dfa0e91a" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:10.502219 systemd[1]: Started cri-containerd-213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb.scope - libcontainer container 213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb. Sep 12 17:51:10.528637 containerd[1930]: time="2025-09-12T17:51:10.528613296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c89f6dbbd-kbbhh,Uid:eb7395a3-5fae-4a36-a604-8f98178712b4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb\"" Sep 12 17:51:10.688381 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3667998104.mount: Deactivated successfully. 
Sep 12 17:51:10.693410 containerd[1930]: time="2025-09-12T17:51:10.693363194Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:10.693630 containerd[1930]: time="2025-09-12T17:51:10.693586109Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 17:51:10.693960 containerd[1930]: time="2025-09-12T17:51:10.693919482Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:10.694902 containerd[1930]: time="2025-09-12T17:51:10.694855044Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:10.695317 containerd[1930]: time="2025-09-12T17:51:10.695276116Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 1.85390694s" Sep 12 17:51:10.695317 containerd[1930]: time="2025-09-12T17:51:10.695291387Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 17:51:10.695753 containerd[1930]: time="2025-09-12T17:51:10.695740443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:51:10.696231 containerd[1930]: time="2025-09-12T17:51:10.696216238Z" level=info msg="CreateContainer within sandbox 
\"de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:51:10.698866 containerd[1930]: time="2025-09-12T17:51:10.698829317Z" level=info msg="Container 5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:10.701885 containerd[1930]: time="2025-09-12T17:51:10.701871348Z" level=info msg="CreateContainer within sandbox \"de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa\"" Sep 12 17:51:10.702141 containerd[1930]: time="2025-09-12T17:51:10.702126113Z" level=info msg="StartContainer for \"5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa\"" Sep 12 17:51:10.702672 containerd[1930]: time="2025-09-12T17:51:10.702630051Z" level=info msg="connecting to shim 5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa" address="unix:///run/containerd/s/1721f654edc6288e405c72f3773d9043934d52e75234d79544661a6ee58522c7" protocol=ttrpc version=3 Sep 12 17:51:10.721384 systemd[1]: Started cri-containerd-5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa.scope - libcontainer container 5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa. 
Sep 12 17:51:10.767354 containerd[1930]: time="2025-09-12T17:51:10.767328885Z" level=info msg="StartContainer for \"5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa\" returns successfully" Sep 12 17:51:11.483549 kubelet[3295]: I0912 17:51:11.483501 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7bd5fbbb6-c5gkf" podStartSLOduration=1.791598203 podStartE2EDuration="5.48348535s" podCreationTimestamp="2025-09-12 17:51:06 +0000 UTC" firstStartedPulling="2025-09-12 17:51:07.003798152 +0000 UTC m=+31.749650539" lastFinishedPulling="2025-09-12 17:51:10.6956853 +0000 UTC m=+35.441537686" observedRunningTime="2025-09-12 17:51:11.48289675 +0000 UTC m=+36.228749137" watchObservedRunningTime="2025-09-12 17:51:11.48348535 +0000 UTC m=+36.229337734" Sep 12 17:51:11.999395 systemd-networkd[1838]: cali8424db918f2: Gained IPv6LL Sep 12 17:51:13.214841 containerd[1930]: time="2025-09-12T17:51:13.214816228Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:13.215059 containerd[1930]: time="2025-09-12T17:51:13.215000300Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 17:51:13.215393 containerd[1930]: time="2025-09-12T17:51:13.215382724Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:13.216263 containerd[1930]: time="2025-09-12T17:51:13.216226319Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:13.216653 containerd[1930]: time="2025-09-12T17:51:13.216620161Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.520865751s" Sep 12 17:51:13.216653 containerd[1930]: time="2025-09-12T17:51:13.216633484Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:51:13.217530 containerd[1930]: time="2025-09-12T17:51:13.217519103Z" level=info msg="CreateContainer within sandbox \"213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:51:13.220019 containerd[1930]: time="2025-09-12T17:51:13.219984841Z" level=info msg="Container f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:13.223153 containerd[1930]: time="2025-09-12T17:51:13.223134661Z" level=info msg="CreateContainer within sandbox \"213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857\"" Sep 12 17:51:13.223409 containerd[1930]: time="2025-09-12T17:51:13.223397946Z" level=info msg="StartContainer for \"f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857\"" Sep 12 17:51:13.223917 containerd[1930]: time="2025-09-12T17:51:13.223906655Z" level=info msg="connecting to shim f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857" address="unix:///run/containerd/s/37e97b9f8b2a0cf224eecbc39f3ea0bc47af8216f4cb4898b01ec3f2dfa0e91a" protocol=ttrpc version=3 Sep 12 17:51:13.242253 systemd[1]: Started 
cri-containerd-f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857.scope - libcontainer container f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857. Sep 12 17:51:13.274543 containerd[1930]: time="2025-09-12T17:51:13.274519439Z" level=info msg="StartContainer for \"f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857\" returns successfully" Sep 12 17:51:13.337969 containerd[1930]: time="2025-09-12T17:51:13.337945236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-44h8j,Uid:0983e9df-c3e1-467a-baa8-004bbe511d69,Namespace:kube-system,Attempt:0,}" Sep 12 17:51:13.338040 containerd[1930]: time="2025-09-12T17:51:13.337946163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-bmktx,Uid:a38ef781-aaef-47e2-89f1-6f4af63895e3,Namespace:calico-system,Attempt:0,}" Sep 12 17:51:13.390221 systemd-networkd[1838]: califaa73294ac4: Link UP Sep 12 17:51:13.390396 systemd-networkd[1838]: califaa73294ac4: Gained carrier Sep 12 17:51:13.397153 containerd[1930]: 2025-09-12 17:51:13.356 [INFO][5374] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0 coredns-7c65d6cfc9- kube-system 0983e9df-c3e1-467a-baa8-004bbe511d69 781 0 2025-09-12 17:50:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.1.0-a-b1d4eb1a76 coredns-7c65d6cfc9-44h8j eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califaa73294ac4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Namespace="kube-system" Pod="coredns-7c65d6cfc9-44h8j" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-" Sep 12 17:51:13.397153 
containerd[1930]: 2025-09-12 17:51:13.356 [INFO][5374] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Namespace="kube-system" Pod="coredns-7c65d6cfc9-44h8j" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" Sep 12 17:51:13.397153 containerd[1930]: 2025-09-12 17:51:13.368 [INFO][5423] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" HandleID="k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.368 [INFO][5423] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" HandleID="k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00026f940), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.1.0-a-b1d4eb1a76", "pod":"coredns-7c65d6cfc9-44h8j", "timestamp":"2025-09-12 17:51:13.368035956 +0000 UTC"}, Hostname:"ci-4426.1.0-a-b1d4eb1a76", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.368 [INFO][5423] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.368 [INFO][5423] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.368 [INFO][5423] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-b1d4eb1a76' Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.373 [INFO][5423] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.376 [INFO][5423] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.379 [INFO][5423] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.380 [INFO][5423] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397287 containerd[1930]: 2025-09-12 17:51:13.381 [INFO][5423] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397424 containerd[1930]: 2025-09-12 17:51:13.381 [INFO][5423] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397424 containerd[1930]: 2025-09-12 17:51:13.382 [INFO][5423] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209 Sep 12 17:51:13.397424 containerd[1930]: 2025-09-12 17:51:13.385 [INFO][5423] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397424 containerd[1930]: 2025-09-12 17:51:13.388 [INFO][5423] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.195/26] block=192.168.61.192/26 handle="k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397424 containerd[1930]: 2025-09-12 17:51:13.388 [INFO][5423] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.195/26] handle="k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.397424 containerd[1930]: 2025-09-12 17:51:13.388 [INFO][5423] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:51:13.397424 containerd[1930]: 2025-09-12 17:51:13.388 [INFO][5423] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.195/26] IPv6=[] ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" HandleID="k8s-pod-network.d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" Sep 12 17:51:13.397522 containerd[1930]: 2025-09-12 17:51:13.389 [INFO][5374] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Namespace="kube-system" Pod="coredns-7c65d6cfc9-44h8j" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0983e9df-c3e1-467a-baa8-004bbe511d69", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"", Pod:"coredns-7c65d6cfc9-44h8j", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califaa73294ac4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:13.397522 containerd[1930]: 2025-09-12 17:51:13.389 [INFO][5374] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.195/32] ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Namespace="kube-system" Pod="coredns-7c65d6cfc9-44h8j" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" Sep 12 17:51:13.397522 containerd[1930]: 2025-09-12 17:51:13.389 [INFO][5374] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califaa73294ac4 ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Namespace="kube-system" Pod="coredns-7c65d6cfc9-44h8j" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" Sep 12 17:51:13.397522 containerd[1930]: 2025-09-12 17:51:13.390 [INFO][5374] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Namespace="kube-system" Pod="coredns-7c65d6cfc9-44h8j" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" Sep 12 17:51:13.397522 containerd[1930]: 2025-09-12 17:51:13.390 [INFO][5374] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Namespace="kube-system" Pod="coredns-7c65d6cfc9-44h8j" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0983e9df-c3e1-467a-baa8-004bbe511d69", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209", Pod:"coredns-7c65d6cfc9-44h8j", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califaa73294ac4", 
MAC:"ba:18:2d:6a:0d:c0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:13.397522 containerd[1930]: 2025-09-12 17:51:13.395 [INFO][5374] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" Namespace="kube-system" Pod="coredns-7c65d6cfc9-44h8j" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--44h8j-eth0" Sep 12 17:51:13.407821 containerd[1930]: time="2025-09-12T17:51:13.407796300Z" level=info msg="connecting to shim d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209" address="unix:///run/containerd/s/34326d3cc72da9e26eb4dcc1198f54a95d15f9acfa57ce6a39acf3d57d2981ed" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:13.429268 systemd[1]: Started cri-containerd-d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209.scope - libcontainer container d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209. 
Sep 12 17:51:13.454846 containerd[1930]: time="2025-09-12T17:51:13.454791930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-44h8j,Uid:0983e9df-c3e1-467a-baa8-004bbe511d69,Namespace:kube-system,Attempt:0,} returns sandbox id \"d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209\"" Sep 12 17:51:13.455894 containerd[1930]: time="2025-09-12T17:51:13.455858632Z" level=info msg="CreateContainer within sandbox \"d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:51:13.459023 containerd[1930]: time="2025-09-12T17:51:13.458980965Z" level=info msg="Container b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:13.461106 containerd[1930]: time="2025-09-12T17:51:13.461065277Z" level=info msg="CreateContainer within sandbox \"d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835\"" Sep 12 17:51:13.461321 containerd[1930]: time="2025-09-12T17:51:13.461306693Z" level=info msg="StartContainer for \"b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835\"" Sep 12 17:51:13.461714 containerd[1930]: time="2025-09-12T17:51:13.461700210Z" level=info msg="connecting to shim b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835" address="unix:///run/containerd/s/34326d3cc72da9e26eb4dcc1198f54a95d15f9acfa57ce6a39acf3d57d2981ed" protocol=ttrpc version=3 Sep 12 17:51:13.478370 systemd[1]: Started cri-containerd-b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835.scope - libcontainer container b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835. 
Sep 12 17:51:13.478673 kubelet[3295]: I0912 17:51:13.478643 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7c89f6dbbd-kbbhh" podStartSLOduration=21.790820825 podStartE2EDuration="24.478630802s" podCreationTimestamp="2025-09-12 17:50:49 +0000 UTC" firstStartedPulling="2025-09-12 17:51:10.529195919 +0000 UTC m=+35.275048306" lastFinishedPulling="2025-09-12 17:51:13.217005897 +0000 UTC m=+37.962858283" observedRunningTime="2025-09-12 17:51:13.478353512 +0000 UTC m=+38.224205897" watchObservedRunningTime="2025-09-12 17:51:13.478630802 +0000 UTC m=+38.224483185" Sep 12 17:51:13.490591 systemd-networkd[1838]: calic2953b4ad1c: Link UP Sep 12 17:51:13.490763 systemd-networkd[1838]: calic2953b4ad1c: Gained carrier Sep 12 17:51:13.493189 containerd[1930]: time="2025-09-12T17:51:13.493136612Z" level=info msg="StartContainer for \"b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835\" returns successfully" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.355 [INFO][5375] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0 goldmane-7988f88666- calico-system a38ef781-aaef-47e2-89f1-6f4af63895e3 786 0 2025-09-12 17:50:50 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426.1.0-a-b1d4eb1a76 goldmane-7988f88666-bmktx eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic2953b4ad1c [] [] }} ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Namespace="calico-system" Pod="goldmane-7988f88666-bmktx" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.355 
[INFO][5375] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Namespace="calico-system" Pod="goldmane-7988f88666-bmktx" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.368 [INFO][5421] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" HandleID="k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.368 [INFO][5421] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" HandleID="k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00026f0d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-b1d4eb1a76", "pod":"goldmane-7988f88666-bmktx", "timestamp":"2025-09-12 17:51:13.368205638 +0000 UTC"}, Hostname:"ci-4426.1.0-a-b1d4eb1a76", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.368 [INFO][5421] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.388 [INFO][5421] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.388 [INFO][5421] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-b1d4eb1a76' Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.474 [INFO][5421] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.477 [INFO][5421] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.479 [INFO][5421] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.481 [INFO][5421] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.482 [INFO][5421] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.482 [INFO][5421] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.483 [INFO][5421] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0 Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.485 [INFO][5421] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.487 [INFO][5421] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.196/26] block=192.168.61.192/26 handle="k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.488 [INFO][5421] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.196/26] handle="k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.488 [INFO][5421] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:51:13.498409 containerd[1930]: 2025-09-12 17:51:13.488 [INFO][5421] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.196/26] IPv6=[] ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" HandleID="k8s-pod-network.62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" Sep 12 17:51:13.498772 containerd[1930]: 2025-09-12 17:51:13.489 [INFO][5375] cni-plugin/k8s.go 418: Populated endpoint ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Namespace="calico-system" Pod="goldmane-7988f88666-bmktx" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"a38ef781-aaef-47e2-89f1-6f4af63895e3", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"", Pod:"goldmane-7988f88666-bmktx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic2953b4ad1c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:13.498772 containerd[1930]: 2025-09-12 17:51:13.489 [INFO][5375] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.196/32] ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Namespace="calico-system" Pod="goldmane-7988f88666-bmktx" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" Sep 12 17:51:13.498772 containerd[1930]: 2025-09-12 17:51:13.489 [INFO][5375] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic2953b4ad1c ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Namespace="calico-system" Pod="goldmane-7988f88666-bmktx" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" Sep 12 17:51:13.498772 containerd[1930]: 2025-09-12 17:51:13.490 [INFO][5375] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Namespace="calico-system" Pod="goldmane-7988f88666-bmktx" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" Sep 12 17:51:13.498772 containerd[1930]: 2025-09-12 17:51:13.491 [INFO][5375] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Namespace="calico-system" Pod="goldmane-7988f88666-bmktx" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"a38ef781-aaef-47e2-89f1-6f4af63895e3", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0", Pod:"goldmane-7988f88666-bmktx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic2953b4ad1c", MAC:"0a:bc:96:45:73:b5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:13.498772 containerd[1930]: 2025-09-12 17:51:13.496 [INFO][5375] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" Namespace="calico-system" Pod="goldmane-7988f88666-bmktx" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-goldmane--7988f88666--bmktx-eth0" Sep 12 17:51:13.506340 containerd[1930]: time="2025-09-12T17:51:13.506312198Z" level=info msg="connecting to shim 62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0" address="unix:///run/containerd/s/a14b45fe9451eb59ce24d352d32a9a79b4c0bce81f7339996d9c329fd805ea2d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:13.533252 systemd[1]: Started cri-containerd-62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0.scope - libcontainer container 62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0. Sep 12 17:51:13.558811 containerd[1930]: time="2025-09-12T17:51:13.558789027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-bmktx,Uid:a38ef781-aaef-47e2-89f1-6f4af63895e3,Namespace:calico-system,Attempt:0,} returns sandbox id \"62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0\"" Sep 12 17:51:13.559490 containerd[1930]: time="2025-09-12T17:51:13.559477855Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:51:14.339222 containerd[1930]: time="2025-09-12T17:51:14.339134644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bd8b64d9c-m7tcm,Uid:95881986-b9ee-4892-8fba-1273c4688890,Namespace:calico-system,Attempt:0,}" Sep 12 17:51:14.339921 containerd[1930]: time="2025-09-12T17:51:14.339361310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c89f6dbbd-fcrhd,Uid:e36e7f2e-9196-4fee-b58b-1dd28a04759a,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:51:14.339921 containerd[1930]: time="2025-09-12T17:51:14.339358998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-n6k4b,Uid:ce1692a5-2537-46fd-a646-e7a446075a37,Namespace:kube-system,Attempt:0,}" Sep 12 
17:51:14.393831 systemd-networkd[1838]: cali3fb2d48efe7: Link UP Sep 12 17:51:14.393975 systemd-networkd[1838]: cali3fb2d48efe7: Gained carrier Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.359 [INFO][5627] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0 coredns-7c65d6cfc9- kube-system ce1692a5-2537-46fd-a646-e7a446075a37 782 0 2025-09-12 17:50:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.1.0-a-b1d4eb1a76 coredns-7c65d6cfc9-n6k4b eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3fb2d48efe7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-n6k4b" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.359 [INFO][5627] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-n6k4b" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.371 [INFO][5691] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" HandleID="k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.371 [INFO][5691] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" HandleID="k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004ff30), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.1.0-a-b1d4eb1a76", "pod":"coredns-7c65d6cfc9-n6k4b", "timestamp":"2025-09-12 17:51:14.371756998 +0000 UTC"}, Hostname:"ci-4426.1.0-a-b1d4eb1a76", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.372 [INFO][5691] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.372 [INFO][5691] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.372 [INFO][5691] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-b1d4eb1a76' Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.376 [INFO][5691] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.379 [INFO][5691] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.382 [INFO][5691] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.383 [INFO][5691] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.385 [INFO][5691] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.385 [INFO][5691] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.386 [INFO][5691] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07 Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.388 [INFO][5691] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.392 [INFO][5691] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.197/26] block=192.168.61.192/26 handle="k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.392 [INFO][5691] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.197/26] handle="k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.392 [INFO][5691] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:51:14.399404 containerd[1930]: 2025-09-12 17:51:14.392 [INFO][5691] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.197/26] IPv6=[] ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" HandleID="k8s-pod-network.282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" Sep 12 17:51:14.399779 containerd[1930]: 2025-09-12 17:51:14.392 [INFO][5627] cni-plugin/k8s.go 418: Populated endpoint ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-n6k4b" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"ce1692a5-2537-46fd-a646-e7a446075a37", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"", Pod:"coredns-7c65d6cfc9-n6k4b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3fb2d48efe7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:14.399779 containerd[1930]: 2025-09-12 17:51:14.393 [INFO][5627] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.197/32] ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-n6k4b" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" Sep 12 17:51:14.399779 containerd[1930]: 2025-09-12 17:51:14.393 [INFO][5627] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3fb2d48efe7 ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-n6k4b" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" Sep 12 17:51:14.399779 containerd[1930]: 2025-09-12 17:51:14.394 [INFO][5627] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-n6k4b" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" Sep 12 17:51:14.399779 containerd[1930]: 2025-09-12 17:51:14.394 [INFO][5627] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-n6k4b" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"ce1692a5-2537-46fd-a646-e7a446075a37", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07", Pod:"coredns-7c65d6cfc9-n6k4b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3fb2d48efe7", 
MAC:"be:3f:98:d5:3f:08", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:14.399779 containerd[1930]: 2025-09-12 17:51:14.398 [INFO][5627] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-n6k4b" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-coredns--7c65d6cfc9--n6k4b-eth0" Sep 12 17:51:14.408008 containerd[1930]: time="2025-09-12T17:51:14.407983781Z" level=info msg="connecting to shim 282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07" address="unix:///run/containerd/s/73cb71b87c7ec554a433ea2946b0a1949375de9368852f9c668fe93ef9734818" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:14.427525 systemd[1]: Started cri-containerd-282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07.scope - libcontainer container 282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07. 
Sep 12 17:51:14.470949 containerd[1930]: time="2025-09-12T17:51:14.470927006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-n6k4b,Uid:ce1692a5-2537-46fd-a646-e7a446075a37,Namespace:kube-system,Attempt:0,} returns sandbox id \"282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07\"" Sep 12 17:51:14.472005 containerd[1930]: time="2025-09-12T17:51:14.471990491Z" level=info msg="CreateContainer within sandbox \"282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:51:14.475034 containerd[1930]: time="2025-09-12T17:51:14.475019520Z" level=info msg="Container 526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:14.477129 containerd[1930]: time="2025-09-12T17:51:14.477107269Z" level=info msg="CreateContainer within sandbox \"282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583\"" Sep 12 17:51:14.477327 containerd[1930]: time="2025-09-12T17:51:14.477315807Z" level=info msg="StartContainer for \"526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583\"" Sep 12 17:51:14.477764 containerd[1930]: time="2025-09-12T17:51:14.477752315Z" level=info msg="connecting to shim 526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583" address="unix:///run/containerd/s/73cb71b87c7ec554a433ea2946b0a1949375de9368852f9c668fe93ef9734818" protocol=ttrpc version=3 Sep 12 17:51:14.480813 kubelet[3295]: I0912 17:51:14.480766 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-44h8j" podStartSLOduration=33.480743274 podStartE2EDuration="33.480743274s" podCreationTimestamp="2025-09-12 17:50:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:51:14.48044771 +0000 UTC m=+39.226300098" watchObservedRunningTime="2025-09-12 17:51:14.480743274 +0000 UTC m=+39.226595659" Sep 12 17:51:14.501320 systemd[1]: Started cri-containerd-526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583.scope - libcontainer container 526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583. Sep 12 17:51:14.505297 systemd-networkd[1838]: caliab066779ad5: Link UP Sep 12 17:51:14.505468 systemd-networkd[1838]: caliab066779ad5: Gained carrier Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.359 [INFO][5615] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0 calico-kube-controllers-5bd8b64d9c- calico-system 95881986-b9ee-4892-8fba-1273c4688890 784 0 2025-09-12 17:50:51 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5bd8b64d9c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426.1.0-a-b1d4eb1a76 calico-kube-controllers-5bd8b64d9c-m7tcm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] caliab066779ad5 [] [] }} ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Namespace="calico-system" Pod="calico-kube-controllers-5bd8b64d9c-m7tcm" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.359 [INFO][5615] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Namespace="calico-system" Pod="calico-kube-controllers-5bd8b64d9c-m7tcm" 
WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.372 [INFO][5689] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" HandleID="k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.372 [INFO][5689] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" HandleID="k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000692020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-b1d4eb1a76", "pod":"calico-kube-controllers-5bd8b64d9c-m7tcm", "timestamp":"2025-09-12 17:51:14.37213061 +0000 UTC"}, Hostname:"ci-4426.1.0-a-b1d4eb1a76", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.372 [INFO][5689] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.392 [INFO][5689] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.392 [INFO][5689] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-b1d4eb1a76' Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.476 [INFO][5689] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.479 [INFO][5689] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.482 [INFO][5689] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.483 [INFO][5689] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.484 [INFO][5689] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.484 [INFO][5689] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.485 [INFO][5689] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.487 [INFO][5689] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.502 [INFO][5689] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.198/26] block=192.168.61.192/26 handle="k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.502 [INFO][5689] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.198/26] handle="k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.503 [INFO][5689] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:51:14.511294 containerd[1930]: 2025-09-12 17:51:14.503 [INFO][5689] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.198/26] IPv6=[] ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" HandleID="k8s-pod-network.788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" Sep 12 17:51:14.511687 containerd[1930]: 2025-09-12 17:51:14.503 [INFO][5615] cni-plugin/k8s.go 418: Populated endpoint ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Namespace="calico-system" Pod="calico-kube-controllers-5bd8b64d9c-m7tcm" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0", GenerateName:"calico-kube-controllers-5bd8b64d9c-", Namespace:"calico-system", SelfLink:"", UID:"95881986-b9ee-4892-8fba-1273c4688890", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bd8b64d9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"", Pod:"calico-kube-controllers-5bd8b64d9c-m7tcm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliab066779ad5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:14.511687 containerd[1930]: 2025-09-12 17:51:14.504 [INFO][5615] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.198/32] ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Namespace="calico-system" Pod="calico-kube-controllers-5bd8b64d9c-m7tcm" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" Sep 12 17:51:14.511687 containerd[1930]: 2025-09-12 17:51:14.504 [INFO][5615] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab066779ad5 ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Namespace="calico-system" Pod="calico-kube-controllers-5bd8b64d9c-m7tcm" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" Sep 12 17:51:14.511687 containerd[1930]: 2025-09-12 17:51:14.505 [INFO][5615] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Namespace="calico-system" Pod="calico-kube-controllers-5bd8b64d9c-m7tcm" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" Sep 12 17:51:14.511687 containerd[1930]: 2025-09-12 17:51:14.505 [INFO][5615] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Namespace="calico-system" Pod="calico-kube-controllers-5bd8b64d9c-m7tcm" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0", GenerateName:"calico-kube-controllers-5bd8b64d9c-", Namespace:"calico-system", SelfLink:"", UID:"95881986-b9ee-4892-8fba-1273c4688890", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bd8b64d9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f", Pod:"calico-kube-controllers-5bd8b64d9c-m7tcm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.198/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliab066779ad5", MAC:"ba:8a:be:6b:81:9b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:14.511687 containerd[1930]: 2025-09-12 17:51:14.510 [INFO][5615] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" Namespace="calico-system" Pod="calico-kube-controllers-5bd8b64d9c-m7tcm" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--kube--controllers--5bd8b64d9c--m7tcm-eth0" Sep 12 17:51:14.516393 containerd[1930]: time="2025-09-12T17:51:14.516366050Z" level=info msg="StartContainer for \"526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583\" returns successfully" Sep 12 17:51:14.520121 containerd[1930]: time="2025-09-12T17:51:14.520075073Z" level=info msg="connecting to shim 788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f" address="unix:///run/containerd/s/a9e8f8392942f9b11041578ea1de576edb7408effca31d447bd079b5772bcaeb" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:14.542319 systemd[1]: Started cri-containerd-788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f.scope - libcontainer container 788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f. 
Sep 12 17:51:14.567997 containerd[1930]: time="2025-09-12T17:51:14.567946175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bd8b64d9c-m7tcm,Uid:95881986-b9ee-4892-8fba-1273c4688890,Namespace:calico-system,Attempt:0,} returns sandbox id \"788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f\"" Sep 12 17:51:14.592487 systemd-networkd[1838]: califec778d203e: Link UP Sep 12 17:51:14.592632 systemd-networkd[1838]: califec778d203e: Gained carrier Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.359 [INFO][5620] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0 calico-apiserver-7c89f6dbbd- calico-apiserver e36e7f2e-9196-4fee-b58b-1dd28a04759a 777 0 2025-09-12 17:50:49 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7c89f6dbbd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.1.0-a-b1d4eb1a76 calico-apiserver-7c89f6dbbd-fcrhd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califec778d203e [] [] }} ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-fcrhd" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.359 [INFO][5620] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-fcrhd" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.371 
[INFO][5687] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" HandleID="k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.371 [INFO][5687] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" HandleID="k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f7d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.1.0-a-b1d4eb1a76", "pod":"calico-apiserver-7c89f6dbbd-fcrhd", "timestamp":"2025-09-12 17:51:14.371816778 +0000 UTC"}, Hostname:"ci-4426.1.0-a-b1d4eb1a76", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.372 [INFO][5687] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.503 [INFO][5687] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.503 [INFO][5687] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-b1d4eb1a76' Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.576 [INFO][5687] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.580 [INFO][5687] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.582 [INFO][5687] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.583 [INFO][5687] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.584 [INFO][5687] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.584 [INFO][5687] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.585 [INFO][5687] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19 Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.587 [INFO][5687] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.590 [INFO][5687] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.199/26] block=192.168.61.192/26 handle="k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.590 [INFO][5687] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.199/26] handle="k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.590 [INFO][5687] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:51:14.598494 containerd[1930]: 2025-09-12 17:51:14.590 [INFO][5687] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.199/26] IPv6=[] ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" HandleID="k8s-pod-network.5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" Sep 12 17:51:14.598862 containerd[1930]: 2025-09-12 17:51:14.591 [INFO][5620] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-fcrhd" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0", GenerateName:"calico-apiserver-7c89f6dbbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"e36e7f2e-9196-4fee-b58b-1dd28a04759a", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c89f6dbbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"", Pod:"calico-apiserver-7c89f6dbbd-fcrhd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califec778d203e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:14.598862 containerd[1930]: 2025-09-12 17:51:14.591 [INFO][5620] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.199/32] ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-fcrhd" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" Sep 12 17:51:14.598862 containerd[1930]: 2025-09-12 17:51:14.591 [INFO][5620] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califec778d203e ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-fcrhd" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" Sep 12 17:51:14.598862 containerd[1930]: 2025-09-12 17:51:14.592 [INFO][5620] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Namespace="calico-apiserver" 
Pod="calico-apiserver-7c89f6dbbd-fcrhd" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" Sep 12 17:51:14.598862 containerd[1930]: 2025-09-12 17:51:14.592 [INFO][5620] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-fcrhd" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0", GenerateName:"calico-apiserver-7c89f6dbbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"e36e7f2e-9196-4fee-b58b-1dd28a04759a", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c89f6dbbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19", Pod:"calico-apiserver-7c89f6dbbd-fcrhd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"califec778d203e", MAC:"42:54:87:58:01:5b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:14.598862 containerd[1930]: 2025-09-12 17:51:14.597 [INFO][5620] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" Namespace="calico-apiserver" Pod="calico-apiserver-7c89f6dbbd-fcrhd" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-calico--apiserver--7c89f6dbbd--fcrhd-eth0" Sep 12 17:51:14.606178 containerd[1930]: time="2025-09-12T17:51:14.606153919Z" level=info msg="connecting to shim 5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19" address="unix:///run/containerd/s/ad301f58352a20eae35b8a4bd1c7c8d0a5ec235168875231a7b155db161decdd" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:14.632540 systemd[1]: Started cri-containerd-5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19.scope - libcontainer container 5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19. 
Sep 12 17:51:14.704679 containerd[1930]: time="2025-09-12T17:51:14.704654696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c89f6dbbd-fcrhd,Uid:e36e7f2e-9196-4fee-b58b-1dd28a04759a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19\"" Sep 12 17:51:14.705685 containerd[1930]: time="2025-09-12T17:51:14.705670241Z" level=info msg="CreateContainer within sandbox \"5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:51:14.708366 containerd[1930]: time="2025-09-12T17:51:14.708353553Z" level=info msg="Container 3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:14.710892 containerd[1930]: time="2025-09-12T17:51:14.710880644Z" level=info msg="CreateContainer within sandbox \"5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360\"" Sep 12 17:51:14.711061 containerd[1930]: time="2025-09-12T17:51:14.711050410Z" level=info msg="StartContainer for \"3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360\"" Sep 12 17:51:14.711649 containerd[1930]: time="2025-09-12T17:51:14.711608092Z" level=info msg="connecting to shim 3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360" address="unix:///run/containerd/s/ad301f58352a20eae35b8a4bd1c7c8d0a5ec235168875231a7b155db161decdd" protocol=ttrpc version=3 Sep 12 17:51:14.731179 systemd[1]: Started cri-containerd-3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360.scope - libcontainer container 3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360. 
Sep 12 17:51:14.758722 containerd[1930]: time="2025-09-12T17:51:14.758694383Z" level=info msg="StartContainer for \"3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360\" returns successfully" Sep 12 17:51:15.071266 systemd-networkd[1838]: calic2953b4ad1c: Gained IPv6LL Sep 12 17:51:15.327193 systemd-networkd[1838]: califaa73294ac4: Gained IPv6LL Sep 12 17:51:15.339074 containerd[1930]: time="2025-09-12T17:51:15.339049855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8vfbh,Uid:86dbef57-8f65-4cc3-ae59-c02f1dffe403,Namespace:calico-system,Attempt:0,}" Sep 12 17:51:15.390269 systemd-networkd[1838]: califba711e8d88: Link UP Sep 12 17:51:15.390469 systemd-networkd[1838]: califba711e8d88: Gained carrier Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.357 [INFO][5997] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0 csi-node-driver- calico-system 86dbef57-8f65-4cc3-ae59-c02f1dffe403 670 0 2025-09-12 17:50:51 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426.1.0-a-b1d4eb1a76 csi-node-driver-8vfbh eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] califba711e8d88 [] [] }} ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Namespace="calico-system" Pod="csi-node-driver-8vfbh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.357 [INFO][5997] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" 
Namespace="calico-system" Pod="csi-node-driver-8vfbh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.373 [INFO][6020] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" HandleID="k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.373 [INFO][6020] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" HandleID="k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00078c130), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-b1d4eb1a76", "pod":"csi-node-driver-8vfbh", "timestamp":"2025-09-12 17:51:15.373644964 +0000 UTC"}, Hostname:"ci-4426.1.0-a-b1d4eb1a76", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.373 [INFO][6020] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.373 [INFO][6020] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.373 [INFO][6020] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-b1d4eb1a76' Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.377 [INFO][6020] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.379 [INFO][6020] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.380 [INFO][6020] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.381 [INFO][6020] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.382 [INFO][6020] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.382 [INFO][6020] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.383 [INFO][6020] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.385 [INFO][6020] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.388 [INFO][6020] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.200/26] block=192.168.61.192/26 handle="k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.388 [INFO][6020] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.200/26] handle="k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" host="ci-4426.1.0-a-b1d4eb1a76" Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.388 [INFO][6020] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:51:15.396942 containerd[1930]: 2025-09-12 17:51:15.388 [INFO][6020] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.200/26] IPv6=[] ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" HandleID="k8s-pod-network.cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Workload="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" Sep 12 17:51:15.397644 containerd[1930]: 2025-09-12 17:51:15.389 [INFO][5997] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Namespace="calico-system" Pod="csi-node-driver-8vfbh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"86dbef57-8f65-4cc3-ae59-c02f1dffe403", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"", Pod:"csi-node-driver-8vfbh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califba711e8d88", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:15.397644 containerd[1930]: 2025-09-12 17:51:15.389 [INFO][5997] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.200/32] ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Namespace="calico-system" Pod="csi-node-driver-8vfbh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" Sep 12 17:51:15.397644 containerd[1930]: 2025-09-12 17:51:15.389 [INFO][5997] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califba711e8d88 ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Namespace="calico-system" Pod="csi-node-driver-8vfbh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" Sep 12 17:51:15.397644 containerd[1930]: 2025-09-12 17:51:15.390 [INFO][5997] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Namespace="calico-system" Pod="csi-node-driver-8vfbh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" Sep 12 17:51:15.397644 
containerd[1930]: 2025-09-12 17:51:15.390 [INFO][5997] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Namespace="calico-system" Pod="csi-node-driver-8vfbh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"86dbef57-8f65-4cc3-ae59-c02f1dffe403", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 50, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-b1d4eb1a76", ContainerID:"cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b", Pod:"csi-node-driver-8vfbh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califba711e8d88", MAC:"02:94:30:8c:b8:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:51:15.397644 containerd[1930]: 
2025-09-12 17:51:15.395 [INFO][5997] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" Namespace="calico-system" Pod="csi-node-driver-8vfbh" WorkloadEndpoint="ci--4426.1.0--a--b1d4eb1a76-k8s-csi--node--driver--8vfbh-eth0" Sep 12 17:51:15.405594 containerd[1930]: time="2025-09-12T17:51:15.405565787Z" level=info msg="connecting to shim cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b" address="unix:///run/containerd/s/e841902dfa23f010eccbe7bd9b33619b60b4462efc5ee3530b1b304e97dc8136" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:15.423319 systemd[1]: Started cri-containerd-cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b.scope - libcontainer container cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b. Sep 12 17:51:15.437031 containerd[1930]: time="2025-09-12T17:51:15.437009061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8vfbh,Uid:86dbef57-8f65-4cc3-ae59-c02f1dffe403,Namespace:calico-system,Attempt:0,} returns sandbox id \"cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b\"" Sep 12 17:51:15.484252 kubelet[3295]: I0912 17:51:15.484217 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7c89f6dbbd-fcrhd" podStartSLOduration=26.484206657 podStartE2EDuration="26.484206657s" podCreationTimestamp="2025-09-12 17:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:51:15.483874763 +0000 UTC m=+40.229727150" watchObservedRunningTime="2025-09-12 17:51:15.484206657 +0000 UTC m=+40.230059041" Sep 12 17:51:15.489728 kubelet[3295]: I0912 17:51:15.489692 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-n6k4b" podStartSLOduration=34.489680187 podStartE2EDuration="34.489680187s" 
podCreationTimestamp="2025-09-12 17:50:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:51:15.489431477 +0000 UTC m=+40.235283868" watchObservedRunningTime="2025-09-12 17:51:15.489680187 +0000 UTC m=+40.235532570" Sep 12 17:51:15.776195 systemd-networkd[1838]: cali3fb2d48efe7: Gained IPv6LL Sep 12 17:51:15.791088 containerd[1930]: time="2025-09-12T17:51:15.791040001Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:15.791319 containerd[1930]: time="2025-09-12T17:51:15.791275862Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 17:51:15.791641 containerd[1930]: time="2025-09-12T17:51:15.791598533Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:15.792611 containerd[1930]: time="2025-09-12T17:51:15.792570065Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:15.793004 containerd[1930]: time="2025-09-12T17:51:15.792962765Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 2.233466255s" Sep 12 17:51:15.793004 containerd[1930]: time="2025-09-12T17:51:15.792980049Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference 
\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 17:51:15.793515 containerd[1930]: time="2025-09-12T17:51:15.793475268Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 17:51:15.793998 containerd[1930]: time="2025-09-12T17:51:15.793958246Z" level=info msg="CreateContainer within sandbox \"62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 17:51:15.796705 containerd[1930]: time="2025-09-12T17:51:15.796691883Z" level=info msg="Container cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:15.799308 containerd[1930]: time="2025-09-12T17:51:15.799268244Z" level=info msg="CreateContainer within sandbox \"62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\"" Sep 12 17:51:15.799517 containerd[1930]: time="2025-09-12T17:51:15.799504404Z" level=info msg="StartContainer for \"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\"" Sep 12 17:51:15.800079 containerd[1930]: time="2025-09-12T17:51:15.800066658Z" level=info msg="connecting to shim cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f" address="unix:///run/containerd/s/a14b45fe9451eb59ce24d352d32a9a79b4c0bce81f7339996d9c329fd805ea2d" protocol=ttrpc version=3 Sep 12 17:51:15.821229 systemd[1]: Started cri-containerd-cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f.scope - libcontainer container cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f. 
Sep 12 17:51:15.851152 containerd[1930]: time="2025-09-12T17:51:15.851129292Z" level=info msg="StartContainer for \"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" returns successfully" Sep 12 17:51:15.903446 systemd-networkd[1838]: califec778d203e: Gained IPv6LL Sep 12 17:51:15.967322 systemd-networkd[1838]: caliab066779ad5: Gained IPv6LL Sep 12 17:51:16.222792 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2692159249.mount: Deactivated successfully. Sep 12 17:51:16.485567 kubelet[3295]: I0912 17:51:16.485422 3295 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:51:16.506517 kubelet[3295]: I0912 17:51:16.506381 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-bmktx" podStartSLOduration=24.272290108 podStartE2EDuration="26.506331793s" podCreationTimestamp="2025-09-12 17:50:50 +0000 UTC" firstStartedPulling="2025-09-12 17:51:13.559357192 +0000 UTC m=+38.305209577" lastFinishedPulling="2025-09-12 17:51:15.793398873 +0000 UTC m=+40.539251262" observedRunningTime="2025-09-12 17:51:16.50513242 +0000 UTC m=+41.250985664" watchObservedRunningTime="2025-09-12 17:51:16.506331793 +0000 UTC m=+41.252184223" Sep 12 17:51:16.579378 containerd[1930]: time="2025-09-12T17:51:16.579320596Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"13e0db0198fdc0f0768c46be44e7963b89f181d85ac78feaa80182d9dbcf7000\" pid:6159 exited_at:{seconds:1757699476 nanos:578996759}" Sep 12 17:51:16.927261 systemd-networkd[1838]: califba711e8d88: Gained IPv6LL Sep 12 17:51:17.783267 containerd[1930]: time="2025-09-12T17:51:17.783210195Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:17.783484 containerd[1930]: time="2025-09-12T17:51:17.783427671Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 17:51:17.783788 containerd[1930]: time="2025-09-12T17:51:17.783748369Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:17.784654 containerd[1930]: time="2025-09-12T17:51:17.784610806Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:17.785030 containerd[1930]: time="2025-09-12T17:51:17.784988876Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 1.991499443s" Sep 12 17:51:17.785030 containerd[1930]: time="2025-09-12T17:51:17.785005364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 17:51:17.785501 containerd[1930]: time="2025-09-12T17:51:17.785455810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 17:51:17.789144 containerd[1930]: time="2025-09-12T17:51:17.789126422Z" level=info msg="CreateContainer within sandbox \"788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 17:51:17.791964 containerd[1930]: time="2025-09-12T17:51:17.791923905Z" level=info msg="Container 9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82: CDI devices from CRI 
Config.CDIDevices: []" Sep 12 17:51:17.794663 containerd[1930]: time="2025-09-12T17:51:17.794636623Z" level=info msg="CreateContainer within sandbox \"788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\"" Sep 12 17:51:17.794872 containerd[1930]: time="2025-09-12T17:51:17.794861421Z" level=info msg="StartContainer for \"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\"" Sep 12 17:51:17.795443 containerd[1930]: time="2025-09-12T17:51:17.795406687Z" level=info msg="connecting to shim 9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82" address="unix:///run/containerd/s/a9e8f8392942f9b11041578ea1de576edb7408effca31d447bd079b5772bcaeb" protocol=ttrpc version=3 Sep 12 17:51:17.811386 systemd[1]: Started cri-containerd-9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82.scope - libcontainer container 9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82. 
Sep 12 17:51:17.838834 containerd[1930]: time="2025-09-12T17:51:17.838812528Z" level=info msg="StartContainer for \"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" returns successfully" Sep 12 17:51:18.515197 kubelet[3295]: I0912 17:51:18.515052 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5bd8b64d9c-m7tcm" podStartSLOduration=24.298153815 podStartE2EDuration="27.515017861s" podCreationTimestamp="2025-09-12 17:50:51 +0000 UTC" firstStartedPulling="2025-09-12 17:51:14.56854423 +0000 UTC m=+39.314396615" lastFinishedPulling="2025-09-12 17:51:17.785408271 +0000 UTC m=+42.531260661" observedRunningTime="2025-09-12 17:51:18.51433511 +0000 UTC m=+43.260187555" watchObservedRunningTime="2025-09-12 17:51:18.515017861 +0000 UTC m=+43.260870292" Sep 12 17:51:19.093776 containerd[1930]: time="2025-09-12T17:51:19.093725712Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:19.094024 containerd[1930]: time="2025-09-12T17:51:19.093902683Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 17:51:19.094356 containerd[1930]: time="2025-09-12T17:51:19.094315809Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:19.095250 containerd[1930]: time="2025-09-12T17:51:19.095210630Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:19.095644 containerd[1930]: time="2025-09-12T17:51:19.095599121Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id 
\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.31013038s" Sep 12 17:51:19.095644 containerd[1930]: time="2025-09-12T17:51:19.095614693Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 17:51:19.097406 containerd[1930]: time="2025-09-12T17:51:19.097390805Z" level=info msg="CreateContainer within sandbox \"cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 17:51:19.101542 containerd[1930]: time="2025-09-12T17:51:19.101500747Z" level=info msg="Container ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:19.105123 containerd[1930]: time="2025-09-12T17:51:19.105078102Z" level=info msg="CreateContainer within sandbox \"cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834\"" Sep 12 17:51:19.105430 containerd[1930]: time="2025-09-12T17:51:19.105380385Z" level=info msg="StartContainer for \"ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834\"" Sep 12 17:51:19.106113 containerd[1930]: time="2025-09-12T17:51:19.106068987Z" level=info msg="connecting to shim ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834" address="unix:///run/containerd/s/e841902dfa23f010eccbe7bd9b33619b60b4462efc5ee3530b1b304e97dc8136" protocol=ttrpc version=3 Sep 12 17:51:19.128245 systemd[1]: Started cri-containerd-ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834.scope - libcontainer container 
ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834. Sep 12 17:51:19.164571 containerd[1930]: time="2025-09-12T17:51:19.164515128Z" level=info msg="StartContainer for \"ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834\" returns successfully" Sep 12 17:51:19.165234 containerd[1930]: time="2025-09-12T17:51:19.165217315Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 17:51:19.581239 containerd[1930]: time="2025-09-12T17:51:19.581216971Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"0548fd94a118de72ee1e387502675dd58a3ab5f3168d412f4420f94949ab6c06\" pid:6291 exited_at:{seconds:1757699479 nanos:581063864}" Sep 12 17:51:20.708038 containerd[1930]: time="2025-09-12T17:51:20.707984976Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:20.708353 containerd[1930]: time="2025-09-12T17:51:20.708237840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 17:51:20.708644 containerd[1930]: time="2025-09-12T17:51:20.708604858Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:20.709493 containerd[1930]: time="2025-09-12T17:51:20.709450718Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:20.709871 containerd[1930]: time="2025-09-12T17:51:20.709829248Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id 
\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.544591581s" Sep 12 17:51:20.709871 containerd[1930]: time="2025-09-12T17:51:20.709844295Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 17:51:20.710873 containerd[1930]: time="2025-09-12T17:51:20.710833902Z" level=info msg="CreateContainer within sandbox \"cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 17:51:20.713810 containerd[1930]: time="2025-09-12T17:51:20.713770588Z" level=info msg="Container 23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:20.717552 containerd[1930]: time="2025-09-12T17:51:20.717509116Z" level=info msg="CreateContainer within sandbox \"cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a\"" Sep 12 17:51:20.717842 containerd[1930]: time="2025-09-12T17:51:20.717797399Z" level=info msg="StartContainer for \"23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a\"" Sep 12 17:51:20.718623 containerd[1930]: time="2025-09-12T17:51:20.718611446Z" level=info msg="connecting to shim 23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a" address="unix:///run/containerd/s/e841902dfa23f010eccbe7bd9b33619b60b4462efc5ee3530b1b304e97dc8136" protocol=ttrpc version=3 Sep 12 17:51:20.736370 systemd[1]: Started 
cri-containerd-23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a.scope - libcontainer container 23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a. Sep 12 17:51:20.755883 containerd[1930]: time="2025-09-12T17:51:20.755860513Z" level=info msg="StartContainer for \"23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a\" returns successfully" Sep 12 17:51:21.391215 kubelet[3295]: I0912 17:51:21.391148 3295 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 17:51:21.391215 kubelet[3295]: I0912 17:51:21.391230 3295 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 17:51:25.607893 containerd[1930]: time="2025-09-12T17:51:25.607865400Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"bf5140635c5aab7acef3e6202f34ecf97894234628be0e21c054369553820dd9\" pid:6361 exited_at:{seconds:1757699485 nanos:607662396}" Sep 12 17:51:25.618145 kubelet[3295]: I0912 17:51:25.618109 3295 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-8vfbh" podStartSLOduration=29.345499784 podStartE2EDuration="34.61808943s" podCreationTimestamp="2025-09-12 17:50:51 +0000 UTC" firstStartedPulling="2025-09-12 17:51:15.437611883 +0000 UTC m=+40.183464269" lastFinishedPulling="2025-09-12 17:51:20.710201528 +0000 UTC m=+45.456053915" observedRunningTime="2025-09-12 17:51:21.535244973 +0000 UTC m=+46.281097424" watchObservedRunningTime="2025-09-12 17:51:25.61808943 +0000 UTC m=+50.363941814" Sep 12 17:51:30.344691 kubelet[3295]: I0912 17:51:30.344583 3295 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:51:34.417177 containerd[1930]: 
time="2025-09-12T17:51:34.417125546Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"c67b830afb8c2fb600e4a8fbd54438a6ff829fa5e81e5abf397c2a300122f83b\" pid:6406 exited_at:{seconds:1757699494 nanos:417015439}"
Sep 12 17:51:43.959971 containerd[1930]: time="2025-09-12T17:51:43.959946358Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"28bab93c257a3b4f31c21c778f24686a993259ab02a1adc8b862b7c9c831d476\" pid:6437 exited_at:{seconds:1757699503 nanos:959775121}"
Sep 12 17:51:55.574045 containerd[1930]: time="2025-09-12T17:51:55.574018707Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"cfdc726b9958916d9caa7a2e2c7c108aa1a9b7ffddaabd94917806e2f18f06ed\" pid:6477 exited_at:{seconds:1757699515 nanos:573802954}"
Sep 12 17:51:59.074395 containerd[1930]: time="2025-09-12T17:51:59.074364213Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"69ec827d302f5fc1310132aecc21b7217390e321743097fed17261ffa6f143e8\" pid:6511 exited_at:{seconds:1757699519 nanos:74200438}"
Sep 12 17:52:04.478660 containerd[1930]: time="2025-09-12T17:52:04.478635404Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"c900556a096a2cb3f25c29ec06f9421b73b472e1a5a1a2b1749525e4f2fe0e4c\" pid:6532 exited_at:{seconds:1757699524 nanos:478508448}"
Sep 12 17:52:05.386832 containerd[1930]: time="2025-09-12T17:52:05.386777892Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"621878362d41f3da34bf759d82371dd431bed8d26fe14b781dbbd1e4f0b4a799\" pid:6554 exited_at:{seconds:1757699525 nanos:386584538}"
Sep 12 17:52:13.950590 containerd[1930]: time="2025-09-12T17:52:13.950562328Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"218012b75b9cc6eb4c74574927b453f288dac46c26efd6cd3de320b22222c20c\" pid:6589 exited_at:{seconds:1757699533 nanos:950389575}"
Sep 12 17:52:25.630588 containerd[1930]: time="2025-09-12T17:52:25.630560695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"d052b26ab038c7b81f51f083077432925675d08fcdfeeabd2340836aa751e4f2\" pid:6622 exited_at:{seconds:1757699545 nanos:630287426}"
Sep 12 17:52:34.463965 containerd[1930]: time="2025-09-12T17:52:34.463932219Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"921f26b166456736c18f7079e0ee8f88433cb29265bc855183c540acf7ca26f1\" pid:6665 exited_at:{seconds:1757699554 nanos:463734554}"
Sep 12 17:52:43.921031 containerd[1930]: time="2025-09-12T17:52:43.921003746Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"60bc0f66b8863a920619b42126262cb35251edfa14bfedb1a265894601a0d1d8\" pid:6717 exited_at:{seconds:1757699563 nanos:920771622}"
Sep 12 17:52:55.620759 containerd[1930]: time="2025-09-12T17:52:55.620730367Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"377289b40ccd84a3f726d1791ae5abf79a035ca06af5a294f4a75ba793fcd936\" pid:6749 exited_at:{seconds:1757699575 nanos:620515059}"
Sep 12 17:52:59.048896 containerd[1930]: time="2025-09-12T17:52:59.048866788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"fc26ea22a5d0bbdebeeed690c6068dae6e96b3ae10ad72c80c9567075782f12e\" pid:6787 exited_at:{seconds:1757699579 nanos:48436480}"
Sep 12 17:53:04.470174 containerd[1930]: time="2025-09-12T17:53:04.470090124Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"3e7c0dac1d7b7f732c6207de89e4729bf54eca323652503f0a0fb853252d2281\" pid:6809 exited_at:{seconds:1757699584 nanos:469960566}"
Sep 12 17:53:05.388207 containerd[1930]: time="2025-09-12T17:53:05.388182810Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"9c3a6afeac6ea398b2bf2db48455f2d31184adc1a682ce485554cdc2b170be95\" pid:6832 exited_at:{seconds:1757699585 nanos:388047233}"
Sep 12 17:53:13.957735 containerd[1930]: time="2025-09-12T17:53:13.957703244Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"235230095087f02481eec6013e116187f273f585bc7d25d89760d6cb36938d49\" pid:6870 exited_at:{seconds:1757699593 nanos:957507016}"
Sep 12 17:53:25.613816 containerd[1930]: time="2025-09-12T17:53:25.613789667Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"f1311d8802df7e15db8702e91731f91e9954e83c03b274878398f2d4c8ea3046\" pid:6901 exited_at:{seconds:1757699605 nanos:613551377}"
Sep 12 17:53:34.471687 containerd[1930]: time="2025-09-12T17:53:34.471631370Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"d8fa632089160417d28c9c033ba63f56d0fc3ca0c2675b7b3d825b4fe26c05e9\" pid:6937 exited_at:{seconds:1757699614 nanos:471435701}"
Sep 12 17:53:43.902362 containerd[1930]: time="2025-09-12T17:53:43.902311821Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"18cef149ba9ed93d5516a18970ef19204bc196f646e5d39e5791e67ebc7dac28\" pid:6963 exited_at:{seconds:1757699623 nanos:902145919}"
Sep 12 17:53:55.614914 containerd[1930]: time="2025-09-12T17:53:55.614886508Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"1f84f8e151757d7e585ebbaef640799a96a7b65e4c7b3fe81cbc29f3ee53fdf2\" pid:7000 exited_at:{seconds:1757699635 nanos:614661092}"
Sep 12 17:53:59.035272 containerd[1930]: time="2025-09-12T17:53:59.035246928Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"2e9f16602a7952f47a81b388be4290a17e993b4a4484ac58f19e9ae20cbd3870\" pid:7036 exited_at:{seconds:1757699639 nanos:35134644}"
Sep 12 17:54:04.473375 containerd[1930]: time="2025-09-12T17:54:04.473345421Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"a6c9b1bab13e2dea9e25634b475cb022f7de2f454387b3b44bda365b5a922767\" pid:7058 exited_at:{seconds:1757699644 nanos:473201812}"
Sep 12 17:54:05.399018 containerd[1930]: time="2025-09-12T17:54:05.398988922Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"642d85434c66e502d6b95eac4d751e627759ee360fdf2883e38606f83d4d6d58\" pid:7079 exited_at:{seconds:1757699645 nanos:398751990}"
Sep 12 17:54:13.919853 containerd[1930]: time="2025-09-12T17:54:13.919825582Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"69ec99b487f8e0b05bf6109011a9da1b130e7832c54b4bfca49dedaf6a48c747\" pid:7127 exited_at:{seconds:1757699653 nanos:919561365}"
Sep 12 17:54:16.383074 systemd[1]: Started sshd@9-139.178.94.149:22-8.130.184.62:48928.service - OpenSSH per-connection server daemon (8.130.184.62:48928).
Sep 12 17:54:21.671531 sshd[7150]: banner exchange: Connection from 8.130.184.62 port 48928: invalid format
Sep 12 17:54:21.673205 systemd[1]: sshd@9-139.178.94.149:22-8.130.184.62:48928.service: Deactivated successfully.
Sep 12 17:54:25.570601 containerd[1930]: time="2025-09-12T17:54:25.570575806Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"1914a62817fb54b2c5554daa21f7c02c02a067203562790d2f81b2a026353771\" pid:7173 exited_at:{seconds:1757699665 nanos:570336608}"
Sep 12 17:54:34.430461 containerd[1930]: time="2025-09-12T17:54:34.430398230Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"88fca3581782fe39683c5f6731ac314dc84ff48809b959f0465fbd8274298206\" pid:7207 exited_at:{seconds:1757699674 nanos:430273306}"
Sep 12 17:54:43.953127 containerd[1930]: time="2025-09-12T17:54:43.953092884Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"d32e2a5e767093ffa95b57394373e00d898de682ac614aeb4a61b730ce3d2172\" pid:7233 exited_at:{seconds:1757699683 nanos:952914548}"
Sep 12 17:54:55.578366 containerd[1930]: time="2025-09-12T17:54:55.578338196Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"a092dec6e8e8b533ad4045908649bb7c415dbe441098303763c77d43b4579711\" pid:7266 exited_at:{seconds:1757699695 nanos:578105136}"
Sep 12 17:54:59.015354 containerd[1930]: time="2025-09-12T17:54:59.015297305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"121a9b66c8672c617cade4b74f25646ee9868d38ba8eb5e0c9a8b6d70969d4ab\" pid:7301 exited_at:{seconds:1757699699 nanos:15179406}"
Sep 12 17:55:04.429290 containerd[1930]: time="2025-09-12T17:55:04.429268285Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"43eac5d95a2cac181b1281a852a2dc4487324056b494f637fa888ad774165fcc\" pid:7323 exited_at:{seconds:1757699704 nanos:429152082}"
Sep 12 17:55:05.449036 containerd[1930]: time="2025-09-12T17:55:05.448985092Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"fdc34a3fec0ddc9574a0aab7cb5a5cdde8125d468ed4a781888d4a140a4f0651\" pid:7344 exited_at:{seconds:1757699705 nanos:448794268}"
Sep 12 17:55:13.909089 containerd[1930]: time="2025-09-12T17:55:13.909035754Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"ba14e3fcec26181766a11a840012e61e6cb345d1772820eda8c1f9f13e98ba77\" pid:7380 exited_at:{seconds:1757699713 nanos:908869543}"
Sep 12 17:55:25.612299 containerd[1930]: time="2025-09-12T17:55:25.612239057Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"7d446e4dfab5fc05577c2812bd8f14163bbf5735adfb23147d17104550ceb1d6\" pid:7416 exited_at:{seconds:1757699725 nanos:612022787}"
Sep 12 17:55:31.497074 containerd[1930]: time="2025-09-12T17:55:31.496919218Z" level=warning msg="container event discarded" container=0e4e2ccec013f657d1b7bc8aa23f2149b0a7c7e5e7ab7d6884d561423ec38f07 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:31.497074 containerd[1930]: time="2025-09-12T17:55:31.497046203Z" level=warning msg="container event discarded" container=0e4e2ccec013f657d1b7bc8aa23f2149b0a7c7e5e7ab7d6884d561423ec38f07 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:31.508398 containerd[1930]: time="2025-09-12T17:55:31.508318062Z" level=warning msg="container event discarded" container=107c807b9c7e3682213e53d6afb892b7e07f7d9413813864f2a41ba57757718b type=CONTAINER_CREATED_EVENT
Sep 12 17:55:31.508398 containerd[1930]: time="2025-09-12T17:55:31.508383898Z" level=warning msg="container event discarded" container=107c807b9c7e3682213e53d6afb892b7e07f7d9413813864f2a41ba57757718b type=CONTAINER_STARTED_EVENT
Sep 12 17:55:31.508698 containerd[1930]: time="2025-09-12T17:55:31.508410259Z" level=warning msg="container event discarded" container=1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f type=CONTAINER_CREATED_EVENT
Sep 12 17:55:31.508698 containerd[1930]: time="2025-09-12T17:55:31.508431290Z" level=warning msg="container event discarded" container=87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:31.519881 containerd[1930]: time="2025-09-12T17:55:31.519801947Z" level=warning msg="container event discarded" container=62609e6cf098b31c5ef0c68cb3e7df041b552464f0a891a41398c5289a1a5f58 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:31.519881 containerd[1930]: time="2025-09-12T17:55:31.519862386Z" level=warning msg="container event discarded" container=62609e6cf098b31c5ef0c68cb3e7df041b552464f0a891a41398c5289a1a5f58 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:31.520223 containerd[1930]: time="2025-09-12T17:55:31.519906170Z" level=warning msg="container event discarded" container=c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:31.568316 containerd[1930]: time="2025-09-12T17:55:31.568231923Z" level=warning msg="container event discarded" container=87f1e5005697c0d1c06c975df7843f2f881b5d164ce2c133a40e8b3a2a87f449 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:31.568316 containerd[1930]: time="2025-09-12T17:55:31.568299318Z" level=warning msg="container event discarded" container=1d4e7310980478ad208d1dfbc7007361bce00d39a1e333a01d938153d7f56f2f type=CONTAINER_STARTED_EVENT
Sep 12 17:55:31.568598 containerd[1930]: time="2025-09-12T17:55:31.568325717Z" level=warning msg="container event discarded" container=c819e41f80dd0a203868627784ba2ea85180a87bc6254ccf95297aabc4ac67d1 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:34.421695 containerd[1930]: time="2025-09-12T17:55:34.421671129Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"7d7a1cc936eb75aa48fc7b6b3f53a283e056607fe5ee59d588976343c5aeab27\" pid:7451 exited_at:{seconds:1757699734 nanos:421495494}"
Sep 12 17:55:42.415476 containerd[1930]: time="2025-09-12T17:55:42.415288077Z" level=warning msg="container event discarded" container=41ef0386d9aa1f1f6531ed4ba469108793d2709afd5a19709fd040b66d71a615 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:42.415476 containerd[1930]: time="2025-09-12T17:55:42.415418568Z" level=warning msg="container event discarded" container=41ef0386d9aa1f1f6531ed4ba469108793d2709afd5a19709fd040b66d71a615 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:42.555018 containerd[1930]: time="2025-09-12T17:55:42.554885819Z" level=warning msg="container event discarded" container=98fd4102631751c4195df424c7ca03f7e514bbd7922d4e3a33bedd2fd6388890 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:42.555018 containerd[1930]: time="2025-09-12T17:55:42.554969406Z" level=warning msg="container event discarded" container=98fd4102631751c4195df424c7ca03f7e514bbd7922d4e3a33bedd2fd6388890 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:42.849737 containerd[1930]: time="2025-09-12T17:55:42.849625216Z" level=warning msg="container event discarded" container=be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b type=CONTAINER_CREATED_EVENT
Sep 12 17:55:42.904348 containerd[1930]: time="2025-09-12T17:55:42.904236733Z" level=warning msg="container event discarded" container=be58c909089a5b36789cd02cf7fc69773ed8a81d60c5d8c5f0232efe7a40885b type=CONTAINER_STARTED_EVENT
Sep 12 17:55:43.909236 containerd[1930]: time="2025-09-12T17:55:43.909182215Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"e61d17b16aba66b61fe5839813ec5f626c3418b06c41c284cc50291889c614cf\" pid:7496 exited_at:{seconds:1757699743 nanos:908988025}"
Sep 12 17:55:44.244211 containerd[1930]: time="2025-09-12T17:55:44.243933347Z" level=warning msg="container event discarded" container=3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:44.279343 containerd[1930]: time="2025-09-12T17:55:44.279239537Z" level=warning msg="container event discarded" container=3b8821ce17e51ef35826b7561a511efc6847cbcbbc3c18092f265befd8d07df1 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:51.316184 containerd[1930]: time="2025-09-12T17:55:51.315998769Z" level=warning msg="container event discarded" container=74e3efc4fd17c59edbf02c4a18dcefc12e018e96c1ad1f8e781c600d931feda7 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:51.316184 containerd[1930]: time="2025-09-12T17:55:51.316125600Z" level=warning msg="container event discarded" container=74e3efc4fd17c59edbf02c4a18dcefc12e018e96c1ad1f8e781c600d931feda7 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:51.658136 containerd[1930]: time="2025-09-12T17:55:51.657887463Z" level=warning msg="container event discarded" container=509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d type=CONTAINER_CREATED_EVENT
Sep 12 17:55:51.658136 containerd[1930]: time="2025-09-12T17:55:51.657961487Z" level=warning msg="container event discarded" container=509591bd9f1ebc0ab26a795eb5fcf0e5740e5b3a353f215a900816593ee9af9d type=CONTAINER_STARTED_EVENT
Sep 12 17:55:53.407456 containerd[1930]: time="2025-09-12T17:55:53.407327324Z" level=warning msg="container event discarded" container=9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:53.450125 containerd[1930]: time="2025-09-12T17:55:53.449977058Z" level=warning msg="container event discarded" container=9281f8f9ecc81683425d464fbde2c7f2772fa103feeae2759d5a265f9f660a78 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:55.221512 containerd[1930]: time="2025-09-12T17:55:55.221376519Z" level=warning msg="container event discarded" container=ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a type=CONTAINER_CREATED_EVENT
Sep 12 17:55:55.266833 containerd[1930]: time="2025-09-12T17:55:55.266707949Z" level=warning msg="container event discarded" container=ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a type=CONTAINER_STARTED_EVENT
Sep 12 17:55:55.581746 containerd[1930]: time="2025-09-12T17:55:55.581679514Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"09b6b436380ddd7a12cb1a27c0b0ef31d6d522892fda6930c458c5ba7d0a5687\" pid:7527 exited_at:{seconds:1757699755 nanos:581469645}"
Sep 12 17:55:56.079495 containerd[1930]: time="2025-09-12T17:55:56.079440420Z" level=warning msg="container event discarded" container=ea22cb8df797753389d0ea719e886624bb0860d00e5dfcca65ce1b050673ec0a type=CONTAINER_STOPPED_EVENT
Sep 12 17:55:58.735782 containerd[1930]: time="2025-09-12T17:55:58.735631602Z" level=warning msg="container event discarded" container=a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6 type=CONTAINER_CREATED_EVENT
Sep 12 17:55:58.777153 containerd[1930]: time="2025-09-12T17:55:58.777035968Z" level=warning msg="container event discarded" container=a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6 type=CONTAINER_STARTED_EVENT
Sep 12 17:55:59.040052 containerd[1930]: time="2025-09-12T17:55:59.039997388Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"184fc04b1bfe71e3cc3f5b7aa4a55dd5ad6611b0fc88615dbfa11fb6aad42dfa\" pid:7562 exited_at:{seconds:1757699759 nanos:39897386}"
Sep 12 17:55:59.721391 containerd[1930]: time="2025-09-12T17:55:59.721252101Z" level=warning msg="container event discarded" container=a4fa32a190f5665188f7ea3d1b5ffe392f5739f226a8a382646c92be2d2108d6 type=CONTAINER_STOPPED_EVENT
Sep 12 17:56:04.436438 containerd[1930]: time="2025-09-12T17:56:04.436381054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"e7782dccc296b41ea0553b61a95209d80d7a2daea10f163ea463088ff189183e\" pid:7591 exited_at:{seconds:1757699764 nanos:436259319}"
Sep 12 17:56:05.398807 containerd[1930]: time="2025-09-12T17:56:05.398779473Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"4ed96cc79d75b1149cd3517d2765352433e0b57de0f9d67f790c5c0a488d3450\" pid:7614 exited_at:{seconds:1757699765 nanos:398571795}"
Sep 12 17:56:05.599470 containerd[1930]: time="2025-09-12T17:56:05.599305500Z" level=warning msg="container event discarded" container=6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e type=CONTAINER_CREATED_EVENT
Sep 12 17:56:05.650972 containerd[1930]: time="2025-09-12T17:56:05.650749267Z" level=warning msg="container event discarded" container=6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e type=CONTAINER_STARTED_EVENT
Sep 12 17:56:07.014057 containerd[1930]: time="2025-09-12T17:56:07.013951818Z" level=warning msg="container event discarded" container=de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:07.014057 containerd[1930]: time="2025-09-12T17:56:07.014023940Z" level=warning msg="container event discarded" container=de4abba0af60916e213ac90b661105a4460329dbd8e374b368314dbbc9bb13b1 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:08.805355 containerd[1930]: time="2025-09-12T17:56:08.805257174Z" level=warning msg="container event discarded" container=7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:08.850842 containerd[1930]: time="2025-09-12T17:56:08.850730873Z" level=warning msg="container event discarded" container=7fa5f079557dd63a479d4a3792c093a7bf5a9df07330db389374905a01fea9e8 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:10.538886 containerd[1930]: time="2025-09-12T17:56:10.538799234Z" level=warning msg="container event discarded" container=213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb type=CONTAINER_CREATED_EVENT
Sep 12 17:56:10.538886 containerd[1930]: time="2025-09-12T17:56:10.538869073Z" level=warning msg="container event discarded" container=213d9ea709ed38b263068c3dd9545209c1794fa4f5a0b0d4a58c3a0215edf9eb type=CONTAINER_STARTED_EVENT
Sep 12 17:56:10.712406 containerd[1930]: time="2025-09-12T17:56:10.712335329Z" level=warning msg="container event discarded" container=5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa type=CONTAINER_CREATED_EVENT
Sep 12 17:56:10.776667 containerd[1930]: time="2025-09-12T17:56:10.776580251Z" level=warning msg="container event discarded" container=5816eb84caf0e84734d1bbede9b6bc2def6c9b00156363e5705aeecb3a22d9aa type=CONTAINER_STARTED_EVENT
Sep 12 17:56:13.232921 containerd[1930]: time="2025-09-12T17:56:13.232783322Z" level=warning msg="container event discarded" container=f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:13.284232 containerd[1930]: time="2025-09-12T17:56:13.284086631Z" level=warning msg="container event discarded" container=f5be7a5ff70897594600c9e1ff4ae615f5dabad1c4e0059020059d9d47c55857 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:13.465890 containerd[1930]: time="2025-09-12T17:56:13.465794524Z" level=warning msg="container event discarded" container=d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:13.465890 containerd[1930]: time="2025-09-12T17:56:13.465876213Z" level=warning msg="container event discarded" container=d9bcbfd1ec96953c2f28ce80ebeaf7d5baa08ccf57602b2e8f168b802b3a9209 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:13.466204 containerd[1930]: time="2025-09-12T17:56:13.465903217Z" level=warning msg="container event discarded" container=b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:13.503435 containerd[1930]: time="2025-09-12T17:56:13.503282612Z" level=warning msg="container event discarded" container=b744f6bd613c0571efe4597aa223d78ee48ddc57cade342325ae3d2bb5c3c835 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:13.569748 containerd[1930]: time="2025-09-12T17:56:13.569641286Z" level=warning msg="container event discarded" container=62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:13.569748 containerd[1930]: time="2025-09-12T17:56:13.569702605Z" level=warning msg="container event discarded" container=62e1ded5b1f56bd49022c3e60558e90ac7fb03a4e78c4ba96bc66a95a3598df0 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:13.909471 containerd[1930]: time="2025-09-12T17:56:13.909430949Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"e3c669263674be539cad98d25332be6e9d038ee6dad2ea8fd92b5cfbb729166e\" pid:7648 exited_at:{seconds:1757699773 nanos:909197934}"
Sep 12 17:56:14.481463 containerd[1930]: time="2025-09-12T17:56:14.481345650Z" level=warning msg="container event discarded" container=282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:14.481463 containerd[1930]: time="2025-09-12T17:56:14.481438125Z" level=warning msg="container event discarded" container=282ea97f74551bf32b3f4f1515accbd52aed726e6e3f4bee1dd53c77d6c32e07 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:14.481463 containerd[1930]: time="2025-09-12T17:56:14.481465407Z" level=warning msg="container event discarded" container=526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:14.526977 containerd[1930]: time="2025-09-12T17:56:14.526886325Z" level=warning msg="container event discarded" container=526694dcc42d642118085dd9d93ce5c41f775768a1c4071b8a60ddfde57bf583 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:14.578586 containerd[1930]: time="2025-09-12T17:56:14.578469415Z" level=warning msg="container event discarded" container=788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f type=CONTAINER_CREATED_EVENT
Sep 12 17:56:14.578586 containerd[1930]: time="2025-09-12T17:56:14.578531677Z" level=warning msg="container event discarded" container=788d993391add27a260928d8ab93c3290ad878d9947d4dab18e812edfd35d81f type=CONTAINER_STARTED_EVENT
Sep 12 17:56:14.715019 containerd[1930]: time="2025-09-12T17:56:14.714901109Z" level=warning msg="container event discarded" container=5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:14.715019 containerd[1930]: time="2025-09-12T17:56:14.714962975Z" level=warning msg="container event discarded" container=5d9f7f8060edb2f6bf901caabb6ed9fdf94064de0eb00efa49cb7e37f056de19 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:14.715019 containerd[1930]: time="2025-09-12T17:56:14.714990149Z" level=warning msg="container event discarded" container=3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:14.768419 containerd[1930]: time="2025-09-12T17:56:14.768226156Z" level=warning msg="container event discarded" container=3a3baf1b125ad3008dabfc9a6b9026072e0e9792146b48f425a010ab7034e360 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:15.447600 containerd[1930]: time="2025-09-12T17:56:15.447471395Z" level=warning msg="container event discarded" container=cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b type=CONTAINER_CREATED_EVENT
Sep 12 17:56:15.447600 containerd[1930]: time="2025-09-12T17:56:15.447544250Z" level=warning msg="container event discarded" container=cd0141ff19eb0b5ebccd951749884674ddee1a41d8f1f3b1a019cf2eeec1a93b type=CONTAINER_STARTED_EVENT
Sep 12 17:56:15.810134 containerd[1930]: time="2025-09-12T17:56:15.810013266Z" level=warning msg="container event discarded" container=cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f type=CONTAINER_CREATED_EVENT
Sep 12 17:56:15.861564 containerd[1930]: time="2025-09-12T17:56:15.861479438Z" level=warning msg="container event discarded" container=cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f type=CONTAINER_STARTED_EVENT
Sep 12 17:56:17.805263 containerd[1930]: time="2025-09-12T17:56:17.805051839Z" level=warning msg="container event discarded" container=9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:17.848533 containerd[1930]: time="2025-09-12T17:56:17.848397508Z" level=warning msg="container event discarded" container=9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:19.115488 containerd[1930]: time="2025-09-12T17:56:19.115330556Z" level=warning msg="container event discarded" container=ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834 type=CONTAINER_CREATED_EVENT
Sep 12 17:56:19.174951 containerd[1930]: time="2025-09-12T17:56:19.174821476Z" level=warning msg="container event discarded" container=ec899315cc236ca6d178692ef256de50f93b50f735f4a96914a4e78dac739834 type=CONTAINER_STARTED_EVENT
Sep 12 17:56:20.727698 containerd[1930]: time="2025-09-12T17:56:20.727637801Z" level=warning msg="container event discarded" container=23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a type=CONTAINER_CREATED_EVENT
Sep 12 17:56:20.765959 containerd[1930]: time="2025-09-12T17:56:20.765895927Z" level=warning msg="container event discarded" container=23a00a2d747ad51573e218e99f274c5258dd3aa9fb83d00400f860e31de4458a type=CONTAINER_STARTED_EVENT
Sep 12 17:56:21.926340 update_engine[1916]: I20250912 17:56:21.926234 1916 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Sep 12 17:56:21.926340 update_engine[1916]: I20250912 17:56:21.926339 1916 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Sep 12 17:56:21.927664 update_engine[1916]: I20250912 17:56:21.926842 1916 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Sep 12 17:56:21.928272 update_engine[1916]: I20250912 17:56:21.928206 1916 omaha_request_params.cc:62] Current group set to beta
Sep 12 17:56:21.928581 update_engine[1916]: I20250912 17:56:21.928517 1916 update_attempter.cc:499] Already updated boot flags. Skipping.
Sep 12 17:56:21.928581 update_engine[1916]: I20250912 17:56:21.928557 1916 update_attempter.cc:643] Scheduling an action processor start.
Sep 12 17:56:21.928897 update_engine[1916]: I20250912 17:56:21.928614 1916 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 12 17:56:21.928897 update_engine[1916]: I20250912 17:56:21.928718 1916 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Sep 12 17:56:21.929244 update_engine[1916]: I20250912 17:56:21.928933 1916 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 12 17:56:21.929244 update_engine[1916]: I20250912 17:56:21.928975 1916 omaha_request_action.cc:272] Request:
Sep 12 17:56:21.929244 update_engine[1916]:
Sep 12 17:56:21.929244 update_engine[1916]:
Sep 12 17:56:21.929244 update_engine[1916]:
Sep 12 17:56:21.929244 update_engine[1916]:
Sep 12 17:56:21.929244 update_engine[1916]:
Sep 12 17:56:21.929244 update_engine[1916]:
Sep 12 17:56:21.929244 update_engine[1916]:
Sep 12 17:56:21.929244 update_engine[1916]:
Sep 12 17:56:21.929244 update_engine[1916]: I20250912 17:56:21.929005 1916 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:56:21.930682 locksmithd[1980]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Sep 12 17:56:21.931568 update_engine[1916]: I20250912 17:56:21.931555 1916 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:56:21.931887 update_engine[1916]: I20250912 17:56:21.931872 1916 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:56:21.932278 update_engine[1916]: E20250912 17:56:21.932261 1916 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:56:21.932324 update_engine[1916]: I20250912 17:56:21.932307 1916 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Sep 12 17:56:25.614927 containerd[1930]: time="2025-09-12T17:56:25.614876598Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"0fe7509f8c30a302d256a3a01e0a5c61650e4699d2a4f8289798046bff01cfad\" pid:7683 exited_at:{seconds:1757699785 nanos:614678883}"
Sep 12 17:56:31.849279 update_engine[1916]: I20250912 17:56:31.849169 1916 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:56:31.850657 update_engine[1916]: I20250912 17:56:31.849324 1916 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:56:31.850657 update_engine[1916]: I20250912 17:56:31.850530 1916 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:56:31.850977 update_engine[1916]: E20250912 17:56:31.850878 1916 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:56:31.851161 update_engine[1916]: I20250912 17:56:31.851074 1916 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Sep 12 17:56:34.430679 containerd[1930]: time="2025-09-12T17:56:34.430453582Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"817e7a10cba8b4d8e8040d2b741217ea7e8b758d81ae4b4628cb8925ec4b4aa3\" pid:7725 exited_at:{seconds:1757699794 nanos:430359488}"
Sep 12 17:56:41.849362 update_engine[1916]: I20250912 17:56:41.849220 1916 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:56:41.849362 update_engine[1916]: I20250912 17:56:41.849359 1916 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:56:41.850310 update_engine[1916]: I20250912 17:56:41.850183 1916 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:56:41.850516 update_engine[1916]: E20250912 17:56:41.850427 1916 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:56:41.850677 update_engine[1916]: I20250912 17:56:41.850574 1916 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Sep 12 17:56:43.910970 containerd[1930]: time="2025-09-12T17:56:43.910895701Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"00db70f6f4caccfa7e8d0f626eae09d99d3bdd0644049bcece8c9911d37ce17c\" pid:7753 exited_at:{seconds:1757699803 nanos:910670593}"
Sep 12 17:56:51.845735 update_engine[1916]: I20250912 17:56:51.845592 1916 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:56:51.845735 update_engine[1916]: I20250912 17:56:51.845737 1916 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:56:51.846642 update_engine[1916]: I20250912 17:56:51.846578 1916 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:56:51.847012 update_engine[1916]: E20250912 17:56:51.846946 1916 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:56:51.847205 update_engine[1916]: I20250912 17:56:51.847092 1916 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Sep 12 17:56:51.847205 update_engine[1916]: I20250912 17:56:51.847150 1916 omaha_request_action.cc:617] Omaha request response:
Sep 12 17:56:51.847429 update_engine[1916]: E20250912 17:56:51.847329 1916 omaha_request_action.cc:636] Omaha request network transfer failed.
Sep 12 17:56:51.847429 update_engine[1916]: I20250912 17:56:51.847380 1916 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Sep 12 17:56:51.847429 update_engine[1916]: I20250912 17:56:51.847397 1916 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 17:56:51.847429 update_engine[1916]: I20250912 17:56:51.847412 1916 update_attempter.cc:306] Processing Done. Sep 12 17:56:51.847740 update_engine[1916]: E20250912 17:56:51.847442 1916 update_attempter.cc:619] Update failed. Sep 12 17:56:51.847740 update_engine[1916]: I20250912 17:56:51.847460 1916 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 12 17:56:51.847740 update_engine[1916]: I20250912 17:56:51.847476 1916 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 12 17:56:51.847740 update_engine[1916]: I20250912 17:56:51.847491 1916 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Sep 12 17:56:51.847740 update_engine[1916]: I20250912 17:56:51.847636 1916 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 12 17:56:51.847740 update_engine[1916]: I20250912 17:56:51.847694 1916 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 12 17:56:51.847740 update_engine[1916]: I20250912 17:56:51.847713 1916 omaha_request_action.cc:272] Request: Sep 12 17:56:51.847740 update_engine[1916]: Sep 12 17:56:51.847740 update_engine[1916]: Sep 12 17:56:51.847740 update_engine[1916]: Sep 12 17:56:51.847740 update_engine[1916]: Sep 12 17:56:51.847740 update_engine[1916]: Sep 12 17:56:51.847740 update_engine[1916]: Sep 12 17:56:51.847740 update_engine[1916]: I20250912 17:56:51.847730 1916 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 17:56:51.848993 update_engine[1916]: I20250912 17:56:51.847776 1916 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 17:56:51.848993 update_engine[1916]: I20250912 17:56:51.848565 1916 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 12 17:56:51.848993 update_engine[1916]: E20250912 17:56:51.848922 1916 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 17:56:51.849289 locksmithd[1980]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 12 17:56:51.850054 update_engine[1916]: I20250912 17:56:51.849074 1916 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 12 17:56:51.850054 update_engine[1916]: I20250912 17:56:51.849123 1916 omaha_request_action.cc:617] Omaha request response: Sep 12 17:56:51.850054 update_engine[1916]: I20250912 17:56:51.849144 1916 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 17:56:51.850054 update_engine[1916]: I20250912 17:56:51.849159 1916 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 17:56:51.850054 update_engine[1916]: I20250912 17:56:51.849174 1916 update_attempter.cc:306] Processing Done. Sep 12 17:56:51.850054 update_engine[1916]: I20250912 17:56:51.849188 1916 update_attempter.cc:310] Error event sent. 
Sep 12 17:56:51.850054 update_engine[1916]: I20250912 17:56:51.849210 1916 update_check_scheduler.cc:74] Next update check in 42m19s Sep 12 17:56:51.850624 locksmithd[1980]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 12 17:56:55.638033 containerd[1930]: time="2025-09-12T17:56:55.638003009Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"ffc11de679fa0d6c4ce518e35b0613b8d9910de24508a3b7bfbffbea9bba5f91\" pid:7787 exited_at:{seconds:1757699815 nanos:637796813}" Sep 12 17:56:59.020378 containerd[1930]: time="2025-09-12T17:56:59.020356276Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"5236b9b890b6e7b44073c1a13c3aa92a5221fa0832625c34c563f3d7dd34addc\" pid:7823 exited_at:{seconds:1757699819 nanos:20236770}" Sep 12 17:57:04.427043 containerd[1930]: time="2025-09-12T17:57:04.427022147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"ec7323d89ed963d143910c49be46174f41816177f8f8078dad0f5c917950a8ee\" pid:7844 exited_at:{seconds:1757699824 nanos:426917090}" Sep 12 17:57:05.406559 containerd[1930]: time="2025-09-12T17:57:05.406530732Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"d485fba6179f59ed8c8364486393053494c7518aebe3dfc255eb9c5d41f4177d\" pid:7865 exited_at:{seconds:1757699825 nanos:406275103}" Sep 12 17:57:13.905408 containerd[1930]: time="2025-09-12T17:57:13.905381480Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"87365adffcd93da7e94256fb54d4453858e4c341e1a593b036884d3ea8b78433\" pid:7901 exited_at:{seconds:1757699833 nanos:905215892}"
Sep 12 17:57:25.575473 containerd[1930]: time="2025-09-12T17:57:25.575441265Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"a61189df81d3db96bb357080c5f0c8d680aa1965e7eff910af4c0eabb6f56a14\" pid:7950 exited_at:{seconds:1757699845 nanos:575192945}" Sep 12 17:57:34.434360 containerd[1930]: time="2025-09-12T17:57:34.434302721Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"c1586ef1dce05eaa94e15e66f11b966197779e91ce77e15ad1ff54022742c27e\" pid:7995 exited_at:{seconds:1757699854 nanos:434172910}" Sep 12 17:57:43.906567 containerd[1930]: time="2025-09-12T17:57:43.906541917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"c731626f67d281a351b558984b7cababafd421e3e69133359db1c70bf5f8d678\" pid:8020 exited_at:{seconds:1757699863 nanos:906325668}" Sep 12 17:57:55.576990 containerd[1930]: time="2025-09-12T17:57:55.576956805Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"af8386612145743b8cc4b88c6c5e63db4b161929d6fcff9364b5ffacb074c140\" pid:8052 exited_at:{seconds:1757699875 nanos:576751358}" Sep 12 17:57:59.062981 containerd[1930]: time="2025-09-12T17:57:59.062948222Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"0a32216549d16481836efaa8632d03ba3ec5879886a8967971ea81e1b8584b5f\" pid:8087 exited_at:{seconds:1757699879 nanos:62797515}" Sep 12 17:58:04.476083 containerd[1930]: time="2025-09-12T17:58:04.476057945Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"d0f663ef11916f2659d58d1dd805c4d1d14bf1a1a1bfe80146713aa4c6543f74\" pid:8109 exited_at:{seconds:1757699884 nanos:475945181}"
Sep 12 17:58:05.434132 containerd[1930]: time="2025-09-12T17:58:05.434107322Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"99e8038bd85271c48f844de45d69e9be9d6a15a6ebd8c3c791fa05796f0ee603\" pid:8132 exited_at:{seconds:1757699885 nanos:433907830}" Sep 12 17:58:13.911756 containerd[1930]: time="2025-09-12T17:58:13.911726536Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"1d59adda48dbe2771bce2e56e529886c2daa285e3ee4d1d490029cf16779e16e\" pid:8167 exited_at:{seconds:1757699893 nanos:911515194}" Sep 12 17:58:24.780231 systemd[1]: Started sshd@10-139.178.94.149:22-139.178.89.65:57960.service - OpenSSH per-connection server daemon (139.178.89.65:57960). Sep 12 17:58:24.893527 sshd[8195]: Accepted publickey for core from 139.178.89.65 port 57960 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:24.894708 sshd-session[8195]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:24.899658 systemd-logind[1911]: New session 12 of user core. Sep 12 17:58:24.922350 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 17:58:25.027037 sshd[8198]: Connection closed by 139.178.89.65 port 57960 Sep 12 17:58:25.027284 sshd-session[8195]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:25.029731 systemd[1]: sshd@10-139.178.94.149:22-139.178.89.65:57960.service: Deactivated successfully. Sep 12 17:58:25.030817 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 17:58:25.031386 systemd-logind[1911]: Session 12 logged out. Waiting for processes to exit. Sep 12 17:58:25.032258 systemd-logind[1911]: Removed session 12.
Sep 12 17:58:25.626443 containerd[1930]: time="2025-09-12T17:58:25.626415087Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"97dd0ce18d85a350d5135ca89ad2d3a49b491c4f560ff073ac363801c66b0f1c\" pid:8237 exited_at:{seconds:1757699905 nanos:626169482}" Sep 12 17:58:30.058277 systemd[1]: Started sshd@11-139.178.94.149:22-139.178.89.65:43070.service - OpenSSH per-connection server daemon (139.178.89.65:43070). Sep 12 17:58:30.108467 sshd[8261]: Accepted publickey for core from 139.178.89.65 port 43070 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:30.109333 sshd-session[8261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:30.112402 systemd-logind[1911]: New session 13 of user core. Sep 12 17:58:30.123286 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 17:58:30.211040 sshd[8264]: Connection closed by 139.178.89.65 port 43070 Sep 12 17:58:30.211263 sshd-session[8261]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:30.213166 systemd[1]: sshd@11-139.178.94.149:22-139.178.89.65:43070.service: Deactivated successfully. Sep 12 17:58:30.214167 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 17:58:30.214926 systemd-logind[1911]: Session 13 logged out. Waiting for processes to exit. Sep 12 17:58:30.215716 systemd-logind[1911]: Removed session 13. Sep 12 17:58:34.473613 containerd[1930]: time="2025-09-12T17:58:34.473582074Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"27feef2c30444b7c7219218b71b9314cc4408293f34fd5da0b88fb4936a245a8\" pid:8302 exited_at:{seconds:1757699914 nanos:473442414}" Sep 12 17:58:35.233223 systemd[1]: Started sshd@12-139.178.94.149:22-139.178.89.65:43086.service - OpenSSH per-connection server daemon (139.178.89.65:43086). 
Sep 12 17:58:35.276322 sshd[8313]: Accepted publickey for core from 139.178.89.65 port 43086 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:35.276980 sshd-session[8313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:35.279928 systemd-logind[1911]: New session 14 of user core. Sep 12 17:58:35.295561 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 17:58:35.385056 sshd[8316]: Connection closed by 139.178.89.65 port 43086 Sep 12 17:58:35.385266 sshd-session[8313]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:35.400214 systemd[1]: sshd@12-139.178.94.149:22-139.178.89.65:43086.service: Deactivated successfully. Sep 12 17:58:35.401118 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 17:58:35.401580 systemd-logind[1911]: Session 14 logged out. Waiting for processes to exit. Sep 12 17:58:35.402842 systemd[1]: Started sshd@13-139.178.94.149:22-139.178.89.65:43098.service - OpenSSH per-connection server daemon (139.178.89.65:43098). Sep 12 17:58:35.403229 systemd-logind[1911]: Removed session 14. Sep 12 17:58:35.434368 sshd[8344]: Accepted publickey for core from 139.178.89.65 port 43098 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:35.435074 sshd-session[8344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:35.438054 systemd-logind[1911]: New session 15 of user core. Sep 12 17:58:35.447216 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 17:58:35.546686 sshd[8349]: Connection closed by 139.178.89.65 port 43098 Sep 12 17:58:35.546840 sshd-session[8344]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:35.560224 systemd[1]: sshd@13-139.178.94.149:22-139.178.89.65:43098.service: Deactivated successfully. Sep 12 17:58:35.561145 systemd[1]: session-15.scope: Deactivated successfully. 
Sep 12 17:58:35.561676 systemd-logind[1911]: Session 15 logged out. Waiting for processes to exit. Sep 12 17:58:35.562905 systemd[1]: Started sshd@14-139.178.94.149:22-139.178.89.65:43106.service - OpenSSH per-connection server daemon (139.178.89.65:43106). Sep 12 17:58:35.563313 systemd-logind[1911]: Removed session 15. Sep 12 17:58:35.594443 sshd[8372]: Accepted publickey for core from 139.178.89.65 port 43106 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:35.595138 sshd-session[8372]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:35.598069 systemd-logind[1911]: New session 16 of user core. Sep 12 17:58:35.615416 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 17:58:35.748968 sshd[8375]: Connection closed by 139.178.89.65 port 43106 Sep 12 17:58:35.749192 sshd-session[8372]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:35.751461 systemd[1]: sshd@14-139.178.94.149:22-139.178.89.65:43106.service: Deactivated successfully. Sep 12 17:58:35.752395 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 17:58:35.752854 systemd-logind[1911]: Session 16 logged out. Waiting for processes to exit. Sep 12 17:58:35.753437 systemd-logind[1911]: Removed session 16. Sep 12 17:58:40.775421 systemd[1]: Started sshd@15-139.178.94.149:22-139.178.89.65:47630.service - OpenSSH per-connection server daemon (139.178.89.65:47630). Sep 12 17:58:40.818313 sshd[8405]: Accepted publickey for core from 139.178.89.65 port 47630 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:40.818924 sshd-session[8405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:40.821929 systemd-logind[1911]: New session 17 of user core. Sep 12 17:58:40.843514 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 12 17:58:40.995654 sshd[8408]: Connection closed by 139.178.89.65 port 47630 Sep 12 17:58:40.995877 sshd-session[8405]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:40.998457 systemd[1]: sshd@15-139.178.94.149:22-139.178.89.65:47630.service: Deactivated successfully. Sep 12 17:58:40.999569 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 17:58:41.000022 systemd-logind[1911]: Session 17 logged out. Waiting for processes to exit. Sep 12 17:58:41.000826 systemd-logind[1911]: Removed session 17. Sep 12 17:58:43.906370 containerd[1930]: time="2025-09-12T17:58:43.906300407Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"f1041c1813a149aeb9372e9cdff3ac4670a88093b2432385ee9b7ecfbe0d7f10\" pid:8445 exited_at:{seconds:1757699923 nanos:906030418}" Sep 12 17:58:46.021950 systemd[1]: Started sshd@16-139.178.94.149:22-139.178.89.65:47634.service - OpenSSH per-connection server daemon (139.178.89.65:47634). Sep 12 17:58:46.130420 sshd[8468]: Accepted publickey for core from 139.178.89.65 port 47634 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:46.131568 sshd-session[8468]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:46.136041 systemd-logind[1911]: New session 18 of user core. Sep 12 17:58:46.145332 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 17:58:46.236082 sshd[8471]: Connection closed by 139.178.89.65 port 47634 Sep 12 17:58:46.236331 sshd-session[8468]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:46.238189 systemd[1]: sshd@16-139.178.94.149:22-139.178.89.65:47634.service: Deactivated successfully. Sep 12 17:58:46.239212 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 17:58:46.239931 systemd-logind[1911]: Session 18 logged out. Waiting for processes to exit. 
Sep 12 17:58:46.240725 systemd-logind[1911]: Removed session 18. Sep 12 17:58:51.252867 systemd[1]: Started sshd@17-139.178.94.149:22-139.178.89.65:43068.service - OpenSSH per-connection server daemon (139.178.89.65:43068). Sep 12 17:58:51.299684 sshd[8497]: Accepted publickey for core from 139.178.89.65 port 43068 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:51.300366 sshd-session[8497]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:51.303471 systemd-logind[1911]: New session 19 of user core. Sep 12 17:58:51.317207 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 17:58:51.407062 sshd[8500]: Connection closed by 139.178.89.65 port 43068 Sep 12 17:58:51.407312 sshd-session[8497]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:51.409310 systemd[1]: sshd@17-139.178.94.149:22-139.178.89.65:43068.service: Deactivated successfully. Sep 12 17:58:51.410351 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 17:58:51.411376 systemd-logind[1911]: Session 19 logged out. Waiting for processes to exit. Sep 12 17:58:51.412051 systemd-logind[1911]: Removed session 19. Sep 12 17:58:55.596540 containerd[1930]: time="2025-09-12T17:58:55.596511920Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6324c18940b18dffbeda3dd69e88a7e82117da331846457930d33a3dab6fab5e\" id:\"59bde12104cb1cdb5bea6c4b4f883a1b3eb2a2b332da2ec0ef8c440cba67736b\" pid:8553 exited_at:{seconds:1757699935 nanos:596227305}" Sep 12 17:58:56.425878 systemd[1]: Started sshd@18-139.178.94.149:22-139.178.89.65:43084.service - OpenSSH per-connection server daemon (139.178.89.65:43084). 
Sep 12 17:58:56.465656 sshd[8578]: Accepted publickey for core from 139.178.89.65 port 43084 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:56.466321 sshd-session[8578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:56.469046 systemd-logind[1911]: New session 20 of user core. Sep 12 17:58:56.485314 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 17:58:56.578717 sshd[8581]: Connection closed by 139.178.89.65 port 43084 Sep 12 17:58:56.578931 sshd-session[8578]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:56.599597 systemd[1]: sshd@18-139.178.94.149:22-139.178.89.65:43084.service: Deactivated successfully. Sep 12 17:58:56.600651 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 17:58:56.601152 systemd-logind[1911]: Session 20 logged out. Waiting for processes to exit. Sep 12 17:58:56.602708 systemd[1]: Started sshd@19-139.178.94.149:22-139.178.89.65:43090.service - OpenSSH per-connection server daemon (139.178.89.65:43090). Sep 12 17:58:56.603221 systemd-logind[1911]: Removed session 20. Sep 12 17:58:56.652668 sshd[8606]: Accepted publickey for core from 139.178.89.65 port 43090 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:56.653711 sshd-session[8606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:56.657807 systemd-logind[1911]: New session 21 of user core. Sep 12 17:58:56.678528 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 12 17:58:56.842219 sshd[8609]: Connection closed by 139.178.89.65 port 43090 Sep 12 17:58:56.842453 sshd-session[8606]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:56.857996 systemd[1]: sshd@19-139.178.94.149:22-139.178.89.65:43090.service: Deactivated successfully. Sep 12 17:58:56.859226 systemd[1]: session-21.scope: Deactivated successfully. 
Sep 12 17:58:56.859890 systemd-logind[1911]: Session 21 logged out. Waiting for processes to exit. Sep 12 17:58:56.861725 systemd[1]: Started sshd@20-139.178.94.149:22-139.178.89.65:43100.service - OpenSSH per-connection server daemon (139.178.89.65:43100). Sep 12 17:58:56.862335 systemd-logind[1911]: Removed session 21. Sep 12 17:58:56.915459 sshd[8631]: Accepted publickey for core from 139.178.89.65 port 43100 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:56.916469 sshd-session[8631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:56.920592 systemd-logind[1911]: New session 22 of user core. Sep 12 17:58:56.941316 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 12 17:58:58.043898 sshd[8636]: Connection closed by 139.178.89.65 port 43100 Sep 12 17:58:58.044119 sshd-session[8631]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:58.058255 systemd[1]: sshd@20-139.178.94.149:22-139.178.89.65:43100.service: Deactivated successfully. Sep 12 17:58:58.059555 systemd[1]: session-22.scope: Deactivated successfully. Sep 12 17:58:58.059711 systemd[1]: session-22.scope: Consumed 479ms CPU time, 80.4M memory peak. Sep 12 17:58:58.060112 systemd-logind[1911]: Session 22 logged out. Waiting for processes to exit. Sep 12 17:58:58.061860 systemd[1]: Started sshd@21-139.178.94.149:22-139.178.89.65:43102.service - OpenSSH per-connection server daemon (139.178.89.65:43102). Sep 12 17:58:58.062256 systemd-logind[1911]: Removed session 22. Sep 12 17:58:58.092993 sshd[8664]: Accepted publickey for core from 139.178.89.65 port 43102 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:58.093765 sshd-session[8664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:58.096795 systemd-logind[1911]: New session 23 of user core. Sep 12 17:58:58.106275 systemd[1]: Started session-23.scope - Session 23 of User core. 
Sep 12 17:58:58.296188 sshd[8669]: Connection closed by 139.178.89.65 port 43102 Sep 12 17:58:58.296382 sshd-session[8664]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:58.310304 systemd[1]: sshd@21-139.178.94.149:22-139.178.89.65:43102.service: Deactivated successfully. Sep 12 17:58:58.311268 systemd[1]: session-23.scope: Deactivated successfully. Sep 12 17:58:58.311783 systemd-logind[1911]: Session 23 logged out. Waiting for processes to exit. Sep 12 17:58:58.313116 systemd[1]: Started sshd@22-139.178.94.149:22-139.178.89.65:43104.service - OpenSSH per-connection server daemon (139.178.89.65:43104). Sep 12 17:58:58.313498 systemd-logind[1911]: Removed session 23. Sep 12 17:58:58.351500 sshd[8693]: Accepted publickey for core from 139.178.89.65 port 43104 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:58:58.352335 sshd-session[8693]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:58:58.355657 systemd-logind[1911]: New session 24 of user core. Sep 12 17:58:58.376206 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 12 17:58:58.504109 sshd[8696]: Connection closed by 139.178.89.65 port 43104 Sep 12 17:58:58.504331 sshd-session[8693]: pam_unix(sshd:session): session closed for user core Sep 12 17:58:58.506225 systemd[1]: sshd@22-139.178.94.149:22-139.178.89.65:43104.service: Deactivated successfully. Sep 12 17:58:58.507200 systemd[1]: session-24.scope: Deactivated successfully. Sep 12 17:58:58.507890 systemd-logind[1911]: Session 24 logged out. Waiting for processes to exit. Sep 12 17:58:58.508576 systemd-logind[1911]: Removed session 24. 
Sep 12 17:58:59.032062 containerd[1930]: time="2025-09-12T17:58:59.032037247Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"c96f38f0a35aac966dad85e4a7b56eafe15f2527aefcf5aae2af1bd87e4bad7e\" pid:8731 exited_at:{seconds:1757699939 nanos:31691973}" Sep 12 17:59:03.530075 systemd[1]: Started sshd@23-139.178.94.149:22-139.178.89.65:34310.service - OpenSSH per-connection server daemon (139.178.89.65:34310). Sep 12 17:59:03.604298 sshd[8745]: Accepted publickey for core from 139.178.89.65 port 34310 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:59:03.605157 sshd-session[8745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:59:03.608020 systemd-logind[1911]: New session 25 of user core. Sep 12 17:59:03.617384 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 12 17:59:03.724279 sshd[8748]: Connection closed by 139.178.89.65 port 34310 Sep 12 17:59:03.724482 sshd-session[8745]: pam_unix(sshd:session): session closed for user core Sep 12 17:59:03.726736 systemd[1]: sshd@23-139.178.94.149:22-139.178.89.65:34310.service: Deactivated successfully. Sep 12 17:59:03.727663 systemd[1]: session-25.scope: Deactivated successfully. Sep 12 17:59:03.728061 systemd-logind[1911]: Session 25 logged out. Waiting for processes to exit. Sep 12 17:59:03.728794 systemd-logind[1911]: Removed session 25. 
Sep 12 17:59:04.458264 containerd[1930]: time="2025-09-12T17:59:04.458235563Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c82b589d9c8b998266158675bf9ce6cb1b4e080848c61c18a77a69d1a642d82\" id:\"91026cbec153570ffb116a7b3d0092d995788a8fe976ac43e11e14f5c4544db9\" pid:8781 exited_at:{seconds:1757699944 nanos:458123852}" Sep 12 17:59:05.402955 containerd[1930]: time="2025-09-12T17:59:05.402931338Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"96b265d2f13d9863ad6194b3e53c2aa223bd348e3800a9e82637139b4a509f9b\" pid:8803 exited_at:{seconds:1757699945 nanos:402461733}" Sep 12 17:59:08.751296 systemd[1]: Started sshd@24-139.178.94.149:22-139.178.89.65:34314.service - OpenSSH per-connection server daemon (139.178.89.65:34314). Sep 12 17:59:08.825846 sshd[8828]: Accepted publickey for core from 139.178.89.65 port 34314 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:59:08.826431 sshd-session[8828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:59:08.829340 systemd-logind[1911]: New session 26 of user core. Sep 12 17:59:08.844514 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 12 17:59:08.937724 sshd[8831]: Connection closed by 139.178.89.65 port 34314 Sep 12 17:59:08.937936 sshd-session[8828]: pam_unix(sshd:session): session closed for user core Sep 12 17:59:08.940029 systemd[1]: sshd@24-139.178.94.149:22-139.178.89.65:34314.service: Deactivated successfully. Sep 12 17:59:08.941119 systemd[1]: session-26.scope: Deactivated successfully. Sep 12 17:59:08.941905 systemd-logind[1911]: Session 26 logged out. Waiting for processes to exit. Sep 12 17:59:08.942633 systemd-logind[1911]: Removed session 26. 
Sep 12 17:59:13.908212 containerd[1930]: time="2025-09-12T17:59:13.908174486Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cad99cc4163cd86a0da5fa8598f1fac5a489efe6701fa48c3191120c4d88eb4f\" id:\"e072bf54b9392f36e3a41c5ab5970c875b605e134f49417f6f4b290a5c590a0f\" pid:8868 exited_at:{seconds:1757699953 nanos:907933495}" Sep 12 17:59:13.951216 systemd[1]: Started sshd@25-139.178.94.149:22-139.178.89.65:42696.service - OpenSSH per-connection server daemon (139.178.89.65:42696). Sep 12 17:59:13.984648 sshd[8889]: Accepted publickey for core from 139.178.89.65 port 42696 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 17:59:13.985277 sshd-session[8889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:59:13.988034 systemd-logind[1911]: New session 27 of user core. Sep 12 17:59:14.007229 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 12 17:59:14.096310 sshd[8892]: Connection closed by 139.178.89.65 port 42696 Sep 12 17:59:14.096520 sshd-session[8889]: pam_unix(sshd:session): session closed for user core Sep 12 17:59:14.098714 systemd[1]: sshd@25-139.178.94.149:22-139.178.89.65:42696.service: Deactivated successfully. Sep 12 17:59:14.099681 systemd[1]: session-27.scope: Deactivated successfully. Sep 12 17:59:14.100064 systemd-logind[1911]: Session 27 logged out. Waiting for processes to exit. Sep 12 17:59:14.100757 systemd-logind[1911]: Removed session 27.