Sep 16 04:59:36.931227 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 16 03:05:42 -00 2025 Sep 16 04:59:36.931242 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06 Sep 16 04:59:36.931249 kernel: BIOS-provided physical RAM map: Sep 16 04:59:36.931254 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000008f7ff] usable Sep 16 04:59:36.931258 kernel: BIOS-e820: [mem 0x000000000008f800-0x000000000009ffff] reserved Sep 16 04:59:36.931262 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved Sep 16 04:59:36.931267 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable Sep 16 04:59:36.931272 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved Sep 16 04:59:36.931276 kernel: BIOS-e820: [mem 0x0000000040400000-0x000000005ff2efff] usable Sep 16 04:59:36.931282 kernel: BIOS-e820: [mem 0x000000005ff2f000-0x000000005ff2ffff] ACPI NVS Sep 16 04:59:36.931286 kernel: BIOS-e820: [mem 0x000000005ff30000-0x000000005ff30fff] reserved Sep 16 04:59:36.931291 kernel: BIOS-e820: [mem 0x000000005ff31000-0x000000005fffffff] usable Sep 16 04:59:36.931295 kernel: BIOS-e820: [mem 0x0000000060000000-0x0000000067ffffff] reserved Sep 16 04:59:36.931300 kernel: BIOS-e820: [mem 0x0000000068000000-0x0000000077fc4fff] usable Sep 16 04:59:36.931305 kernel: BIOS-e820: [mem 0x0000000077fc5000-0x00000000790a7fff] reserved Sep 16 04:59:36.931311 kernel: BIOS-e820: [mem 0x00000000790a8000-0x0000000079230fff] usable Sep 16 04:59:36.931316 kernel: BIOS-e820: [mem 0x0000000079231000-0x0000000079662fff] ACPI NVS Sep 16 04:59:36.931321 kernel: BIOS-e820: [mem 0x0000000079663000-0x000000007befefff] reserved Sep 16 04:59:36.931326 kernel: BIOS-e820: [mem 0x000000007beff000-0x000000007befffff] usable Sep 16 04:59:36.931331 kernel: BIOS-e820: [mem 0x000000007bf00000-0x000000007f7fffff] reserved Sep 16 04:59:36.931336 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 16 04:59:36.931341 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved Sep 16 04:59:36.931346 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved Sep 16 04:59:36.931351 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Sep 16 04:59:36.931356 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved Sep 16 04:59:36.931362 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000087f7fffff] usable Sep 16 04:59:36.931367 kernel: NX (Execute Disable) protection: active Sep 16 04:59:36.931372 kernel: APIC: Static calls initialized Sep 16 04:59:36.931378 kernel: SMBIOS 3.2.1 present. 
Sep 16 04:59:36.931383 kernel: DMI: Supermicro PIO-519C-MR-PH004/X11SCH-F, BIOS 1.5.V1 04/14/2021 Sep 16 04:59:36.931388 kernel: DMI: Memory slots populated: 2/4 Sep 16 04:59:36.931393 kernel: tsc: Detected 3400.000 MHz processor Sep 16 04:59:36.931398 kernel: tsc: Detected 3399.906 MHz TSC Sep 16 04:59:36.931403 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 16 04:59:36.931408 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 16 04:59:36.931414 kernel: last_pfn = 0x87f800 max_arch_pfn = 0x400000000 Sep 16 04:59:36.931420 kernel: MTRR map: 5 entries (3 fixed + 2 variable; max 23), built from 10 variable MTRRs Sep 16 04:59:36.931425 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 16 04:59:36.931430 kernel: last_pfn = 0x7bf00 max_arch_pfn = 0x400000000 Sep 16 04:59:36.931435 kernel: Using GB pages for direct mapping Sep 16 04:59:36.931442 kernel: ACPI: Early table checksum verification disabled Sep 16 04:59:36.931448 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM) Sep 16 04:59:36.931453 kernel: ACPI: XSDT 0x00000000795440C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013) Sep 16 04:59:36.931460 kernel: ACPI: FACP 0x0000000079580620 000114 (v06 01072009 AMI 00010013) Sep 16 04:59:36.931465 kernel: ACPI: DSDT 0x0000000079544268 03C3B7 (v02 SUPERM SMCI--MB 01072009 INTL 20160527) Sep 16 04:59:36.931470 kernel: ACPI: FACS 0x0000000079662F80 000040 Sep 16 04:59:36.931476 kernel: ACPI: APIC 0x0000000079580738 00012C (v04 01072009 AMI 00010013) Sep 16 04:59:36.931481 kernel: ACPI: FPDT 0x0000000079580868 000044 (v01 01072009 AMI 00010013) Sep 16 04:59:36.931486 kernel: ACPI: FIDT 0x00000000795808B0 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013) Sep 16 04:59:36.931492 kernel: ACPI: MCFG 0x0000000079580950 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097) Sep 16 04:59:36.931498 kernel: ACPI: SPMI 0x0000000079580990 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 
00000000) Sep 16 04:59:36.931503 kernel: ACPI: SSDT 0x00000000795809D8 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527) Sep 16 04:59:36.931508 kernel: ACPI: SSDT 0x00000000795824F8 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527) Sep 16 04:59:36.931514 kernel: ACPI: SSDT 0x00000000795856C0 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527) Sep 16 04:59:36.931519 kernel: ACPI: HPET 0x00000000795879F0 000038 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 16 04:59:36.931524 kernel: ACPI: SSDT 0x0000000079587A28 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527) Sep 16 04:59:36.931530 kernel: ACPI: SSDT 0x00000000795889D8 0008F7 (v02 INTEL xh_mossb 00000000 INTL 20160527) Sep 16 04:59:36.931535 kernel: ACPI: UEFI 0x00000000795892D0 000042 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 16 04:59:36.931540 kernel: ACPI: LPIT 0x0000000079589318 000094 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 16 04:59:36.931547 kernel: ACPI: SSDT 0x00000000795893B0 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527) Sep 16 04:59:36.931552 kernel: ACPI: SSDT 0x000000007958BB90 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527) Sep 16 04:59:36.931558 kernel: ACPI: DBGP 0x000000007958D078 000034 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 16 04:59:36.931563 kernel: ACPI: DBG2 0x000000007958D0B0 000054 (v00 SUPERM SMCI--MB 00000002 01000013) Sep 16 04:59:36.931568 kernel: ACPI: SSDT 0x000000007958D108 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527) Sep 16 04:59:36.931574 kernel: ACPI: DMAR 0x000000007958EC70 0000A8 (v01 INTEL EDK2 00000002 01000013) Sep 16 04:59:36.931579 kernel: ACPI: SSDT 0x000000007958ED18 000144 (v02 Intel ADebTabl 00001000 INTL 20160527) Sep 16 04:59:36.931584 kernel: ACPI: TPM2 0x000000007958EE60 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000) Sep 16 04:59:36.931590 kernel: ACPI: SSDT 0x000000007958EE98 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527) Sep 16 04:59:36.931596 kernel: ACPI: WSMT 0x000000007958FC28 000028 (v01 \xec_ 01072009 AMI 00010013) Sep 16 04:59:36.931602 kernel: ACPI: EINJ 0x000000007958FC50 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000) Sep 16 04:59:36.931607 kernel: ACPI: ERST 0x000000007958FD80 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000) Sep 16 04:59:36.931612 kernel: ACPI: BERT 0x000000007958FFB0 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000) Sep 16 04:59:36.931617 kernel: ACPI: HEST 0x000000007958FFE0 00027C (v01 AMI AMI.HEST 00000000 AMI. 
00000000) Sep 16 04:59:36.931623 kernel: ACPI: SSDT 0x0000000079590260 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221) Sep 16 04:59:36.931628 kernel: ACPI: Reserving FACP table memory at [mem 0x79580620-0x79580733] Sep 16 04:59:36.931633 kernel: ACPI: Reserving DSDT table memory at [mem 0x79544268-0x7958061e] Sep 16 04:59:36.931640 kernel: ACPI: Reserving FACS table memory at [mem 0x79662f80-0x79662fbf] Sep 16 04:59:36.931645 kernel: ACPI: Reserving APIC table memory at [mem 0x79580738-0x79580863] Sep 16 04:59:36.931650 kernel: ACPI: Reserving FPDT table memory at [mem 0x79580868-0x795808ab] Sep 16 04:59:36.931655 kernel: ACPI: Reserving FIDT table memory at [mem 0x795808b0-0x7958094b] Sep 16 04:59:36.931661 kernel: ACPI: Reserving MCFG table memory at [mem 0x79580950-0x7958098b] Sep 16 04:59:36.931666 kernel: ACPI: Reserving SPMI table memory at [mem 0x79580990-0x795809d0] Sep 16 04:59:36.931671 kernel: ACPI: Reserving SSDT table memory at [mem 0x795809d8-0x795824f3] Sep 16 04:59:36.931676 kernel: ACPI: Reserving SSDT table memory at [mem 0x795824f8-0x795856bd] Sep 16 04:59:36.931682 kernel: ACPI: Reserving SSDT table memory at [mem 0x795856c0-0x795879ea] Sep 16 04:59:36.931688 kernel: ACPI: Reserving HPET table memory at [mem 0x795879f0-0x79587a27] Sep 16 04:59:36.931693 kernel: ACPI: Reserving SSDT table memory at [mem 0x79587a28-0x795889d5] Sep 16 04:59:36.931698 kernel: ACPI: Reserving SSDT table memory at [mem 0x795889d8-0x795892ce] Sep 16 04:59:36.931704 kernel: ACPI: Reserving UEFI table memory at [mem 0x795892d0-0x79589311] Sep 16 04:59:36.931709 kernel: ACPI: Reserving LPIT table memory at [mem 0x79589318-0x795893ab] Sep 16 04:59:36.931714 kernel: ACPI: Reserving SSDT table memory at [mem 0x795893b0-0x7958bb8d] Sep 16 04:59:36.931719 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958bb90-0x7958d071] Sep 16 04:59:36.931725 kernel: ACPI: Reserving DBGP table memory at [mem 0x7958d078-0x7958d0ab] Sep 16 04:59:36.931730 kernel: ACPI: Reserving DBG2 table memory at [mem 0x7958d0b0-0x7958d103] Sep 16 04:59:36.931736 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958d108-0x7958ec6e] Sep 16 04:59:36.931742 kernel: ACPI: Reserving DMAR table memory at [mem 0x7958ec70-0x7958ed17] Sep 16 04:59:36.931747 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958ed18-0x7958ee5b] Sep 16 04:59:36.931752 kernel: ACPI: Reserving TPM2 table memory at [mem 0x7958ee60-0x7958ee93] Sep 16 04:59:36.931757 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958ee98-0x7958fc26] Sep 16 04:59:36.931763 kernel: ACPI: Reserving WSMT table memory at [mem 0x7958fc28-0x7958fc4f] Sep 16 04:59:36.931768 kernel: ACPI: Reserving EINJ table memory at [mem 0x7958fc50-0x7958fd7f] Sep 16 04:59:36.931773 kernel: ACPI: Reserving ERST table memory at [mem 0x7958fd80-0x7958ffaf] Sep 16 04:59:36.931778 kernel: ACPI: Reserving BERT table memory at [mem 0x7958ffb0-0x7958ffdf] Sep 16 04:59:36.931783 kernel: ACPI: Reserving HEST table memory at [mem 0x7958ffe0-0x7959025b] Sep 16 04:59:36.931790 kernel: ACPI: Reserving SSDT table memory at [mem 0x79590260-0x795903c1] Sep 16 04:59:36.931795 kernel: No NUMA configuration found Sep 16 04:59:36.931800 kernel: Faking a node at [mem 0x0000000000000000-0x000000087f7fffff] Sep 16 04:59:36.931806 kernel: NODE_DATA(0) allocated [mem 0x87f7f8dc0-0x87f7fffff] Sep 16 04:59:36.931811 kernel: Zone ranges: Sep 16 04:59:36.931816 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 16 04:59:36.931822 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Sep 16 
04:59:36.931827 kernel: Normal [mem 0x0000000100000000-0x000000087f7fffff] Sep 16 04:59:36.931832 kernel: Device empty Sep 16 04:59:36.931839 kernel: Movable zone start for each node Sep 16 04:59:36.931844 kernel: Early memory node ranges Sep 16 04:59:36.931849 kernel: node 0: [mem 0x0000000000001000-0x000000000008efff] Sep 16 04:59:36.931854 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff] Sep 16 04:59:36.931860 kernel: node 0: [mem 0x0000000040400000-0x000000005ff2efff] Sep 16 04:59:36.931869 kernel: node 0: [mem 0x000000005ff31000-0x000000005fffffff] Sep 16 04:59:36.931875 kernel: node 0: [mem 0x0000000068000000-0x0000000077fc4fff] Sep 16 04:59:36.931881 kernel: node 0: [mem 0x00000000790a8000-0x0000000079230fff] Sep 16 04:59:36.931887 kernel: node 0: [mem 0x000000007beff000-0x000000007befffff] Sep 16 04:59:36.931893 kernel: node 0: [mem 0x0000000100000000-0x000000087f7fffff] Sep 16 04:59:36.931899 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000087f7fffff] Sep 16 04:59:36.931904 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 16 04:59:36.931910 kernel: On node 0, zone DMA: 113 pages in unavailable ranges Sep 16 04:59:36.931916 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Sep 16 04:59:36.931921 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges Sep 16 04:59:36.931927 kernel: On node 0, zone DMA32: 4323 pages in unavailable ranges Sep 16 04:59:36.931932 kernel: On node 0, zone DMA32: 11470 pages in unavailable ranges Sep 16 04:59:36.931939 kernel: On node 0, zone Normal: 16640 pages in unavailable ranges Sep 16 04:59:36.931945 kernel: On node 0, zone Normal: 2048 pages in unavailable ranges Sep 16 04:59:36.931960 kernel: ACPI: PM-Timer IO Port: 0x1808 Sep 16 04:59:36.931966 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Sep 16 04:59:36.931972 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Sep 16 04:59:36.931977 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Sep 16 04:59:36.931983 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Sep 16 04:59:36.931988 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Sep 16 04:59:36.931994 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Sep 16 04:59:36.931999 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Sep 16 04:59:36.932007 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Sep 16 04:59:36.932012 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Sep 16 04:59:36.932018 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Sep 16 04:59:36.932023 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Sep 16 04:59:36.932029 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Sep 16 04:59:36.932034 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Sep 16 04:59:36.932040 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Sep 16 04:59:36.932045 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Sep 16 04:59:36.932051 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Sep 16 04:59:36.932058 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119 Sep 16 04:59:36.932063 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 16 04:59:36.932069 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 16 04:59:36.932075 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 16 04:59:36.932080 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 16 04:59:36.932086 kernel: TSC 
deadline timer available Sep 16 04:59:36.932091 kernel: CPU topo: Max. logical packages: 1 Sep 16 04:59:36.932097 kernel: CPU topo: Max. logical dies: 1 Sep 16 04:59:36.932103 kernel: CPU topo: Max. dies per package: 1 Sep 16 04:59:36.932109 kernel: CPU topo: Max. threads per core: 2 Sep 16 04:59:36.932115 kernel: CPU topo: Num. cores per package: 8 Sep 16 04:59:36.932120 kernel: CPU topo: Num. threads per package: 16 Sep 16 04:59:36.932126 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs Sep 16 04:59:36.932131 kernel: [mem 0x7f800000-0xdfffffff] available for PCI devices Sep 16 04:59:36.932137 kernel: Booting paravirtualized kernel on bare hardware Sep 16 04:59:36.932143 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 16 04:59:36.932149 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 16 04:59:36.932154 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 16 04:59:36.932161 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 16 04:59:36.932167 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 16 04:59:36.932173 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06 Sep 16 04:59:36.932179 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 16 04:59:36.932184 kernel: random: crng init done Sep 16 04:59:36.932190 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear) Sep 16 04:59:36.932196 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Sep 16 04:59:36.932201 kernel: Fallback order for Node 0: 0 Sep 16 04:59:36.932208 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8320219 Sep 16 04:59:36.932213 kernel: Policy zone: Normal Sep 16 04:59:36.932219 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 16 04:59:36.932224 kernel: software IO TLB: area num 16. Sep 16 04:59:36.932230 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 16 04:59:36.932236 kernel: ftrace: allocating 40125 entries in 157 pages Sep 16 04:59:36.932241 kernel: ftrace: allocated 157 pages with 5 groups Sep 16 04:59:36.932247 kernel: Dynamic Preempt: voluntary Sep 16 04:59:36.932253 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 16 04:59:36.932260 kernel: rcu: RCU event tracing is enabled. Sep 16 04:59:36.932265 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 16 04:59:36.932271 kernel: Trampoline variant of Tasks RCU enabled. Sep 16 04:59:36.932277 kernel: Rude variant of Tasks RCU enabled. Sep 16 04:59:36.932282 kernel: Tracing variant of Tasks RCU enabled. Sep 16 04:59:36.932288 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 16 04:59:36.932293 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 16 04:59:36.932299 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
Sep 16 04:59:36.932304 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 16 04:59:36.932310 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 16 04:59:36.932317 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16 Sep 16 04:59:36.932322 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 16 04:59:36.932328 kernel: Console: colour VGA+ 80x25 Sep 16 04:59:36.932334 kernel: printk: legacy console [tty0] enabled Sep 16 04:59:36.932339 kernel: printk: legacy console [ttyS1] enabled Sep 16 04:59:36.932345 kernel: ACPI: Core revision 20240827 Sep 16 04:59:36.932351 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 79635855245 ns Sep 16 04:59:36.932356 kernel: APIC: Switch to symmetric I/O mode setup Sep 16 04:59:36.932362 kernel: DMAR: Host address width 39 Sep 16 04:59:36.932369 kernel: DMAR: DRHD base: 0x000000fed90000 flags: 0x0 Sep 16 04:59:36.932374 kernel: DMAR: dmar0: reg_base_addr fed90000 ver 1:0 cap 1c0000c40660462 ecap 19e2ff0505e Sep 16 04:59:36.932380 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1 Sep 16 04:59:36.932385 kernel: DMAR: dmar1: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da Sep 16 04:59:36.932391 kernel: DMAR: RMRR base: 0x00000079f11000 end: 0x0000007a15afff Sep 16 04:59:36.932397 kernel: DMAR: RMRR base: 0x0000007d000000 end: 0x0000007f7fffff Sep 16 04:59:36.932402 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 1 Sep 16 04:59:36.932408 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000 Sep 16 04:59:36.932413 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping. Sep 16 04:59:36.932420 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode Sep 16 04:59:36.932426 kernel: x2apic enabled Sep 16 04:59:36.932431 kernel: APIC: Switched APIC routing to: cluster x2apic Sep 16 04:59:36.932437 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 16 04:59:36.932443 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns Sep 16 04:59:36.932448 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
6799.81 BogoMIPS (lpj=3399906) Sep 16 04:59:36.932454 kernel: CPU0: Thermal monitoring enabled (TM1) Sep 16 04:59:36.932460 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Sep 16 04:59:36.932465 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Sep 16 04:59:36.932472 kernel: process: using mwait in idle threads Sep 16 04:59:36.932477 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 16 04:59:36.932483 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Sep 16 04:59:36.932489 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Sep 16 04:59:36.932495 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Sep 16 04:59:36.932500 kernel: RETBleed: Mitigation: Enhanced IBRS Sep 16 04:59:36.932506 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 16 04:59:36.932511 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 16 04:59:36.932518 kernel: TAA: Mitigation: Clear CPU buffers Sep 16 04:59:36.932524 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Sep 16 04:59:36.932529 kernel: SRBDS: Mitigation: Microcode Sep 16 04:59:36.932535 kernel: GDS: Vulnerable: No microcode Sep 16 04:59:36.932540 kernel: active return thunk: its_return_thunk Sep 16 04:59:36.932546 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 16 04:59:36.932552 kernel: VMSCAPE: Mitigation: IBPB before exit to userspace Sep 16 04:59:36.932557 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 16 04:59:36.932563 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 16 04:59:36.932569 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 16 04:59:36.932575 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Sep 16 04:59:36.932581 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Sep 16 04:59:36.932586 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 16 04:59:36.932592 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Sep 16 04:59:36.932597 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Sep 16 04:59:36.932603 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format. Sep 16 04:59:36.932609 kernel: Freeing SMP alternatives memory: 32K Sep 16 04:59:36.932614 kernel: pid_max: default: 32768 minimum: 301 Sep 16 04:59:36.932621 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 16 04:59:36.932627 kernel: landlock: Up and running. Sep 16 04:59:36.932632 kernel: SELinux: Initializing. Sep 16 04:59:36.932638 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 16 04:59:36.932644 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 16 04:59:36.932649 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Sep 16 04:59:36.932655 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver. Sep 16 04:59:36.932661 kernel: ... version: 4 Sep 16 04:59:36.932666 kernel: ... bit width: 48 Sep 16 04:59:36.932673 kernel: ... generic registers: 4 Sep 16 04:59:36.932678 kernel: ... value mask: 0000ffffffffffff Sep 16 04:59:36.932684 kernel: ... max period: 00007fffffffffff Sep 16 04:59:36.932690 kernel: ... fixed-purpose events: 3 Sep 16 04:59:36.932695 kernel: ... 
event mask: 000000070000000f Sep 16 04:59:36.932701 kernel: signal: max sigframe size: 2032 Sep 16 04:59:36.932706 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445 Sep 16 04:59:36.932712 kernel: rcu: Hierarchical SRCU implementation. Sep 16 04:59:36.932718 kernel: rcu: Max phase no-delay instances is 400. Sep 16 04:59:36.932723 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 16 04:59:36.932730 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter. Sep 16 04:59:36.932736 kernel: smp: Bringing up secondary CPUs ... Sep 16 04:59:36.932741 kernel: smpboot: x86: Booting SMP configuration: Sep 16 04:59:36.932747 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15 Sep 16 04:59:36.932753 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details. Sep 16 04:59:36.932759 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. Sep 16 04:59:36.932764 kernel: smp: Brought up 1 node, 16 CPUs Sep 16 04:59:36.932770 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS) Sep 16 04:59:36.932777 kernel: Memory: 32523092K/33280876K available (14336K kernel code, 2432K rwdata, 9992K rodata, 54096K init, 2868K bss, 730436K reserved, 0K cma-reserved) Sep 16 04:59:36.932783 kernel: devtmpfs: initialized Sep 16 04:59:36.932789 kernel: x86/mm: Memory block size: 128MB Sep 16 04:59:36.932794 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x5ff2f000-0x5ff2ffff] (4096 bytes) Sep 16 04:59:36.932800 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x79231000-0x79662fff] (4399104 bytes) Sep 16 04:59:36.932806 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 16 04:59:36.932811 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 16 04:59:36.932817 kernel: pinctrl core: initialized pinctrl subsystem Sep 16 04:59:36.932824 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 16 04:59:36.932829 kernel: audit: initializing netlink subsys (disabled) Sep 16 04:59:36.932835 kernel: audit: type=2000 audit(1757998768.170:1): state=initialized audit_enabled=0 res=1 Sep 16 04:59:36.932840 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 16 04:59:36.932846 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 16 04:59:36.932851 kernel: cpuidle: using governor menu Sep 16 04:59:36.932857 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 16 04:59:36.932863 kernel: dca service started, version 1.12.1 Sep 16 04:59:36.932868 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Sep 16 04:59:36.932875 kernel: PCI: Using configuration type 1 for base access Sep 16 04:59:36.932881 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Sep 16 04:59:36.932886 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 16 04:59:36.932892 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 16 04:59:36.932898 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 16 04:59:36.932903 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 16 04:59:36.932909 kernel: ACPI: Added _OSI(Module Device) Sep 16 04:59:36.932914 kernel: ACPI: Added _OSI(Processor Device) Sep 16 04:59:36.932920 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 16 04:59:36.932927 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded Sep 16 04:59:36.932932 kernel: ACPI: Dynamic OEM Table Load: Sep 16 04:59:36.932938 kernel: ACPI: SSDT 0xFFFF9C82023A3800 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527) Sep 16 04:59:36.932944 kernel: ACPI: Dynamic OEM Table Load: Sep 16 04:59:36.932952 kernel: ACPI: SSDT 0xFFFF9C8200246600 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527) Sep 16 04:59:36.932958 kernel: ACPI: Dynamic OEM Table Load: Sep 16 04:59:36.932963 kernel: ACPI: SSDT 0xFFFF9C82023A7800 0005FC (v02 PmRef ApIst 00003000 INTL 20160527) Sep 16 04:59:36.932969 kernel: ACPI: Dynamic OEM Table Load: Sep 16 04:59:36.932997 kernel: ACPI: SSDT 0xFFFF9C82001A2000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527) Sep 16 04:59:36.933002 kernel: ACPI: Interpreter enabled Sep 16 04:59:36.933025 kernel: ACPI: PM: (supports S0 S5) Sep 16 04:59:36.933031 kernel: ACPI: Using IOAPIC for interrupt routing Sep 16 04:59:36.933036 kernel: HEST: Enabling Firmware First mode for corrected errors. Sep 16 04:59:36.933041 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14. Sep 16 04:59:36.933047 kernel: HEST: Table parsing has been initialized. Sep 16 04:59:36.933052 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. 
Sep 16 04:59:36.933058 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 16 04:59:36.933063 kernel: PCI: Using E820 reservations for host bridge windows Sep 16 04:59:36.933068 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F Sep 16 04:59:36.933075 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource Sep 16 04:59:36.933081 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource Sep 16 04:59:36.933086 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource Sep 16 04:59:36.933092 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource Sep 16 04:59:36.933097 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource Sep 16 04:59:36.933102 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Sep 16 04:59:36.933108 kernel: ACPI: \_TZ_.FN00: New power resource Sep 16 04:59:36.933113 kernel: ACPI: \_TZ_.FN01: New power resource Sep 16 04:59:36.933118 kernel: ACPI: \_TZ_.FN02: New power resource Sep 16 04:59:36.933125 kernel: ACPI: \_TZ_.FN03: New power resource Sep 16 04:59:36.933130 kernel: ACPI: \_TZ_.FN04: New power resource Sep 16 04:59:36.933136 kernel: ACPI: \PIN_: New power resource Sep 16 04:59:36.933141 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) Sep 16 04:59:36.933229 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 16 04:59:36.933289 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER] Sep 16 04:59:36.933345 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR] Sep 16 04:59:36.933354 kernel: PCI host bridge to bus 0000:00 Sep 16 04:59:36.933411 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 16 04:59:36.933503 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 16 04:59:36.933553 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 16 04:59:36.933602 kernel: pci_bus 0000:00: root bus resource [mem 0x7f800000-0xdfffffff window] Sep 16 04:59:36.933651 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window] Sep 16 04:59:36.933700 kernel: pci_bus 0000:00: root bus resource [bus 00-fe] Sep 16 04:59:36.933767 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint Sep 16 04:59:36.933834 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port Sep 16 04:59:36.933892 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 16 04:59:36.933951 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Sep 16 04:59:36.934054 kernel: pci 0000:00:01.1: [8086:1905] type 01 class 0x060400 PCIe Root Port Sep 16 04:59:36.934111 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Sep 16 04:59:36.934170 kernel: pci 0000:00:01.1: bridge window [mem 0x96100000-0x962fffff] Sep 16 04:59:36.934227 kernel: pci 0000:00:01.1: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 16 04:59:36.934282 kernel: pci 0000:00:01.1: PME# supported from D0 D3hot D3cold Sep 16 04:59:36.934342 kernel: pci 0000:00:02.0: [8086:3e9a] type 00 class 0x038000 PCIe Root Complex Integrated Endpoint Sep 16 04:59:36.934406 kernel: pci 0000:00:02.0: BAR 0 [mem 0x94000000-0x94ffffff 64bit] Sep 16 04:59:36.934463 kernel: pci 0000:00:02.0: BAR 2 [mem 0x80000000-0x8fffffff 64bit pref] Sep 16 04:59:36.934519 kernel: pci 0000:00:02.0: BAR 4 [io 0x6000-0x603f] Sep 16 04:59:36.934582 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint Sep 16 04:59:36.934638 kernel: pci 0000:00:12.0: BAR 0 [mem 
0x9651e000-0x9651efff 64bit] Sep 16 04:59:36.934698 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint Sep 16 04:59:36.934754 kernel: pci 0000:00:14.0: BAR 0 [mem 0x96500000-0x9650ffff 64bit] Sep 16 04:59:36.934810 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Sep 16 04:59:36.934869 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint Sep 16 04:59:36.934928 kernel: pci 0000:00:14.2: BAR 0 [mem 0x96512000-0x96513fff 64bit] Sep 16 04:59:36.935041 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9651d000-0x9651dfff 64bit] Sep 16 04:59:36.935103 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint Sep 16 04:59:36.935159 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 16 04:59:36.935221 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint Sep 16 04:59:36.935278 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 16 04:59:36.935337 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint Sep 16 04:59:36.935393 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9651a000-0x9651afff 64bit] Sep 16 04:59:36.935447 kernel: pci 0000:00:16.0: PME# supported from D3hot Sep 16 04:59:36.935507 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint Sep 16 04:59:36.935562 kernel: pci 0000:00:16.1: BAR 0 [mem 0x96519000-0x96519fff 64bit] Sep 16 04:59:36.935620 kernel: pci 0000:00:16.1: PME# supported from D3hot Sep 16 04:59:36.935679 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint Sep 16 04:59:36.935735 kernel: pci 0000:00:16.4: BAR 0 [mem 0x96518000-0x96518fff 64bit] Sep 16 04:59:36.935789 kernel: pci 0000:00:16.4: PME# supported from D3hot Sep 16 04:59:36.935848 kernel: pci 0000:00:17.0: [8086:2826] type 00 class 0x010400 conventional PCI endpoint Sep 16 04:59:36.935904 kernel: pci 0000:00:17.0: BAR 0 [mem 0x96510000-0x96511fff] Sep 16 04:59:36.935965 kernel: pci 0000:00:17.0: BAR 1 [mem 0x96517000-0x965170ff] Sep 16 04:59:36.936061 kernel: pci 0000:00:17.0: BAR 2 [io 0x6090-0x6097] Sep 16 04:59:36.936117 kernel: pci 0000:00:17.0: BAR 3 [io 0x6080-0x6083] Sep 16 04:59:36.936172 kernel: pci 0000:00:17.0: BAR 4 [io 0x6060-0x607f] Sep 16 04:59:36.936227 kernel: pci 0000:00:17.0: BAR 5 [mem 0x96516000-0x965167ff] Sep 16 04:59:36.936281 kernel: pci 0000:00:17.0: PME# supported from D3hot Sep 16 04:59:36.936341 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port Sep 16 04:59:36.936400 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Sep 16 04:59:36.936456 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold Sep 16 04:59:36.936520 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port Sep 16 04:59:36.936577 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Sep 16 04:59:36.936633 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 16 04:59:36.936689 kernel: pci 0000:00:1b.4: bridge window [mem 0x96400000-0x964fffff] Sep 16 04:59:36.936745 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold Sep 16 04:59:36.936807 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port Sep 16 04:59:36.936864 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Sep 16 04:59:36.936919 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 16 04:59:36.937003 kernel: pci 0000:00:1b.5: bridge window [mem 0x96300000-0x963fffff] Sep 16 04:59:36.937090 kernel: pci 0000:00:1b.5: 
PME# supported from D0 D3hot D3cold Sep 16 04:59:36.937150 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port Sep 16 04:59:36.937206 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Sep 16 04:59:36.937265 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Sep 16 04:59:36.937324 kernel: pci 0000:00:1c.1: [8086:a339] type 01 class 0x060400 PCIe Root Port Sep 16 04:59:36.937379 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Sep 16 04:59:36.937435 kernel: pci 0000:00:1c.1: bridge window [io 0x3000-0x3fff] Sep 16 04:59:36.937489 kernel: pci 0000:00:1c.1: bridge window [mem 0x95000000-0x960fffff] Sep 16 04:59:36.937545 kernel: pci 0000:00:1c.1: PME# supported from D0 D3hot D3cold Sep 16 04:59:36.937605 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint Sep 16 04:59:36.937664 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 16 04:59:36.937723 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint Sep 16 04:59:36.937787 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint Sep 16 04:59:36.937844 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x96514000-0x965140ff 64bit] Sep 16 04:59:36.937899 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf] Sep 16 04:59:36.937960 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint Sep 16 04:59:36.938080 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff] Sep 16 04:59:36.938136 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 16 04:59:36.938198 kernel: pci 0000:02:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 16 04:59:36.938256 kernel: pci 0000:02:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref] Sep 16 04:59:36.938313 kernel: pci 0000:02:00.0: ROM [mem 0x96200000-0x962fffff pref] Sep 16 04:59:36.938370 kernel: pci 0000:02:00.0: PME# supported from D3cold Sep 16 04:59:36.938427 kernel: pci 0000:02:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 16 04:59:36.938487 kernel: pci 0000:02:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 16 04:59:36.938548 kernel: pci 0000:02:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 16 04:59:36.938606 kernel: pci 0000:02:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Sep 16 04:59:36.938663 kernel: pci 0000:02:00.1: ROM [mem 0x96100000-0x961fffff pref] Sep 16 04:59:36.938720 kernel: pci 0000:02:00.1: PME# supported from D3cold Sep 16 04:59:36.938777 kernel: pci 0000:02:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 16 04:59:36.938835 kernel: pci 0000:02:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 16 04:59:36.938894 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Sep 16 04:59:36.938952 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Sep 16 04:59:36.939073 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Sep 16 04:59:36.939131 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 16 04:59:36.939190 kernel: pci 0000:04:00.0: BAR 0 [mem 0x96400000-0x9647ffff] Sep 16 04:59:36.939309 kernel: pci 0000:04:00.0: BAR 2 [io 0x5000-0x501f] Sep 16 04:59:36.939366 kernel: pci 0000:04:00.0: BAR 3 [mem 0x96480000-0x96483fff] Sep 16 04:59:36.939426 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Sep 16 04:59:36.939483 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Sep 16 04:59:36.939545 kernel: pci 0000:05:00.0: working around ROM BAR overlap defect Sep 16 04:59:36.939633 
kernel: pci 0000:05:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 16 04:59:36.939766 kernel: pci 0000:05:00.0: BAR 0 [mem 0x96300000-0x9637ffff] Sep 16 04:59:36.939856 kernel: pci 0000:05:00.0: BAR 2 [io 0x4000-0x401f] Sep 16 04:59:36.939917 kernel: pci 0000:05:00.0: BAR 3 [mem 0x96380000-0x96383fff] Sep 16 04:59:36.939982 kernel: pci 0000:05:00.0: PME# supported from D0 D3hot D3cold Sep 16 04:59:36.940041 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Sep 16 04:59:36.940099 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Sep 16 04:59:36.940162 kernel: pci 0000:07:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 16 04:59:36.940222 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 16 04:59:36.940282 kernel: pci 0000:07:00.0: bridge window [io 0x3000-0x3fff] Sep 16 04:59:36.940341 kernel: pci 0000:07:00.0: bridge window [mem 0x95000000-0x960fffff] Sep 16 04:59:36.940400 kernel: pci 0000:07:00.0: enabling Extended Tags Sep 16 04:59:36.940462 kernel: pci 0000:07:00.0: supports D1 D2 Sep 16 04:59:36.940521 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 16 04:59:36.940580 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Sep 16 04:59:36.940647 kernel: pci_bus 0000:08: extended config space not accessible Sep 16 04:59:36.940714 kernel: pci 0000:08:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Sep 16 04:59:36.940778 kernel: pci 0000:08:00.0: BAR 0 [mem 0x95000000-0x95ffffff] Sep 16 04:59:36.940843 kernel: pci 0000:08:00.0: BAR 1 [mem 0x96000000-0x9601ffff] Sep 16 04:59:36.940904 kernel: pci 0000:08:00.0: BAR 2 [io 0x3000-0x307f] Sep 16 04:59:36.940969 kernel: pci 0000:08:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 16 04:59:36.941032 kernel: pci 0000:08:00.0: supports D1 D2 Sep 16 04:59:36.941094 kernel: pci 0000:08:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 16 04:59:36.941154 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 16 04:59:36.941162 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Sep 16 04:59:36.941168 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Sep 16 04:59:36.941176 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Sep 16 04:59:36.941182 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Sep 16 04:59:36.941189 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Sep 16 04:59:36.941194 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Sep 16 04:59:36.941200 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Sep 16 04:59:36.941206 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Sep 16 04:59:36.941212 kernel: iommu: Default domain type: Translated Sep 16 04:59:36.941218 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 16 04:59:36.941225 kernel: PCI: Using ACPI for IRQ routing Sep 16 04:59:36.941232 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 16 04:59:36.941238 kernel: e820: reserve RAM buffer [mem 0x0008f800-0x0008ffff] Sep 16 04:59:36.941244 kernel: e820: reserve RAM buffer [mem 0x5ff2f000-0x5fffffff] Sep 16 04:59:36.941249 kernel: e820: reserve RAM buffer [mem 0x77fc5000-0x77ffffff] Sep 16 04:59:36.941256 kernel: e820: reserve RAM buffer [mem 0x79231000-0x7bffffff] Sep 16 04:59:36.941262 kernel: e820: reserve RAM buffer [mem 0x7bf00000-0x7bffffff] Sep 16 04:59:36.941268 kernel: e820: reserve RAM buffer [mem 0x87f800000-0x87fffffff] Sep 16 04:59:36.941329 kernel: pci 0000:08:00.0: vgaarb: setting as boot VGA device Sep 16 04:59:36.941390 kernel: pci 
0000:08:00.0: vgaarb: bridge control possible Sep 16 04:59:36.941455 kernel: pci 0000:08:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 16 04:59:36.941464 kernel: vgaarb: loaded Sep 16 04:59:36.941470 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0 Sep 16 04:59:36.941476 kernel: hpet0: 8 comparators, 64-bit 24.000000 MHz counter Sep 16 04:59:36.941482 kernel: clocksource: Switched to clocksource tsc-early Sep 16 04:59:36.941490 kernel: VFS: Disk quotas dquot_6.6.0 Sep 16 04:59:36.941496 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 16 04:59:36.941502 kernel: pnp: PnP ACPI init Sep 16 04:59:36.941561 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Sep 16 04:59:36.941619 kernel: pnp 00:02: [dma 0 disabled] Sep 16 04:59:36.941677 kernel: pnp 00:03: [dma 0 disabled] Sep 16 04:59:36.941737 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Sep 16 04:59:36.941793 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Sep 16 04:59:36.941849 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Sep 16 04:59:36.941902 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Sep 16 04:59:36.941959 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Sep 16 04:59:36.942012 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Sep 16 04:59:36.942066 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Sep 16 04:59:36.942119 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Sep 16 04:59:36.942175 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Sep 16 04:59:36.942227 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Sep 16 04:59:36.942284 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Sep 16 04:59:36.942337 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Sep 16 04:59:36.942389 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Sep 16 04:59:36.942441 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Sep 16 04:59:36.942496 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Sep 16 04:59:36.942548 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Sep 16 04:59:36.942601 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Sep 16 04:59:36.942657 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Sep 16 04:59:36.942666 kernel: pnp: PnP ACPI: found 9 devices Sep 16 04:59:36.942672 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 16 04:59:36.942679 kernel: NET: Registered PF_INET protocol family Sep 16 04:59:36.942685 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 16 04:59:36.942692 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Sep 16 04:59:36.942699 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 16 04:59:36.942705 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 16 04:59:36.942711 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Sep 16 04:59:36.942717 kernel: TCP: Hash tables configured (established 262144 bind 65536) Sep 16 04:59:36.942723 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 16 04:59:36.942729 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 16 
04:59:36.942735 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 16 04:59:36.942741 kernel: NET: Registered PF_XDP protocol family Sep 16 04:59:36.942801 kernel: pci 0000:00:15.0: BAR 0 [mem 0x7f800000-0x7f800fff 64bit]: assigned Sep 16 04:59:36.942859 kernel: pci 0000:00:15.1: BAR 0 [mem 0x7f801000-0x7f801fff 64bit]: assigned Sep 16 04:59:36.942918 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x7f802000-0x7f802fff 64bit]: assigned Sep 16 04:59:36.942979 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 16 04:59:36.943039 kernel: pci 0000:02:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 16 04:59:36.943099 kernel: pci 0000:02:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 16 04:59:36.943159 kernel: pci 0000:02:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 16 04:59:36.943219 kernel: pci 0000:02:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 16 04:59:36.943279 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Sep 16 04:59:36.943336 kernel: pci 0000:00:01.1: bridge window [mem 0x96100000-0x962fffff] Sep 16 04:59:36.943394 kernel: pci 0000:00:01.1: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 16 04:59:36.943452 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Sep 16 04:59:36.943509 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Sep 16 04:59:36.943567 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 16 04:59:36.943624 kernel: pci 0000:00:1b.4: bridge window [mem 0x96400000-0x964fffff] Sep 16 04:59:36.943682 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Sep 16 04:59:36.943739 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 16 04:59:36.943797 kernel: pci 0000:00:1b.5: bridge window [mem 0x96300000-0x963fffff] Sep 16 04:59:36.943857 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Sep 16 04:59:36.943915 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 16 04:59:36.944008 kernel: pci 0000:07:00.0: bridge window [io 0x3000-0x3fff] Sep 16 04:59:36.944068 kernel: pci 0000:07:00.0: bridge window [mem 0x95000000-0x960fffff] Sep 16 04:59:36.944126 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Sep 16 04:59:36.944184 kernel: pci 0000:00:1c.1: bridge window [io 0x3000-0x3fff] Sep 16 04:59:36.944242 kernel: pci 0000:00:1c.1: bridge window [mem 0x95000000-0x960fffff] Sep 16 04:59:36.944295 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Sep 16 04:59:36.944347 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 16 04:59:36.944401 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 16 04:59:36.944452 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 16 04:59:36.944503 kernel: pci_bus 0000:00: resource 7 [mem 0x7f800000-0xdfffffff window] Sep 16 04:59:36.944555 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Sep 16 04:59:36.944613 kernel: pci_bus 0000:02: resource 1 [mem 0x96100000-0x962fffff] Sep 16 04:59:36.944668 kernel: pci_bus 0000:02: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Sep 16 04:59:36.944726 kernel: pci_bus 0000:04: resource 0 [io 0x5000-0x5fff] Sep 16 04:59:36.944782 kernel: pci_bus 0000:04: resource 1 [mem 0x96400000-0x964fffff] Sep 16 04:59:36.944842 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Sep 16 04:59:36.944896 kernel: pci_bus 0000:05: resource 1 [mem 0x96300000-0x963fffff] Sep 16 04:59:36.944957 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Sep 16 04:59:36.945033 kernel: pci_bus 
0000:07: resource 1 [mem 0x95000000-0x960fffff] Sep 16 04:59:36.945088 kernel: pci_bus 0000:08: resource 0 [io 0x3000-0x3fff] Sep 16 04:59:36.945142 kernel: pci_bus 0000:08: resource 1 [mem 0x95000000-0x960fffff] Sep 16 04:59:36.945153 kernel: PCI: CLS 64 bytes, default 64 Sep 16 04:59:36.945159 kernel: DMAR: No ATSR found Sep 16 04:59:36.945165 kernel: DMAR: No SATC found Sep 16 04:59:36.945170 kernel: DMAR: IOMMU feature fl1gp_support inconsistent Sep 16 04:59:36.945176 kernel: DMAR: IOMMU feature pgsel_inv inconsistent Sep 16 04:59:36.945182 kernel: DMAR: IOMMU feature nwfs inconsistent Sep 16 04:59:36.945188 kernel: DMAR: IOMMU feature pasid inconsistent Sep 16 04:59:36.945193 kernel: DMAR: IOMMU feature eafs inconsistent Sep 16 04:59:36.945199 kernel: DMAR: IOMMU feature prs inconsistent Sep 16 04:59:36.945206 kernel: DMAR: IOMMU feature nest inconsistent Sep 16 04:59:36.945211 kernel: DMAR: IOMMU feature mts inconsistent Sep 16 04:59:36.945217 kernel: DMAR: IOMMU feature sc_support inconsistent Sep 16 04:59:36.945223 kernel: DMAR: IOMMU feature dev_iotlb_support inconsistent Sep 16 04:59:36.945228 kernel: DMAR: dmar0: Using Queued invalidation Sep 16 04:59:36.945234 kernel: DMAR: dmar1: Using Queued invalidation Sep 16 04:59:36.945290 kernel: pci 0000:00:02.0: Adding to iommu group 0 Sep 16 04:59:36.945347 kernel: pci 0000:00:00.0: Adding to iommu group 1 Sep 16 04:59:36.945404 kernel: pci 0000:00:01.0: Adding to iommu group 2 Sep 16 04:59:36.945463 kernel: pci 0000:00:01.1: Adding to iommu group 2 Sep 16 04:59:36.945520 kernel: pci 0000:00:12.0: Adding to iommu group 3 Sep 16 04:59:36.945576 kernel: pci 0000:00:14.0: Adding to iommu group 4 Sep 16 04:59:36.945631 kernel: pci 0000:00:14.2: Adding to iommu group 4 Sep 16 04:59:36.945686 kernel: pci 0000:00:15.0: Adding to iommu group 5 Sep 16 04:59:36.945741 kernel: pci 0000:00:15.1: Adding to iommu group 5 Sep 16 04:59:36.945796 kernel: pci 0000:00:16.0: Adding to iommu group 6 Sep 16 04:59:36.945854 kernel: pci 0000:00:16.1: Adding to iommu group 6 Sep 16 04:59:36.945909 kernel: pci 0000:00:16.4: Adding to iommu group 6 Sep 16 04:59:36.945994 kernel: pci 0000:00:17.0: Adding to iommu group 7 Sep 16 04:59:36.946066 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Sep 16 04:59:36.946122 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Sep 16 04:59:36.946179 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Sep 16 04:59:36.946234 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Sep 16 04:59:36.946291 kernel: pci 0000:00:1c.1: Adding to iommu group 12 Sep 16 04:59:36.946349 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Sep 16 04:59:36.946405 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Sep 16 04:59:36.946460 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Sep 16 04:59:36.946516 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Sep 16 04:59:36.946573 kernel: pci 0000:02:00.0: Adding to iommu group 2 Sep 16 04:59:36.946630 kernel: pci 0000:02:00.1: Adding to iommu group 2 Sep 16 04:59:36.946688 kernel: pci 0000:04:00.0: Adding to iommu group 15 Sep 16 04:59:36.946746 kernel: pci 0000:05:00.0: Adding to iommu group 16 Sep 16 04:59:36.946805 kernel: pci 0000:07:00.0: Adding to iommu group 17 Sep 16 04:59:36.946865 kernel: pci 0000:08:00.0: Adding to iommu group 17 Sep 16 04:59:36.946873 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Sep 16 04:59:36.946879 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 16 04:59:36.946885 kernel: software IO TLB: mapped [mem 
0x0000000073fc5000-0x0000000077fc5000] (64MB) Sep 16 04:59:36.946891 kernel: RAPL PMU: API unit is 2^-32 Joules, 4 fixed counters, 655360 ms ovfl timer Sep 16 04:59:36.946897 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Sep 16 04:59:36.946902 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Sep 16 04:59:36.946908 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Sep 16 04:59:36.946916 kernel: RAPL PMU: hw unit of domain pp1-gpu 2^-14 Joules Sep 16 04:59:36.947006 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Sep 16 04:59:36.947031 kernel: Initialise system trusted keyrings Sep 16 04:59:36.947037 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Sep 16 04:59:36.947058 kernel: Key type asymmetric registered Sep 16 04:59:36.947063 kernel: Asymmetric key parser 'x509' registered Sep 16 04:59:36.947069 kernel: tsc: Refined TSC clocksource calibration: 3407.998 MHz Sep 16 04:59:36.947075 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd208cfc, max_idle_ns: 440795283699 ns Sep 16 04:59:36.947082 kernel: clocksource: Switched to clocksource tsc Sep 16 04:59:36.947088 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 16 04:59:36.947094 kernel: io scheduler mq-deadline registered Sep 16 04:59:36.947100 kernel: io scheduler kyber registered Sep 16 04:59:36.947105 kernel: io scheduler bfq registered Sep 16 04:59:36.947161 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 122 Sep 16 04:59:36.947216 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 123 Sep 16 04:59:36.947273 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 124 Sep 16 04:59:36.947329 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 125 Sep 16 04:59:36.947387 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 126 Sep 16 04:59:36.947442 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 127 Sep 16 04:59:36.947498 kernel: pcieport 0000:00:1c.1: PME: Signaling with IRQ 128 Sep 16 04:59:36.947560 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Sep 16 04:59:36.947569 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Sep 16 04:59:36.947575 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Sep 16 04:59:36.947581 kernel: pstore: Using crash dump compression: deflate Sep 16 04:59:36.947587 kernel: pstore: Registered erst as persistent store backend Sep 16 04:59:36.947594 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 16 04:59:36.947600 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 16 04:59:36.947606 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 16 04:59:36.947612 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Sep 16 04:59:36.947668 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Sep 16 04:59:36.947676 kernel: i8042: PNP: No PS/2 controller found. 
Sep 16 04:59:36.947727 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Sep 16 04:59:36.947780 kernel: rtc_cmos rtc_cmos: registered as rtc0 Sep 16 04:59:36.947834 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-09-16T04:59:35 UTC (1757998775) Sep 16 04:59:36.947885 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Sep 16 04:59:36.947893 kernel: intel_pstate: Intel P-state driver initializing Sep 16 04:59:36.947899 kernel: intel_pstate: Disabling energy efficiency optimization Sep 16 04:59:36.947905 kernel: intel_pstate: HWP enabled Sep 16 04:59:36.947911 kernel: NET: Registered PF_INET6 protocol family Sep 16 04:59:36.947917 kernel: Segment Routing with IPv6 Sep 16 04:59:36.947922 kernel: In-situ OAM (IOAM) with IPv6 Sep 16 04:59:36.947930 kernel: NET: Registered PF_PACKET protocol family Sep 16 04:59:36.947935 kernel: Key type dns_resolver registered Sep 16 04:59:36.947941 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Sep 16 04:59:36.947949 kernel: microcode: Current revision: 0x000000de Sep 16 04:59:36.947955 kernel: IPI shorthand broadcast: enabled Sep 16 04:59:36.947961 kernel: sched_clock: Marking stable (3771001194, 1523550781)->(6841586993, -1547035018) Sep 16 04:59:36.947993 kernel: registered taskstats version 1 Sep 16 04:59:36.947999 kernel: Loading compiled-in X.509 certificates Sep 16 04:59:36.948005 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: d1d5b0d56b9b23dabf19e645632ff93bf659b3bf' Sep 16 04:59:36.948012 kernel: Demotion targets for Node 0: null Sep 16 04:59:36.948034 kernel: Key type .fscrypt registered Sep 16 04:59:36.948039 kernel: Key type fscrypt-provisioning registered Sep 16 04:59:36.948045 kernel: ima: Allocated hash algorithm: sha1 Sep 16 04:59:36.948051 kernel: ima: No architecture policies found Sep 16 04:59:36.948056 kernel: clk: Disabling unused clocks Sep 16 04:59:36.948063 kernel: Warning: unable to open an initial console. Sep 16 04:59:36.948068 kernel: Freeing unused kernel image (initmem) memory: 54096K Sep 16 04:59:36.948074 kernel: Write protecting the kernel read-only data: 24576k Sep 16 04:59:36.948081 kernel: Freeing unused kernel image (rodata/data gap) memory: 248K Sep 16 04:59:36.948087 kernel: Run /init as init process Sep 16 04:59:36.948093 kernel: with arguments: Sep 16 04:59:36.948098 kernel: /init Sep 16 04:59:36.948104 kernel: with environment: Sep 16 04:59:36.948109 kernel: HOME=/ Sep 16 04:59:36.948115 kernel: TERM=linux Sep 16 04:59:36.948120 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 16 04:59:36.948127 systemd[1]: Successfully made /usr/ read-only. Sep 16 04:59:36.948136 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 04:59:36.948143 systemd[1]: Detected architecture x86-64. Sep 16 04:59:36.948149 systemd[1]: Running in initrd. Sep 16 04:59:36.948155 systemd[1]: No hostname configured, using default hostname. Sep 16 04:59:36.948161 systemd[1]: Hostname set to . Sep 16 04:59:36.948167 systemd[1]: Initializing machine ID from random generator. Sep 16 04:59:36.948173 systemd[1]: Queued start job for default target initrd.target. Sep 16 04:59:36.948180 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 16 04:59:36.948186 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:59:36.948193 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 16 04:59:36.948199 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 04:59:36.948205 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 16 04:59:36.948211 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 16 04:59:36.948218 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 16 04:59:36.948225 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 16 04:59:36.948231 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:59:36.948237 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:59:36.948243 systemd[1]: Reached target paths.target - Path Units. Sep 16 04:59:36.948249 systemd[1]: Reached target slices.target - Slice Units. Sep 16 04:59:36.948256 systemd[1]: Reached target swap.target - Swaps. Sep 16 04:59:36.948262 systemd[1]: Reached target timers.target - Timer Units. Sep 16 04:59:36.948268 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 04:59:36.948275 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 04:59:36.948281 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 16 04:59:36.948288 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 16 04:59:36.948294 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:59:36.948300 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 04:59:36.948306 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:59:36.948312 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 04:59:36.948318 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 16 04:59:36.948324 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 04:59:36.948331 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 16 04:59:36.948337 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 16 04:59:36.948343 systemd[1]: Starting systemd-fsck-usr.service... Sep 16 04:59:36.948350 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 04:59:36.948356 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 04:59:36.948362 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:59:36.948380 systemd-journald[298]: Collecting audit messages is disabled. Sep 16 04:59:36.948396 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 16 04:59:36.948402 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:59:36.948410 systemd-journald[298]: Journal started Sep 16 04:59:36.948424 systemd-journald[298]: Runtime Journal (/run/log/journal/fca8749123df44f4a063cdf2c63145f1) is 8M, max 636.8M, 628.8M free. 
Sep 16 04:59:36.917439 systemd-modules-load[299]: Inserted module 'overlay' Sep 16 04:59:36.987723 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 16 04:59:36.987737 kernel: Bridge firewalling registered Sep 16 04:59:36.952885 systemd-modules-load[299]: Inserted module 'br_netfilter' Sep 16 04:59:37.018123 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 04:59:37.027326 systemd[1]: Finished systemd-fsck-usr.service. Sep 16 04:59:37.035205 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 04:59:37.042303 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:59:37.062045 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 16 04:59:37.077635 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 04:59:37.098835 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 16 04:59:37.120583 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 04:59:37.135592 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:59:37.135705 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 04:59:37.136471 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 04:59:37.138551 systemd-tmpfiles[317]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 16 04:59:37.141028 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:59:37.141993 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 04:59:37.142117 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:59:37.149217 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 04:59:37.160664 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 16 04:59:37.165439 systemd-resolved[330]: Positive Trust Anchors: Sep 16 04:59:37.165445 systemd-resolved[330]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 04:59:37.165469 systemd-resolved[330]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 04:59:37.167195 systemd-resolved[330]: Defaulting to hostname 'linux'. Sep 16 04:59:37.171307 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 04:59:37.196173 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
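The bridge warning above tells scripts to load br_netfilter themselves if bridged traffic should still pass through arp/ip/ip6tables; in this boot, systemd-modules-load inserted the module for us. A minimal sketch of what such a script would do, using the standard modprobe tool and the kernel's bridge-nf-call-* sysctls (illustrative only; requires root):

    # Load br_netfilter and turn on iptables/ip6tables/arptables filtering
    # for bridged frames.
    import subprocess
    from pathlib import Path

    subprocess.run(["modprobe", "br_netfilter"], check=True)

    for name in ("bridge-nf-call-iptables",
                 "bridge-nf-call-ip6tables",
                 "bridge-nf-call-arptables"):
        Path("/proc/sys/net/bridge", name).write_text("1\n")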
Sep 16 04:59:37.297084 dracut-cmdline[341]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06 Sep 16 04:59:37.427989 kernel: SCSI subsystem initialized Sep 16 04:59:37.453006 kernel: Loading iSCSI transport class v2.0-870. Sep 16 04:59:37.466009 kernel: iscsi: registered transport (tcp) Sep 16 04:59:37.496345 kernel: iscsi: registered transport (qla4xxx) Sep 16 04:59:37.496363 kernel: QLogic iSCSI HBA Driver Sep 16 04:59:37.507254 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 04:59:37.536115 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:59:37.547179 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 04:59:37.653683 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 16 04:59:37.666051 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 16 04:59:37.791985 kernel: raid6: avx2x4 gen() 21241 MB/s Sep 16 04:59:37.812979 kernel: raid6: avx2x2 gen() 43410 MB/s Sep 16 04:59:37.839023 kernel: raid6: avx2x1 gen() 46253 MB/s Sep 16 04:59:37.839040 kernel: raid6: using algorithm avx2x1 gen() 46253 MB/s Sep 16 04:59:37.866111 kernel: raid6: .... xor() 24449 MB/s, rmw enabled Sep 16 04:59:37.866128 kernel: raid6: using avx2x2 recovery algorithm Sep 16 04:59:37.886004 kernel: xor: automatically using best checksumming function avx Sep 16 04:59:38.064984 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 16 04:59:38.068376 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 16 04:59:38.079105 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:59:38.124989 systemd-udevd[554]: Using default interface naming scheme 'v255'. Sep 16 04:59:38.128728 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:59:38.154696 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 16 04:59:38.201180 dracut-pre-trigger[566]: rd.md=0: removing MD RAID activation Sep 16 04:59:38.266061 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 04:59:38.280866 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 04:59:38.425013 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:59:38.447062 kernel: cryptd: max_cpu_qlen set to 1000 Sep 16 04:59:38.425739 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 16 04:59:38.509063 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 16 04:59:38.509086 kernel: pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 16 04:59:38.509100 kernel: PTP clock support registered Sep 16 04:59:38.509113 kernel: ACPI: bus type USB registered Sep 16 04:59:38.509125 kernel: usbcore: registered new interface driver usbfs Sep 16 04:59:38.509142 kernel: usbcore: registered new interface driver hub Sep 16 04:59:38.509154 kernel: usbcore: registered new device driver usb Sep 16 04:59:38.509167 kernel: AES CTR mode by8 optimization enabled Sep 16 04:59:38.509181 kernel: libata version 3.00 loaded. Sep 16 04:59:38.454812 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:59:38.626468 kernel: ahci 0000:00:17.0: version 3.0 Sep 16 04:59:38.626574 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 16 04:59:38.626650 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Sep 16 04:59:38.626736 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, RAID mode Sep 16 04:59:38.626894 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Sep 16 04:59:38.626972 kernel: ahci 0000:00:17.0: 8/8 ports implemented (port mask 0xff) Sep 16 04:59:38.627043 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 16 04:59:38.627112 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Sep 16 04:59:38.627184 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Sep 16 04:59:38.627254 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Sep 16 04:59:38.627324 kernel: scsi host0: ahci Sep 16 04:59:38.627396 kernel: hub 1-0:1.0: USB hub found Sep 16 04:59:38.627481 kernel: scsi host1: ahci Sep 16 04:59:38.627550 kernel: hub 1-0:1.0: 16 ports detected Sep 16 04:59:38.627627 kernel: scsi host2: ahci Sep 16 04:59:38.627695 kernel: hub 2-0:1.0: USB hub found Sep 16 04:59:38.627774 kernel: scsi host3: ahci Sep 16 04:59:38.627842 kernel: hub 2-0:1.0: 10 ports detected Sep 16 04:59:38.627917 kernel: scsi host4: ahci Sep 16 04:59:38.627992 kernel: scsi host5: ahci Sep 16 04:59:38.628064 kernel: scsi host6: ahci Sep 16 04:59:38.628129 kernel: scsi host7: ahci Sep 16 04:59:38.454895 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:59:38.706203 kernel: ata1: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516100 irq 129 lpm-pol 0 Sep 16 04:59:38.706217 kernel: ata2: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516180 irq 129 lpm-pol 0 Sep 16 04:59:38.706225 kernel: ata3: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516200 irq 129 lpm-pol 0 Sep 16 04:59:38.706235 kernel: ata4: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516280 irq 129 lpm-pol 0 Sep 16 04:59:38.706247 kernel: ata5: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516300 irq 129 lpm-pol 0 Sep 16 04:59:38.706259 kernel: ata6: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516380 irq 129 lpm-pol 0 Sep 16 04:59:38.706271 kernel: ata7: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516400 irq 129 lpm-pol 0 Sep 16 04:59:38.706283 kernel: ata8: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516480 irq 129 lpm-pol 0 Sep 16 04:59:38.509351 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:59:38.743051 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Sep 16 04:59:38.743062 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Sep 16 04:59:38.680276 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
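The ahci line above reports "8/8 ports implemented (port mask 0xff)": each set bit in the mask is one implemented port, which is why eight SATA links (ata1..ata8) are probed. A tiny sketch of that correspondence:

    # Count the implemented AHCI ports from the port mask in the log.
    port_mask = 0xFF
    ports = [i for i in range(32) if port_mask & (1 << i)]
    assert len(ports) == 8 and ports == list(range(8))
    print(f"{len(ports)} ports implemented:", [f"ata{i + 1}" for i in ports])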
Sep 16 04:59:38.706316 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:59:38.795076 kernel: igb 0000:04:00.0: added PHC on eth0 Sep 16 04:59:38.795179 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 16 04:59:38.795259 kernel: igb 0000:04:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:c8:96 Sep 16 04:59:38.795336 kernel: igb 0000:04:00.0: eth0: PBA No: 010000-000 Sep 16 04:59:38.795409 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 16 04:59:38.833173 kernel: igb 0000:05:00.0: added PHC on eth1 Sep 16 04:59:38.833272 kernel: igb 0000:05:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 16 04:59:38.840596 kernel: igb 0000:05:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:c8:97 Sep 16 04:59:38.840763 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Sep 16 04:59:38.847301 kernel: igb 0000:05:00.0: eth1: PBA No: 010000-000 Sep 16 04:59:38.860982 kernel: igb 0000:05:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 16 04:59:38.888024 kernel: mlx5_core 0000:02:00.0: PTM is not supported by PCIe Sep 16 04:59:38.888141 kernel: mlx5_core 0000:02:00.0: firmware version: 14.28.2006 Sep 16 04:59:38.888218 kernel: mlx5_core 0000:02:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 16 04:59:38.927217 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:59:38.979392 kernel: hub 1-14:1.0: USB hub found Sep 16 04:59:38.979747 kernel: hub 1-14:1.0: 4 ports detected Sep 16 04:59:38.993998 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 16 04:59:38.994016 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 16 04:59:38.998994 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 16 04:59:39.005020 kernel: ata8: SATA link down (SStatus 0 SControl 300) Sep 16 04:59:39.011017 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 16 04:59:39.015997 kernel: ata7: SATA link down (SStatus 0 SControl 300) Sep 16 04:59:39.021997 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 16 04:59:39.027996 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 16 04:59:39.033992 kernel: ata1.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Sep 16 04:59:39.050566 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Sep 16 04:59:39.050953 kernel: ata2.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Sep 16 04:59:39.067707 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Sep 16 04:59:39.079015 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 16 04:59:39.079031 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 16 04:59:39.095992 kernel: ata1.00: Features: NCQ-prio Sep 16 04:59:39.096008 kernel: ata2.00: Features: NCQ-prio Sep 16 04:59:39.117006 kernel: ata1.00: configured for UDMA/133 Sep 16 04:59:39.117023 kernel: ata2.00: configured for UDMA/133 Sep 16 04:59:39.117031 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Sep 16 04:59:39.137958 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Sep 16 04:59:39.145009 kernel: igb 0000:05:00.0 eno2: renamed from eth1 Sep 16 04:59:39.145116 kernel: mlx5_core 0000:02:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Sep 16 04:59:39.154953 kernel: igb 0000:04:00.0 eno1: renamed from eth0 
Sep 16 04:59:39.155085 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 04:59:39.164213 kernel: mlx5_core 0000:02:00.0: Port module event: module 0, Cable plugged Sep 16 04:59:39.164308 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 16 04:59:39.164402 kernel: ata2.00: Enabling discard_zeroes_data Sep 16 04:59:39.164411 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 16 04:59:39.164482 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks Sep 16 04:59:39.164549 kernel: sd 1:0:0:0: [sdb] Write Protect is off Sep 16 04:59:39.164618 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Sep 16 04:59:39.164685 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 16 04:59:39.164750 kernel: sd 1:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Sep 16 04:59:39.164815 kernel: ata2.00: Enabling discard_zeroes_data Sep 16 04:59:39.229781 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Sep 16 04:59:39.234871 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 16 04:59:39.244203 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 Sep 16 04:59:39.244296 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 16 04:59:39.244373 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk Sep 16 04:59:39.244446 kernel: sd 0:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Sep 16 04:59:39.261197 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 04:59:39.282926 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 16 04:59:39.282942 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Sep 16 04:59:39.282966 kernel: GPT:9289727 != 937703087 Sep 16 04:59:39.293365 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 16 04:59:39.297224 kernel: GPT:9289727 != 937703087 Sep 16 04:59:39.302632 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 16 04:59:39.307889 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:59:39.313046 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 16 04:59:39.351331 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5300_MTFDDAK480TDT ROOT. Sep 16 04:59:39.373706 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5300_MTFDDAK480TDT EFI-SYSTEM. Sep 16 04:59:39.450341 kernel: mlx5_core 0000:02:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 16 04:59:39.450454 kernel: mlx5_core 0000:02:00.1: PTM is not supported by PCIe Sep 16 04:59:39.450535 kernel: mlx5_core 0000:02:00.1: firmware version: 14.28.2006 Sep 16 04:59:39.450610 kernel: mlx5_core 0000:02:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 16 04:59:39.450684 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 16 04:59:39.450767 kernel: usbcore: registered new interface driver usbhid Sep 16 04:59:39.450775 kernel: usbhid: USB HID core driver Sep 16 04:59:39.450782 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Sep 16 04:59:39.418917 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5300_MTFDDAK480TDT USR-A. Sep 16 04:59:39.461045 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5300_MTFDDAK480TDT USR-A. Sep 16 04:59:39.468038 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. 
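The GPT warnings above arise because the primary header on sda still describes a much smaller layout than the physical disk: a valid GPT keeps its alternate header on the last LBA, which for this 937703088-sector disk is 937703087, not 9289727. Plain arithmetic with the numbers from the log (reading 9289727 as the size of the originally written image is an inference):

    # Relate the two numbers in "GPT:9289727 != 937703087".
    SECTOR = 512
    total_sectors  = 937_703_088         # sda size reported by the sd driver
    alt_header_lba = 9_289_727           # where the primary header says the alt header is
    expected_lba   = total_sectors - 1   # where a valid GPT puts its alternate header

    assert expected_lba == 937_703_087

    image_bytes = (alt_header_lba + 1) * SECTOR
    disk_bytes  = total_sectors * SECTOR
    print(f"header describes ~{image_bytes / 1e9:.1f} GB, disk is ~{disk_bytes / 1e9:.0f} GB")

A few lines later, disk-uuid.service rewrites the headers ("Primary Header is updated. ... Secondary Header is updated."), after which the alternate header sits at the real end of the disk.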
Sep 16 04:59:39.551076 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Sep 16 04:59:39.551182 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Sep 16 04:59:39.551191 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Sep 16 04:59:39.541524 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 16 04:59:39.577300 disk-uuid[778]: Primary Header is updated. Sep 16 04:59:39.577300 disk-uuid[778]: Secondary Entries is updated. Sep 16 04:59:39.577300 disk-uuid[778]: Secondary Header is updated. Sep 16 04:59:39.604436 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 04:59:39.604447 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:59:39.611609 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 04:59:39.631991 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:59:39.699997 kernel: mlx5_core 0000:02:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Sep 16 04:59:39.711784 kernel: mlx5_core 0000:02:00.1: Port module event: module 1, Cable plugged Sep 16 04:59:39.951047 kernel: mlx5_core 0000:02:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 16 04:59:39.963030 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: renamed from eth1 Sep 16 04:59:39.963533 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: renamed from eth0 Sep 16 04:59:40.011717 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 16 04:59:40.021937 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:59:40.040174 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:59:40.061137 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 04:59:40.081434 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 16 04:59:40.141852 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 16 04:59:40.611263 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 04:59:40.628864 disk-uuid[779]: The operation has completed successfully. Sep 16 04:59:40.636096 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:59:40.665196 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 16 04:59:40.665269 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 16 04:59:40.700211 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 16 04:59:40.729009 sh[816]: Success Sep 16 04:59:40.760085 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 16 04:59:40.760106 kernel: device-mapper: uevent: version 1.0.3 Sep 16 04:59:40.769326 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 16 04:59:40.781988 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 16 04:59:40.826459 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 16 04:59:40.827621 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 16 04:59:40.857068 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
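verity-setup above assembled /dev/mapper/usr from the verity.usr*= parameters on the kernel command line (echoed in full by dracut-cmdline earlier in this boot). A minimal sketch of pulling those parameters out of /proc/cmdline; this only illustrates what the unit consumes, not Flatcar's actual implementation:

    # Parse key=value tokens from the kernel command line and pick out the
    # dm-verity settings for /usr.
    def cmdline_args(text: str) -> dict[str, str]:
        args = {}
        for token in text.split():
            key, _, value = token.partition("=")
            args[key] = value               # for repeated keys, the last one wins
        return args

    with open("/proc/cmdline") as f:
        args = cmdline_args(f.read())

    usr_device = args.get("mount.usr")      # /dev/mapper/usr
    hash_dev   = args.get("verity.usr")     # PARTUUID=7130c94a-...
    root_hash  = args.get("verity.usrhash") # 0b876f86a632750e...
    print(usr_device, hash_dev, root_hash)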
Sep 16 04:59:40.904045 kernel: BTRFS: device fsid f1b91845-3914-4d21-a370-6d760ee45b2e devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (829) Sep 16 04:59:40.904061 kernel: BTRFS info (device dm-0): first mount of filesystem f1b91845-3914-4d21-a370-6d760ee45b2e Sep 16 04:59:40.904072 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 16 04:59:40.921344 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 16 04:59:40.921362 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 16 04:59:40.927466 kernel: BTRFS info (device dm-0): enabling free space tree Sep 16 04:59:40.929782 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 16 04:59:40.930004 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:59:40.954200 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 16 04:59:40.954666 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 16 04:59:40.970564 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 16 04:59:41.026954 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (852) Sep 16 04:59:41.044641 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 04:59:41.044661 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 16 04:59:41.059834 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 04:59:41.059850 kernel: BTRFS info (device sda6): turning on async discard Sep 16 04:59:41.066034 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 04:59:41.077156 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 04:59:41.099165 kernel: BTRFS info (device sda6): last unmount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 04:59:41.089315 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 16 04:59:41.100159 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 16 04:59:41.131665 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 04:59:41.182169 systemd-networkd[1000]: lo: Link UP Sep 16 04:59:41.182173 systemd-networkd[1000]: lo: Gained carrier Sep 16 04:59:41.185015 systemd-networkd[1000]: Enumeration completed Sep 16 04:59:41.185096 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 04:59:41.185762 systemd-networkd[1000]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:59:41.198042 systemd[1]: Reached target network.target - Network. Sep 16 04:59:41.228158 ignition[998]: Ignition 2.22.0 Sep 16 04:59:41.212589 systemd-networkd[1000]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:59:41.228163 ignition[998]: Stage: fetch-offline Sep 16 04:59:41.230918 unknown[998]: fetched base config from "system" Sep 16 04:59:41.228183 ignition[998]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:59:41.230922 unknown[998]: fetched user config from "system" Sep 16 04:59:41.228188 ignition[998]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 04:59:41.232061 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Sep 16 04:59:41.228235 ignition[998]: parsed url from cmdline: "" Sep 16 04:59:41.241065 systemd-networkd[1000]: enp2s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:59:41.228237 ignition[998]: no config URL provided Sep 16 04:59:41.248263 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 16 04:59:41.228240 ignition[998]: reading system config file "/usr/lib/ignition/user.ign" Sep 16 04:59:41.248913 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 16 04:59:41.228264 ignition[998]: parsing config with SHA512: 15821b3a4b3f1c590c30651c336b3183a338d2c0dae428058e2352c49a6bfd856b126c1fa1b72c3c39231cd651f5b07d76a2f3cdccafa26ec7c5d639588af33e Sep 16 04:59:41.231137 ignition[998]: fetch-offline: fetch-offline passed Sep 16 04:59:41.231140 ignition[998]: POST message to Packet Timeline Sep 16 04:59:41.231143 ignition[998]: POST Status error: resource requires networking Sep 16 04:59:41.231173 ignition[998]: Ignition finished successfully Sep 16 04:59:41.331107 ignition[1016]: Ignition 2.22.0 Sep 16 04:59:41.415213 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: Link up Sep 16 04:59:41.414071 systemd-networkd[1000]: enp2s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:59:41.331127 ignition[1016]: Stage: kargs Sep 16 04:59:41.331572 ignition[1016]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:59:41.331603 ignition[1016]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 04:59:41.334097 ignition[1016]: kargs: kargs passed Sep 16 04:59:41.334112 ignition[1016]: POST message to Packet Timeline Sep 16 04:59:41.334146 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #1 Sep 16 04:59:41.335904 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:49895->[::1]:53: read: connection refused Sep 16 04:59:41.536573 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #2 Sep 16 04:59:41.540686 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:40663->[::1]:53: read: connection refused Sep 16 04:59:41.595045 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: Link up Sep 16 04:59:41.600095 systemd-networkd[1000]: eno1: Link UP Sep 16 04:59:41.600532 systemd-networkd[1000]: eno2: Link UP Sep 16 04:59:41.600943 systemd-networkd[1000]: enp2s0f0np0: Link UP Sep 16 04:59:41.601453 systemd-networkd[1000]: enp2s0f0np0: Gained carrier Sep 16 04:59:41.615505 systemd-networkd[1000]: enp2s0f1np1: Link UP Sep 16 04:59:41.616894 systemd-networkd[1000]: enp2s0f1np1: Gained carrier Sep 16 04:59:41.648133 systemd-networkd[1000]: enp2s0f0np0: DHCPv4 address 139.178.94.33/31, gateway 139.178.94.32 acquired from 145.40.83.140 Sep 16 04:59:41.940940 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #3 Sep 16 04:59:41.942029 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:49145->[::1]:53: read: connection refused Sep 16 04:59:42.742204 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #4 Sep 16 04:59:42.743144 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:59190->[::1]:53: read: connection refused Sep 16 04:59:43.213151 systemd-networkd[1000]: 
enp2s0f1np1: Gained IPv6LL Sep 16 04:59:43.341175 systemd-networkd[1000]: enp2s0f0np0: Gained IPv6LL Sep 16 04:59:44.344015 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #5 Sep 16 04:59:44.345025 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:54627->[::1]:53: read: connection refused Sep 16 04:59:47.545868 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #6 Sep 16 04:59:48.600179 ignition[1016]: GET result: OK Sep 16 04:59:49.404779 ignition[1016]: Ignition finished successfully Sep 16 04:59:49.410743 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 16 04:59:49.420901 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 16 04:59:49.471683 ignition[1037]: Ignition 2.22.0 Sep 16 04:59:49.471694 ignition[1037]: Stage: disks Sep 16 04:59:49.471856 ignition[1037]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:59:49.471869 ignition[1037]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 04:59:49.472832 ignition[1037]: disks: disks passed Sep 16 04:59:49.472838 ignition[1037]: POST message to Packet Timeline Sep 16 04:59:49.472855 ignition[1037]: GET https://metadata.packet.net/metadata: attempt #1 Sep 16 04:59:50.670567 ignition[1037]: GET result: OK Sep 16 04:59:51.316413 ignition[1037]: Ignition finished successfully Sep 16 04:59:51.321554 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 16 04:59:51.333178 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 16 04:59:51.351199 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 16 04:59:51.370186 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 04:59:51.389171 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 04:59:51.407154 systemd[1]: Reached target basic.target - Basic System. Sep 16 04:59:51.425763 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 16 04:59:51.475192 systemd-fsck[1057]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 16 04:59:51.484444 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 16 04:59:51.497542 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 16 04:59:51.633985 kernel: EXT4-fs (sda9): mounted filesystem fb1cb44f-955b-4cd0-8849-33ce3640d547 r/w with ordered data mode. Quota mode: none. Sep 16 04:59:51.633988 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 16 04:59:51.641333 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 16 04:59:51.665293 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 04:59:51.672904 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 16 04:59:51.685745 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 16 04:59:51.704955 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1066) Sep 16 04:59:51.706136 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... 
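The kargs stage above keeps retrying GET https://metadata.packet.net/metadata until name resolution works and the uplinks (enp2s0f0np0/enp2s0f1np1) are actually up. A minimal sketch of that fetch-with-retry pattern; the URL is the one in the log, while the backoff schedule here is illustrative rather than Ignition's exact timing:

    # Poll the Packet metadata endpoint until the network is reachable.
    import time
    import urllib.request

    URL = "https://metadata.packet.net/metadata"

    def fetch_metadata(max_attempts: int = 6) -> bytes:
        delay = 1.0
        for attempt in range(1, max_attempts + 1):
            try:
                with urllib.request.urlopen(URL, timeout=10) as resp:
                    return resp.read()
            except OSError as exc:          # DNS/connect failures while links come up
                print(f"attempt #{attempt} failed: {exc}")
                time.sleep(delay)
                delay *= 2
        raise RuntimeError("metadata endpoint unreachable")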
Sep 16 04:59:51.765152 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 04:59:51.765165 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 16 04:59:51.765172 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 04:59:51.765180 kernel: BTRFS info (device sda6): turning on async discard Sep 16 04:59:51.765187 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 04:59:51.765027 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 16 04:59:51.765044 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 04:59:51.817004 coreos-metadata[1069]: Sep 16 04:59:51.810 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 16 04:59:51.836112 coreos-metadata[1068]: Sep 16 04:59:51.809 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 16 04:59:51.766012 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 16 04:59:51.808184 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 16 04:59:51.825017 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 16 04:59:51.891683 initrd-setup-root[1098]: cut: /sysroot/etc/passwd: No such file or directory Sep 16 04:59:51.900068 initrd-setup-root[1105]: cut: /sysroot/etc/group: No such file or directory Sep 16 04:59:51.910026 initrd-setup-root[1112]: cut: /sysroot/etc/shadow: No such file or directory Sep 16 04:59:51.919081 initrd-setup-root[1119]: cut: /sysroot/etc/gshadow: No such file or directory Sep 16 04:59:51.963456 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 16 04:59:51.973209 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 16 04:59:51.982022 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 16 04:59:52.010387 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 16 04:59:52.028084 kernel: BTRFS info (device sda6): last unmount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 04:59:52.035523 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 16 04:59:52.051206 ignition[1186]: INFO : Ignition 2.22.0 Sep 16 04:59:52.051206 ignition[1186]: INFO : Stage: mount Sep 16 04:59:52.051206 ignition[1186]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:59:52.051206 ignition[1186]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 04:59:52.051206 ignition[1186]: INFO : mount: mount passed Sep 16 04:59:52.051206 ignition[1186]: INFO : POST message to Packet Timeline Sep 16 04:59:52.051206 ignition[1186]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 16 04:59:52.859614 coreos-metadata[1068]: Sep 16 04:59:52.859 INFO Fetch successful Sep 16 04:59:52.936967 coreos-metadata[1068]: Sep 16 04:59:52.936 INFO wrote hostname ci-4459.0.0-n-32926c0571 to /sysroot/etc/hostname Sep 16 04:59:52.951080 coreos-metadata[1069]: Sep 16 04:59:52.943 INFO Fetch successful Sep 16 04:59:52.938233 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 16 04:59:52.976529 systemd[1]: flatcar-static-network.service: Deactivated successfully. Sep 16 04:59:52.976577 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. 
Sep 16 04:59:53.008130 ignition[1186]: INFO : GET result: OK Sep 16 04:59:53.390325 ignition[1186]: INFO : Ignition finished successfully Sep 16 04:59:53.394886 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 16 04:59:53.410164 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 16 04:59:53.443154 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 04:59:53.481964 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1212) Sep 16 04:59:53.499360 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 04:59:53.499376 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 16 04:59:53.514120 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 04:59:53.514137 kernel: BTRFS info (device sda6): turning on async discard Sep 16 04:59:53.520235 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 04:59:53.521917 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 16 04:59:53.558325 ignition[1229]: INFO : Ignition 2.22.0 Sep 16 04:59:53.558325 ignition[1229]: INFO : Stage: files Sep 16 04:59:53.570213 ignition[1229]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:59:53.570213 ignition[1229]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 04:59:53.570213 ignition[1229]: DEBUG : files: compiled without relabeling support, skipping Sep 16 04:59:53.570213 ignition[1229]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 16 04:59:53.570213 ignition[1229]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 16 04:59:53.570213 ignition[1229]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 16 04:59:53.570213 ignition[1229]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 16 04:59:53.570213 ignition[1229]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 16 04:59:53.570213 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 16 04:59:53.570213 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 16 04:59:53.561882 unknown[1229]: wrote ssh authorized keys file for user: core Sep 16 04:59:53.696109 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 16 04:59:53.736495 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 04:59:53.752186 
ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 16 04:59:53.752186 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 16 04:59:54.305786 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 16 04:59:54.991090 ignition[1229]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 16 04:59:54.991090 ignition[1229]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 16 04:59:55.019096 ignition[1229]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 04:59:55.019096 ignition[1229]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 04:59:55.019096 ignition[1229]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 16 04:59:55.019096 ignition[1229]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 16 04:59:55.019096 ignition[1229]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 16 04:59:55.019096 ignition[1229]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 16 04:59:55.019096 ignition[1229]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 16 04:59:55.019096 ignition[1229]: INFO : files: files passed Sep 16 04:59:55.019096 ignition[1229]: INFO : POST message to Packet Timeline Sep 16 04:59:55.019096 ignition[1229]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 16 04:59:55.888527 ignition[1229]: INFO : GET result: OK Sep 16 04:59:57.016246 ignition[1229]: INFO : Ignition finished successfully Sep 16 04:59:57.017631 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 16 04:59:57.035873 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... 
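Among the files-stage operations above, Ignition downloads the kubernetes-v1.31.8 sysext image and links it into /etc/extensions, one of the directories systemd-sysext scans for extension images. A minimal sketch of the resulting symlink (paths as in the log, without the /sysroot prefix the initrd uses; this is the filesystem effect, not Ignition's own code):

    # Expose the downloaded extension image under /etc/extensions.
    import os

    target = "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
    link   = "/etc/extensions/kubernetes.raw"

    os.makedirs(os.path.dirname(link), exist_ok=True)
    if not os.path.islink(link):
        os.symlink(target, link)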
Sep 16 04:59:57.062593 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 16 04:59:57.109932 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 16 04:59:57.110047 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 16 04:59:57.137174 initrd-setup-root-after-ignition[1266]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:59:57.137174 initrd-setup-root-after-ignition[1266]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:59:57.118438 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 04:59:57.192203 initrd-setup-root-after-ignition[1270]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:59:57.150258 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 16 04:59:57.184446 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 16 04:59:57.264664 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 16 04:59:57.264731 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 16 04:59:57.282430 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 16 04:59:57.301132 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 16 04:59:57.319404 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 16 04:59:57.321810 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 16 04:59:57.380432 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 04:59:57.393983 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 16 04:59:57.464090 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:59:57.474395 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:59:57.493520 systemd[1]: Stopped target timers.target - Timer Units. Sep 16 04:59:57.511482 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 16 04:59:57.511852 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 04:59:57.548319 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 16 04:59:57.557430 systemd[1]: Stopped target basic.target - Basic System. Sep 16 04:59:57.575428 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 16 04:59:57.591439 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 04:59:57.610439 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 16 04:59:57.629429 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:59:57.648432 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 16 04:59:57.666429 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:59:57.685474 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 16 04:59:57.704454 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 16 04:59:57.722435 systemd[1]: Stopped target swap.target - Swaps. Sep 16 04:59:57.738331 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 16 04:59:57.738705 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Sep 16 04:59:57.772318 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:59:57.781454 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:59:57.800317 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 16 04:59:57.800775 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:59:57.821316 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 16 04:59:57.821677 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 16 04:59:57.859222 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 16 04:59:57.859627 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 16 04:59:57.879620 systemd[1]: Stopped target paths.target - Path Units. Sep 16 04:59:57.895309 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 16 04:59:57.895799 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:59:57.915432 systemd[1]: Stopped target slices.target - Slice Units. Sep 16 04:59:57.931438 systemd[1]: Stopped target sockets.target - Socket Units. Sep 16 04:59:57.949389 systemd[1]: iscsid.socket: Deactivated successfully. Sep 16 04:59:57.949676 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 04:59:57.971448 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 16 04:59:57.971729 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 04:59:57.987549 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 16 04:59:57.987924 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 04:59:58.004515 systemd[1]: ignition-files.service: Deactivated successfully. Sep 16 04:59:58.004881 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 16 04:59:58.124153 ignition[1291]: INFO : Ignition 2.22.0 Sep 16 04:59:58.124153 ignition[1291]: INFO : Stage: umount Sep 16 04:59:58.124153 ignition[1291]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:59:58.124153 ignition[1291]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 04:59:58.124153 ignition[1291]: INFO : umount: umount passed Sep 16 04:59:58.124153 ignition[1291]: INFO : POST message to Packet Timeline Sep 16 04:59:58.124153 ignition[1291]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 16 04:59:58.021516 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 16 04:59:58.021882 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 16 04:59:58.041650 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 16 04:59:58.053122 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 16 04:59:58.053193 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:59:58.081844 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 16 04:59:58.091281 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 16 04:59:58.091458 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:59:58.117375 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 16 04:59:58.117446 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. 
Sep 16 04:59:58.166459 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 16 04:59:58.167584 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 16 04:59:58.167689 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 16 04:59:58.173436 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 16 04:59:58.173578 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 16 04:59:58.961485 ignition[1291]: INFO : GET result: OK Sep 16 04:59:59.405698 ignition[1291]: INFO : Ignition finished successfully Sep 16 04:59:59.410393 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 16 04:59:59.410808 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 16 04:59:59.425304 systemd[1]: Stopped target network.target - Network. Sep 16 04:59:59.431479 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 16 04:59:59.431664 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 16 04:59:59.454339 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 16 04:59:59.454484 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 16 04:59:59.470277 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 16 04:59:59.470428 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 16 04:59:59.489261 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 16 04:59:59.489415 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 16 04:59:59.507213 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 16 04:59:59.507359 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 16 04:59:59.523569 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 16 04:59:59.541392 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 16 04:59:59.559146 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 16 04:59:59.559436 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 16 04:59:59.581386 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 16 04:59:59.581492 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 16 04:59:59.581544 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 16 04:59:59.585923 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 16 04:59:59.586310 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 16 04:59:59.620117 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 16 04:59:59.620159 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:59:59.631295 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 16 04:59:59.664077 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 16 04:59:59.664124 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 04:59:59.681149 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 16 04:59:59.681211 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:59:59.700581 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 16 04:59:59.700723 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 16 04:59:59.720199 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Sep 16 04:59:59.720340 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:59:59.739653 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:59:59.761491 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 16 04:59:59.761690 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:59:59.762903 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 16 04:59:59.763321 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:59:59.781793 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 16 04:59:59.781993 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 16 04:59:59.788256 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 16 04:59:59.788277 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:59:59.822097 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 16 04:59:59.822143 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 16 04:59:59.848102 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 16 04:59:59.848205 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 16 04:59:59.875149 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 16 04:59:59.875333 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 05:00:00.149091 systemd-journald[298]: Received SIGTERM from PID 1 (systemd). Sep 16 04:59:59.915048 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 16 04:59:59.922099 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 16 04:59:59.922129 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:59:59.961156 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 16 04:59:59.961209 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:59:59.982337 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:59:59.982424 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 05:00:00.007324 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 16 05:00:00.007495 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 16 05:00:00.007636 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 05:00:00.009218 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 16 05:00:00.009472 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 16 05:00:00.034439 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 16 05:00:00.034744 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 16 05:00:00.050197 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 16 05:00:00.069168 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 16 05:00:00.109372 systemd[1]: Switching root. 
Sep 16 05:00:00.258214 systemd-journald[298]: Journal stopped Sep 16 05:00:02.023790 kernel: SELinux: policy capability network_peer_controls=1 Sep 16 05:00:02.023807 kernel: SELinux: policy capability open_perms=1 Sep 16 05:00:02.023816 kernel: SELinux: policy capability extended_socket_class=1 Sep 16 05:00:02.023822 kernel: SELinux: policy capability always_check_network=0 Sep 16 05:00:02.023827 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 16 05:00:02.023832 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 16 05:00:02.023839 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 16 05:00:02.023844 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 16 05:00:02.023849 kernel: SELinux: policy capability userspace_initial_context=0 Sep 16 05:00:02.023856 kernel: audit: type=1403 audit(1757998800.393:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 16 05:00:02.023863 systemd[1]: Successfully loaded SELinux policy in 96.418ms. Sep 16 05:00:02.023870 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.056ms. Sep 16 05:00:02.023878 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 05:00:02.023884 systemd[1]: Detected architecture x86-64. Sep 16 05:00:02.023892 systemd[1]: Detected first boot. Sep 16 05:00:02.023899 systemd[1]: Hostname set to <ci-4459.0.0-n-32926c0571>. Sep 16 05:00:02.023905 systemd[1]: Initializing machine ID from random generator. Sep 16 05:00:02.023912 zram_generator::config[1348]: No configuration found. Sep 16 05:00:02.023919 systemd[1]: Populated /etc with preset unit settings. Sep 16 05:00:02.023927 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 16 05:00:02.023935 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 16 05:00:02.023941 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 16 05:00:02.023951 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 16 05:00:02.023958 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 16 05:00:02.023964 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 16 05:00:02.023972 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 16 05:00:02.023979 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 16 05:00:02.023987 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 16 05:00:02.023994 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 16 05:00:02.024001 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 16 05:00:02.024008 systemd[1]: Created slice user.slice - User and Session Slice. Sep 16 05:00:02.024015 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 05:00:02.024021 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 05:00:02.024028 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
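At this point the real root is running: the SELinux policy is loaded, the boot is detected as a first boot, and the machine ID is initialized from the random generator. A hedged sketch of how one could confirm that state on the running host (the tools are assumed to be present on the image):

    cat /etc/machine-id     # the ID that was just initialized from the random generator
    hostnamectl status      # hostname and machine ID as systemd sees them
    getenforce              # SELinux enforcement mode; the policy load itself is logged above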
Sep 16 05:00:02.024035 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 16 05:00:02.024043 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 16 05:00:02.024050 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 05:00:02.024056 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1... Sep 16 05:00:02.024063 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 05:00:02.024070 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 05:00:02.024078 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 16 05:00:02.024085 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 16 05:00:02.024092 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 16 05:00:02.024100 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 16 05:00:02.024108 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 05:00:02.024114 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 05:00:02.024121 systemd[1]: Reached target slices.target - Slice Units. Sep 16 05:00:02.024128 systemd[1]: Reached target swap.target - Swaps. Sep 16 05:00:02.024135 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 16 05:00:02.024142 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 16 05:00:02.024149 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 16 05:00:02.024157 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 05:00:02.024164 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 05:00:02.024171 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 05:00:02.024178 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 16 05:00:02.024185 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 16 05:00:02.024193 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 16 05:00:02.024200 systemd[1]: Mounting media.mount - External Media Directory... Sep 16 05:00:02.024207 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 16 05:00:02.024215 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 16 05:00:02.024222 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 16 05:00:02.024229 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 16 05:00:02.024237 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 16 05:00:02.024244 systemd[1]: Reached target machines.target - Containers. Sep 16 05:00:02.024252 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 16 05:00:02.024259 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 05:00:02.024266 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 05:00:02.024273 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... 
Sep 16 05:00:02.024280 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 05:00:02.024287 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 05:00:02.024294 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 05:00:02.024301 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 16 05:00:02.024309 kernel: ACPI: bus type drm_connector registered Sep 16 05:00:02.024316 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 05:00:02.024323 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 16 05:00:02.024330 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 16 05:00:02.024337 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 16 05:00:02.024343 kernel: fuse: init (API version 7.41) Sep 16 05:00:02.024349 kernel: loop: module loaded Sep 16 05:00:02.024356 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 16 05:00:02.024363 systemd[1]: Stopped systemd-fsck-usr.service. Sep 16 05:00:02.024371 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 05:00:02.024378 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 05:00:02.024385 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 05:00:02.024392 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 05:00:02.024410 systemd-journald[1451]: Collecting audit messages is disabled. Sep 16 05:00:02.024426 systemd-journald[1451]: Journal started Sep 16 05:00:02.024440 systemd-journald[1451]: Runtime Journal (/run/log/journal/717ddfc0fdd548ef8f7720f27e6429a0) is 8M, max 636.8M, 628.8M free. Sep 16 05:00:00.881042 systemd[1]: Queued start job for default target multi-user.target. Sep 16 05:00:00.891943 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 16 05:00:00.892220 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 16 05:00:02.043995 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 16 05:00:02.064010 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 16 05:00:02.075101 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 05:00:02.103169 systemd[1]: verity-setup.service: Deactivated successfully. Sep 16 05:00:02.103191 systemd[1]: Stopped verity-setup.service. Sep 16 05:00:02.127004 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 16 05:00:02.136019 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 05:00:02.144477 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 16 05:00:02.153098 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 16 05:00:02.162099 systemd[1]: Mounted media.mount - External Media Directory. Sep 16 05:00:02.171090 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
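The runtime journal above lives in /run/log/journal and is capped relative to the size of that filesystem (8M in use, 636.8M max here). Those caps can also be pinned explicitly in journald.conf; a hedged sketch using standard options, not values read from this host:

    [Journal]
    Storage=persistent
    RuntimeMaxUse=64M
    SystemMaxUse=200M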
Sep 16 05:00:02.180091 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 16 05:00:02.189224 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 16 05:00:02.199294 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 16 05:00:02.209334 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 05:00:02.219268 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 16 05:00:02.219371 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 16 05:00:02.229315 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 05:00:02.229435 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 05:00:02.239281 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 05:00:02.239416 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 05:00:02.248316 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 05:00:02.248477 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 05:00:02.259421 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 16 05:00:02.259692 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 16 05:00:02.268528 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 05:00:02.268904 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 05:00:02.278932 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 05:00:02.288918 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 05:00:02.299915 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 16 05:00:02.310974 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 16 05:00:02.321907 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 05:00:02.340794 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 05:00:02.350877 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 16 05:00:02.369266 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 16 05:00:02.379108 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 16 05:00:02.379139 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 05:00:02.389426 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 16 05:00:02.401109 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 16 05:00:02.410200 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 05:00:02.417588 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 16 05:00:02.426551 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 16 05:00:02.436062 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 05:00:02.448357 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
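The modprobe@dm_mod, modprobe@drm, modprobe@efi_pstore, modprobe@fuse and modprobe@loop units finished above are all instances of one template unit. Roughly what that template contains, sketched from the stock systemd unit (the copy shipped on this image may differ in detail):

    [Unit]
    Description=Load Kernel Module %i
    DefaultDependencies=no
    Before=sysinit.target

    [Service]
    Type=oneshot
    RemainAfterExit=yes
    ExecStart=-/usr/sbin/modprobe -abq %I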
Sep 16 05:00:02.454660 systemd-journald[1451]: Time spent on flushing to /var/log/journal/717ddfc0fdd548ef8f7720f27e6429a0 is 12.673ms for 1423 entries. Sep 16 05:00:02.454660 systemd-journald[1451]: System Journal (/var/log/journal/717ddfc0fdd548ef8f7720f27e6429a0) is 8M, max 195.6M, 187.6M free. Sep 16 05:00:02.485472 systemd-journald[1451]: Received client request to flush runtime journal. Sep 16 05:00:02.466101 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 05:00:02.476282 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 05:00:02.486015 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 16 05:00:02.516279 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 16 05:00:02.526997 kernel: loop0: detected capacity change from 0 to 8 Sep 16 05:00:02.531293 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 16 05:00:02.538016 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 16 05:00:02.547655 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 16 05:00:02.557407 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 16 05:00:02.572022 kernel: loop1: detected capacity change from 0 to 128016 Sep 16 05:00:02.574253 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 16 05:00:02.584194 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 05:00:02.593211 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 16 05:00:02.603767 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 16 05:00:02.616956 kernel: loop2: detected capacity change from 0 to 221472 Sep 16 05:00:02.619764 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 16 05:00:02.638208 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 05:00:02.651216 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 16 05:00:02.656086 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 16 05:00:02.664990 kernel: loop3: detected capacity change from 0 to 110984 Sep 16 05:00:02.674922 systemd-tmpfiles[1503]: ACLs are not supported, ignoring. Sep 16 05:00:02.674932 systemd-tmpfiles[1503]: ACLs are not supported, ignoring. Sep 16 05:00:02.676786 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 05:00:02.705997 kernel: loop4: detected capacity change from 0 to 8 Sep 16 05:00:02.712995 kernel: loop5: detected capacity change from 0 to 128016 Sep 16 05:00:02.740001 kernel: loop6: detected capacity change from 0 to 221472 Sep 16 05:00:02.759995 kernel: loop7: detected capacity change from 0 to 110984 Sep 16 05:00:02.775205 (sd-merge)[1509]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'. Sep 16 05:00:02.775509 (sd-merge)[1509]: Merged extensions into '/usr'. Sep 16 05:00:02.778751 systemd[1]: Reload requested from client PID 1487 ('systemd-sysext') (unit systemd-sysext.service)... Sep 16 05:00:02.778765 systemd[1]: Reloading... Sep 16 05:00:02.801114 ldconfig[1481]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. 
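The (sd-merge) lines show systemd-sysext overlaying the containerd-flatcar, docker-flatcar, kubernetes and oem-packet extension images onto /usr, which is why the loop devices appear just beforehand. A hedged sketch of how to inspect that state once the system is up:

    systemd-sysext status      # lists merged extensions and the hierarchies they cover
    ls /var/lib/extensions     # one location extension images are picked up from (also /etc/extensions)
    systemd-sysext refresh     # re-merge after adding or removing an extension image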
Sep 16 05:00:02.804954 zram_generator::config[1535]: No configuration found. Sep 16 05:00:02.933613 systemd[1]: Reloading finished in 154 ms. Sep 16 05:00:02.949888 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 16 05:00:02.959356 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 16 05:00:02.969285 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 16 05:00:02.989807 systemd[1]: Starting ensure-sysext.service... Sep 16 05:00:02.996831 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 05:00:03.013691 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 05:00:03.025278 systemd-tmpfiles[1594]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 16 05:00:03.025318 systemd-tmpfiles[1594]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 16 05:00:03.025687 systemd-tmpfiles[1594]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 16 05:00:03.026093 systemd-tmpfiles[1594]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 16 05:00:03.027263 systemd-tmpfiles[1594]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 16 05:00:03.027645 systemd-tmpfiles[1594]: ACLs are not supported, ignoring. Sep 16 05:00:03.027726 systemd-tmpfiles[1594]: ACLs are not supported, ignoring. Sep 16 05:00:03.032129 systemd-tmpfiles[1594]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 05:00:03.032140 systemd-tmpfiles[1594]: Skipping /boot Sep 16 05:00:03.041822 systemd-tmpfiles[1594]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 05:00:03.041833 systemd-tmpfiles[1594]: Skipping /boot Sep 16 05:00:03.049329 systemd[1]: Reload requested from client PID 1593 ('systemctl') (unit ensure-sysext.service)... Sep 16 05:00:03.049344 systemd[1]: Reloading... Sep 16 05:00:03.065310 systemd-udevd[1595]: Using default interface naming scheme 'v255'. Sep 16 05:00:03.084031 zram_generator::config[1622]: No configuration found. 
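The "Duplicate line for path" warnings above come from tmpfiles.d fragments in different files declaring the same path; systemd-tmpfiles keeps the first definition and ignores the rest. The line format involved, with illustrative modes and owners rather than the exact entries on this host:

    # Type Path              Mode User Group           Age Argument
    d      /var/lib/nfs/sm   0700 root root            -
    d      /var/log/journal  2755 root systemd-journal -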
Sep 16 05:00:03.128807 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Sep 16 05:00:03.128861 kernel: ACPI: button: Sleep Button [SLPB] Sep 16 05:00:03.136818 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 16 05:00:03.146033 kernel: mousedev: PS/2 mouse device common for all mice Sep 16 05:00:03.154015 kernel: ACPI: button: Power Button [PWRF] Sep 16 05:00:03.154080 kernel: IPMI message handler: version 39.2 Sep 16 05:00:03.154101 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Sep 16 05:00:03.156956 kernel: ACPI: video: Video Device [GFX0] (multi-head: yes rom: no post: no) Sep 16 05:00:03.156980 kernel: input: Video Bus as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0A08:00/LNXVIDEO:00/input/input4 Sep 16 05:00:03.189351 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Sep 16 05:00:03.219055 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Sep 16 05:00:03.219235 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Sep 16 05:00:03.268963 kernel: ipmi device interface Sep 16 05:00:03.275972 kernel: iTCO_vendor_support: vendor-support=0 Sep 16 05:00:03.285377 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped. Sep 16 05:00:03.285519 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Sep 16 05:00:03.285955 kernel: ipmi_si: IPMI System Interface driver Sep 16 05:00:03.290956 kernel: MACsec IEEE 802.1AE Sep 16 05:00:03.290991 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Sep 16 05:00:03.305177 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Sep 16 05:00:03.312503 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Sep 16 05:00:03.329088 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Sep 16 05:00:03.329207 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Sep 16 05:00:03.339169 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Sep 16 05:00:03.345899 kernel: ipmi_si: Adding ACPI-specified kcs state machine Sep 16 05:00:03.346089 systemd[1]: Reloading finished in 296 ms. Sep 16 05:00:03.356087 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Sep 16 05:00:03.370954 kernel: iTCO_wdt iTCO_wdt: unable to reset NO_REBOOT flag, device disabled by hardware/BIOS Sep 16 05:00:03.381799 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 05:00:03.395142 kernel: intel_rapl_common: Found RAPL domain package Sep 16 05:00:03.395183 kernel: intel_rapl_common: Found RAPL domain core Sep 16 05:00:03.401828 kernel: intel_rapl_common: Found RAPL domain uncore Sep 16 05:00:03.401847 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Sep 16 05:00:03.401977 kernel: intel_rapl_common: Found RAPL domain dram Sep 16 05:00:03.441186 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 05:00:03.456955 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b11, dev_id: 0x20) Sep 16 05:00:03.477962 systemd[1]: Finished ensure-sysext.service. Sep 16 05:00:03.505766 systemd[1]: Reached target tpm2.target - Trusted Platform Module. 
Sep 16 05:00:03.514044 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 16 05:00:03.514733 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 05:00:03.539954 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Sep 16 05:00:03.760953 kernel: ipmi_ssif: IPMI SSIF Interface driver Sep 16 05:00:03.760984 kernel: i915 0000:00:02.0: can't derive routing for PCI INT A Sep 16 05:00:03.772691 kernel: i915 0000:00:02.0: PCI INT A: not connected Sep 16 05:00:03.782994 kernel: i915 0000:00:02.0: [drm] Found COFFEELAKE (device ID 3e9a) display version 9.00 stepping N/A Sep 16 05:00:03.785638 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 16 05:00:03.795896 kernel: i915 0000:00:02.0: [drm] VT-d active for gfx access Sep 16 05:00:03.796030 kernel: i915 0000:00:02.0: [drm] Using Transparent Hugepages Sep 16 05:00:03.806694 augenrules[1821]: No rules Sep 16 05:00:03.815673 kernel: i915 0000:00:02.0: ROM [??? 0x00000000 flags 0x20000000]: can't assign; bogus alignment Sep 16 05:00:03.815863 kernel: i915 0000:00:02.0: [drm] Failed to find VBIOS tables (VBT) Sep 16 05:00:03.829952 kernel: i915 0000:00:02.0: [drm] Finished loading DMC firmware i915/kbl_dmc_ver1_04.bin (v1.4) Sep 16 05:00:03.830146 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 05:00:03.830906 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 05:00:03.852076 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 05:00:03.870063 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 05:00:03.893083 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 05:00:03.903072 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 05:00:03.917064 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 16 05:00:03.927013 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 05:00:03.940060 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 16 05:00:03.950999 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 05:00:03.952019 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 05:00:03.952965 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 16 05:00:03.968598 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 16 05:00:03.977584 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 05:00:03.988012 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 16 05:00:03.988694 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 05:00:04.005019 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 05:00:04.014401 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
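The "augenrules[1821]: No rules" message means augenrules found nothing under /etc/audit/rules.d to compile into the kernel audit ruleset, so audit-rules.service finished with an empty set. An illustrative fragment it would pick up if one were added (nothing like this exists on the host yet):

    # /etc/audit/rules.d/10-identity.rules
    -w /etc/passwd -p wa -k identity
    -w /etc/shadow -p wa -k identity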
Sep 16 05:00:04.014577 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 05:00:04.014680 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 05:00:04.014851 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 05:00:04.014959 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 05:00:04.015238 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 05:00:04.015341 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 05:00:04.015510 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 05:00:04.015612 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 05:00:04.015789 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 16 05:00:04.016015 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 16 05:00:04.020388 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 05:00:04.020457 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 05:00:04.021176 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 16 05:00:04.022091 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 16 05:00:04.022118 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 16 05:00:04.022343 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 16 05:00:04.040058 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 16 05:00:04.057780 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 16 05:00:04.116236 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 16 05:00:04.121101 systemd-resolved[1834]: Positive Trust Anchors: Sep 16 05:00:04.121108 systemd-resolved[1834]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 05:00:04.121136 systemd-resolved[1834]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 05:00:04.124261 systemd-resolved[1834]: Using system hostname 'ci-4459.0.0-n-32926c0571'. Sep 16 05:00:04.124562 systemd-networkd[1833]: lo: Link UP Sep 16 05:00:04.124566 systemd-networkd[1833]: lo: Gained carrier Sep 16 05:00:04.127896 systemd-networkd[1833]: bond0: netdev ready Sep 16 05:00:04.128383 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 05:00:04.129058 systemd-networkd[1833]: Enumeration completed Sep 16 05:00:04.130198 systemd-networkd[1833]: enp2s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:65:fd:de.network. 
Sep 16 05:00:04.138026 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 05:00:04.148185 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 05:00:04.160080 systemd[1]: Reached target network.target - Network. Sep 16 05:00:04.166995 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 16 05:00:04.176998 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 05:00:04.185059 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 16 05:00:04.195007 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 16 05:00:04.204999 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 16 05:00:04.215005 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 16 05:00:04.224989 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 16 05:00:04.225008 systemd[1]: Reached target paths.target - Path Units. Sep 16 05:00:04.232996 systemd[1]: Reached target time-set.target - System Time Set. Sep 16 05:00:04.242100 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 16 05:00:04.255273 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 16 05:00:04.255951 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: Link up Sep 16 05:00:04.268951 kernel: bond0: (slave enp2s0f0np0): Enslaving as a backup interface with an up link Sep 16 05:00:04.269058 systemd-networkd[1833]: enp2s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:65:fd:df.network. Sep 16 05:00:04.272997 systemd[1]: Reached target timers.target - Timer Units. Sep 16 05:00:04.281825 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 16 05:00:04.291842 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 16 05:00:04.301186 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 16 05:00:04.313090 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 16 05:00:04.323171 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 16 05:00:04.334729 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 16 05:00:04.346587 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 16 05:00:04.357091 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 16 05:00:04.367590 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 05:00:04.376994 systemd[1]: Reached target basic.target - Basic System. Sep 16 05:00:04.385018 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 16 05:00:04.385037 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 16 05:00:04.385595 systemd[1]: Starting containerd.service - containerd container runtime... Sep 16 05:00:04.392951 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: Link up Sep 16 05:00:04.404953 kernel: bond0: (slave enp2s0f1np1): Enslaving as a backup interface with an up link Sep 16 05:00:04.405198 systemd-networkd[1833]: bond0: Configuring with /etc/systemd/network/05-bond0.network. 
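The per-port files named after each MAC address and the 05-bond0.network file referenced above are ordinary systemd-networkd units. A hedged sketch of what a bonded setup like this typically looks like; the real contents on this host are not in the log, and 802.3ad mode is only an assumption (the bonding driver's LACP warning just below suggests it):

    # /etc/systemd/network/05-bond0.netdev (sketch)
    [NetDev]
    Name=bond0
    Kind=bond
    [Bond]
    Mode=802.3ad

    # /etc/systemd/network/10-0c:42:a1:65:fd:de.network (sketch)
    [Match]
    MACAddress=0c:42:a1:65:fd:de
    [Network]
    Bond=bond0

    # /etc/systemd/network/05-bond0.network (sketch)
    [Match]
    Name=bond0
    [Network]
    DHCP=no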
Sep 16 05:00:04.406089 systemd-networkd[1833]: enp2s0f0np0: Link UP Sep 16 05:00:04.406235 systemd-networkd[1833]: enp2s0f0np0: Gained carrier Sep 16 05:00:04.415952 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Sep 16 05:00:04.416115 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 16 05:00:04.423262 systemd-networkd[1833]: enp2s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:65:fd:de.network. Sep 16 05:00:04.423414 systemd-networkd[1833]: enp2s0f1np1: Link UP Sep 16 05:00:04.423556 systemd-networkd[1833]: enp2s0f1np1: Gained carrier Sep 16 05:00:04.425547 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 16 05:00:04.434070 systemd-networkd[1833]: bond0: Link UP Sep 16 05:00:04.434234 systemd-networkd[1833]: bond0: Gained carrier Sep 16 05:00:04.434345 systemd-timesyncd[1835]: Network configuration changed, trying to establish connection. Sep 16 05:00:04.434582 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 16 05:00:04.434657 systemd-timesyncd[1835]: Network configuration changed, trying to establish connection. Sep 16 05:00:04.434837 systemd-timesyncd[1835]: Network configuration changed, trying to establish connection. Sep 16 05:00:04.434915 systemd-timesyncd[1835]: Network configuration changed, trying to establish connection. Sep 16 05:00:04.440352 coreos-metadata[1873]: Sep 16 05:00:04.440 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 16 05:00:04.457045 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 16 05:00:04.466569 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 16 05:00:04.468442 jq[1879]: false Sep 16 05:00:04.475991 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 16 05:00:04.476562 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 16 05:00:04.482155 extend-filesystems[1880]: Found /dev/sda6 Sep 16 05:00:04.486033 extend-filesystems[1880]: Found /dev/sda9 Sep 16 05:00:04.486033 extend-filesystems[1880]: Checking size of /dev/sda9 Sep 16 05:00:04.498875 extend-filesystems[1880]: Resized partition /dev/sda9 Sep 16 05:00:04.554040 kernel: EXT4-fs (sda9): resizing filesystem from 553472 to 116605649 blocks Sep 16 05:00:04.554138 kernel: bond0: (slave enp2s0f0np0): link status definitely up, 10000 Mbps full duplex Sep 16 05:00:04.554168 kernel: bond0: active interface up! Sep 16 05:00:04.554183 kernel: i915 0000:00:02.0: [drm] [ENCODER:98:DDI A/PHY A] failed to retrieve link info, disabling eDP Sep 16 05:00:04.554370 kernel: [drm] Initialized i915 1.6.0 for 0000:00:02.0 on minor 0 Sep 16 05:00:04.486598 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 16 05:00:04.554459 extend-filesystems[1892]: resize2fs 1.47.3 (8-Jul-2025) Sep 16 05:00:04.512114 oslogin_cache_refresh[1881]: Refreshing passwd entry cache Sep 16 05:00:04.507483 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 16 05:00:04.562329 google_oslogin_nss_cache[1881]: oslogin_cache_refresh[1881]: Refreshing passwd entry cache Sep 16 05:00:04.562761 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 16 05:00:04.575477 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
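coreos-metadata above is fetching the Equinix Metal (Packet) metadata endpoint for this instance. Roughly the same document can be pulled by hand with tools already on the host; the .hostname field is an assumption about the Packet metadata schema rather than something shown in this log:

    curl -s https://metadata.packet.net/metadata | jq '.hostname'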
Sep 16 05:00:04.590566 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 16 05:00:04.600199 systemd[1]: Starting tcsd.service - TCG Core Services Daemon... Sep 16 05:00:04.607401 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 16 05:00:04.607787 systemd[1]: Starting update-engine.service - Update Engine... Sep 16 05:00:04.615845 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 16 05:00:04.623505 update_engine[1911]: I20250916 05:00:04.623467 1911 main.cc:92] Flatcar Update Engine starting Sep 16 05:00:04.626933 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 16 05:00:04.627916 jq[1912]: true Sep 16 05:00:04.634191 systemd-logind[1906]: Watching system buttons on /dev/input/event3 (Power Button) Sep 16 05:00:04.634732 systemd-logind[1906]: Watching system buttons on /dev/input/event2 (Sleep Button) Sep 16 05:00:04.634783 systemd-logind[1906]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Sep 16 05:00:04.639853 systemd-logind[1906]: New seat seat0. Sep 16 05:00:04.639970 kernel: bond0: (slave enp2s0f1np1): link status definitely up, 10000 Mbps full duplex Sep 16 05:00:04.645142 systemd[1]: Started systemd-logind.service - User Login Management. Sep 16 05:00:04.654620 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 16 05:00:04.664179 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 16 05:00:04.664292 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 16 05:00:04.664445 systemd[1]: motdgen.service: Deactivated successfully. Sep 16 05:00:04.685135 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 16 05:00:04.694608 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 16 05:00:04.694718 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 16 05:00:04.722872 jq[1917]: true Sep 16 05:00:04.723487 (ntainerd)[1918]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 16 05:00:04.733423 tar[1916]: linux-amd64/helm Sep 16 05:00:04.739573 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Sep 16 05:00:04.739714 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped. Sep 16 05:00:04.761914 dbus-daemon[1874]: [system] SELinux support is enabled Sep 16 05:00:04.762032 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 16 05:00:04.763957 update_engine[1911]: I20250916 05:00:04.763890 1911 update_check_scheduler.cc:74] Next update check in 7m6s Sep 16 05:00:04.771987 bash[1944]: Updated "/home/core/.ssh/authorized_keys" Sep 16 05:00:04.772702 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 16 05:00:04.774394 sshd_keygen[1909]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 16 05:00:04.784136 dbus-daemon[1874]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 16 05:00:04.784516 systemd[1]: Starting sshkeys.service... 
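update_engine has started and scheduled its first check ("Next update check in 7m6s"). On Flatcar that state can be queried or nudged with the client tool; the flags below are assumed from the CoreOS/Flatcar update_engine_client and are not taken from this log:

    update_engine_client -status              # report the current update state
    update_engine_client -check_for_update    # trigger a check instead of waiting for the timer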
Sep 16 05:00:04.790071 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 16 05:00:04.790098 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 16 05:00:04.800066 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 16 05:00:04.800086 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 16 05:00:04.810338 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 16 05:00:04.822157 systemd[1]: Started update-engine.service - Update Engine. Sep 16 05:00:04.833115 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 16 05:00:04.843847 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 16 05:00:04.874359 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 16 05:00:04.879083 containerd[1918]: time="2025-09-16T05:00:04Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 16 05:00:04.879421 containerd[1918]: time="2025-09-16T05:00:04.879406455Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 16 05:00:04.882923 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 16 05:00:04.884890 containerd[1918]: time="2025-09-16T05:00:04.884869698Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.048µs" Sep 16 05:00:04.884890 containerd[1918]: time="2025-09-16T05:00:04.884890517Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 16 05:00:04.884945 containerd[1918]: time="2025-09-16T05:00:04.884901817Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 16 05:00:04.885006 containerd[1918]: time="2025-09-16T05:00:04.884997240Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 16 05:00:04.885034 containerd[1918]: time="2025-09-16T05:00:04.885008582Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 16 05:00:04.885034 containerd[1918]: time="2025-09-16T05:00:04.885023309Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885061 containerd[1918]: time="2025-09-16T05:00:04.885054327Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885075 containerd[1918]: time="2025-09-16T05:00:04.885061664Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885202 containerd[1918]: time="2025-09-16T05:00:04.885193007Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" 
id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885202 containerd[1918]: time="2025-09-16T05:00:04.885201434Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885248 containerd[1918]: time="2025-09-16T05:00:04.885207855Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885248 containerd[1918]: time="2025-09-16T05:00:04.885212528Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885284 containerd[1918]: time="2025-09-16T05:00:04.885251048Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885304 coreos-metadata[1964]: Sep 16 05:00:04.885 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 16 05:00:04.885429 containerd[1918]: time="2025-09-16T05:00:04.885377092Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885429 containerd[1918]: time="2025-09-16T05:00:04.885392812Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 05:00:04.885429 containerd[1918]: time="2025-09-16T05:00:04.885398794Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 16 05:00:04.885429 containerd[1918]: time="2025-09-16T05:00:04.885414317Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 16 05:00:04.885557 containerd[1918]: time="2025-09-16T05:00:04.885544172Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 16 05:00:04.885588 containerd[1918]: time="2025-09-16T05:00:04.885578885Z" level=info msg="metadata content store policy set" policy=shared Sep 16 05:00:04.892507 systemd[1]: issuegen.service: Deactivated successfully. Sep 16 05:00:04.892646 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 16 05:00:04.902582 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Sep 16 05:00:04.907277 tar[1916]: linux-amd64/LICENSE Sep 16 05:00:04.907322 tar[1916]: linux-amd64/README.md Sep 16 05:00:04.908860 containerd[1918]: time="2025-09-16T05:00:04.908826699Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 16 05:00:04.908898 containerd[1918]: time="2025-09-16T05:00:04.908884379Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 16 05:00:04.908898 containerd[1918]: time="2025-09-16T05:00:04.908896544Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 16 05:00:04.908926 containerd[1918]: time="2025-09-16T05:00:04.908904010Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 16 05:00:04.908926 containerd[1918]: time="2025-09-16T05:00:04.908911196Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 16 05:00:04.908956 containerd[1918]: time="2025-09-16T05:00:04.908925274Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 16 05:00:04.908956 containerd[1918]: time="2025-09-16T05:00:04.908932628Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 16 05:00:04.908956 containerd[1918]: time="2025-09-16T05:00:04.908940916Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 16 05:00:04.908956 containerd[1918]: time="2025-09-16T05:00:04.908951409Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 16 05:00:04.909012 containerd[1918]: time="2025-09-16T05:00:04.908969301Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 16 05:00:04.909012 containerd[1918]: time="2025-09-16T05:00:04.908975021Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 16 05:00:04.909012 containerd[1918]: time="2025-09-16T05:00:04.908983589Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 16 05:00:04.909072 containerd[1918]: time="2025-09-16T05:00:04.909063753Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 16 05:00:04.909091 containerd[1918]: time="2025-09-16T05:00:04.909077034Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 16 05:00:04.909091 containerd[1918]: time="2025-09-16T05:00:04.909085381Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 16 05:00:04.909123 containerd[1918]: time="2025-09-16T05:00:04.909091147Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 16 05:00:04.909123 containerd[1918]: time="2025-09-16T05:00:04.909097685Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 16 05:00:04.909123 containerd[1918]: time="2025-09-16T05:00:04.909103552Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 16 05:00:04.909123 containerd[1918]: time="2025-09-16T05:00:04.909115928Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 16 05:00:04.909123 containerd[1918]: 
time="2025-09-16T05:00:04.909121799Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 16 05:00:04.909189 containerd[1918]: time="2025-09-16T05:00:04.909127834Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 16 05:00:04.909189 containerd[1918]: time="2025-09-16T05:00:04.909133417Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 16 05:00:04.909189 containerd[1918]: time="2025-09-16T05:00:04.909143006Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 16 05:00:04.909189 containerd[1918]: time="2025-09-16T05:00:04.909182171Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 16 05:00:04.909243 containerd[1918]: time="2025-09-16T05:00:04.909190637Z" level=info msg="Start snapshots syncer" Sep 16 05:00:04.909243 containerd[1918]: time="2025-09-16T05:00:04.909203915Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 16 05:00:04.909360 containerd[1918]: time="2025-09-16T05:00:04.909339050Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 16 05:00:04.909423 containerd[1918]: time="2025-09-16T05:00:04.909371248Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 16 05:00:04.910200 containerd[1918]: time="2025-09-16T05:00:04.910150806Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 16 05:00:04.910234 containerd[1918]: time="2025-09-16T05:00:04.910204586Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers 
type=io.containerd.grpc.v1 Sep 16 05:00:04.910234 containerd[1918]: time="2025-09-16T05:00:04.910219082Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 16 05:00:04.910234 containerd[1918]: time="2025-09-16T05:00:04.910225771Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 16 05:00:04.910234 containerd[1918]: time="2025-09-16T05:00:04.910232695Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 16 05:00:04.910290 containerd[1918]: time="2025-09-16T05:00:04.910239938Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 16 05:00:04.910290 containerd[1918]: time="2025-09-16T05:00:04.910246533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 16 05:00:04.910290 containerd[1918]: time="2025-09-16T05:00:04.910253001Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 16 05:00:04.910290 containerd[1918]: time="2025-09-16T05:00:04.910265406Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 16 05:00:04.910290 containerd[1918]: time="2025-09-16T05:00:04.910271789Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 16 05:00:04.910290 containerd[1918]: time="2025-09-16T05:00:04.910280487Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910298742Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910308495Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910314107Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910319344Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910323658Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910332489Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910352352Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910364661Z" level=info msg="runtime interface created" Sep 16 05:00:04.910368 containerd[1918]: time="2025-09-16T05:00:04.910368090Z" level=info msg="created NRI interface" Sep 16 05:00:04.910480 containerd[1918]: time="2025-09-16T05:00:04.910373182Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 16 05:00:04.910480 containerd[1918]: time="2025-09-16T05:00:04.910379439Z" level=info msg="Connect containerd service" Sep 16 05:00:04.910480 
containerd[1918]: time="2025-09-16T05:00:04.910426368Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 16 05:00:04.910863 containerd[1918]: time="2025-09-16T05:00:04.910822589Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 05:00:04.913523 locksmithd[1974]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 16 05:00:04.932763 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 16 05:00:04.936956 kernel: i915 0000:00:02.0: [drm] Cannot find any crtc or sizes Sep 16 05:00:04.946645 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 16 05:00:04.956391 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 16 05:00:04.966883 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Sep 16 05:00:04.977271 systemd[1]: Reached target getty.target - Login Prompts. Sep 16 05:00:04.985661 containerd[1918]: time="2025-09-16T05:00:04.985636390Z" level=info msg="Start subscribing containerd event" Sep 16 05:00:04.985716 containerd[1918]: time="2025-09-16T05:00:04.985672933Z" level=info msg="Start recovering state" Sep 16 05:00:04.985716 containerd[1918]: time="2025-09-16T05:00:04.985686221Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 16 05:00:04.985747 containerd[1918]: time="2025-09-16T05:00:04.985720233Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 16 05:00:04.985747 containerd[1918]: time="2025-09-16T05:00:04.985724200Z" level=info msg="Start event monitor" Sep 16 05:00:04.985747 containerd[1918]: time="2025-09-16T05:00:04.985735735Z" level=info msg="Start cni network conf syncer for default" Sep 16 05:00:04.985747 containerd[1918]: time="2025-09-16T05:00:04.985739879Z" level=info msg="Start streaming server" Sep 16 05:00:04.985800 containerd[1918]: time="2025-09-16T05:00:04.985747271Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 16 05:00:04.985800 containerd[1918]: time="2025-09-16T05:00:04.985752071Z" level=info msg="runtime interface starting up..." Sep 16 05:00:04.985800 containerd[1918]: time="2025-09-16T05:00:04.985755199Z" level=info msg="starting plugins..." Sep 16 05:00:04.985800 containerd[1918]: time="2025-09-16T05:00:04.985762664Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 16 05:00:04.985857 containerd[1918]: time="2025-09-16T05:00:04.985832191Z" level=info msg="containerd successfully booted in 0.106958s" Sep 16 05:00:04.987350 systemd[1]: Started containerd.service - containerd container runtime. Sep 16 05:00:05.044954 kernel: EXT4-fs (sda9): resized filesystem to 116605649 Sep 16 05:00:05.071552 extend-filesystems[1892]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 16 05:00:05.071552 extend-filesystems[1892]: old_desc_blocks = 1, new_desc_blocks = 56 Sep 16 05:00:05.071552 extend-filesystems[1892]: The filesystem on /dev/sda9 is now 116605649 (4k) blocks long. Sep 16 05:00:05.109024 extend-filesystems[1880]: Resized filesystem in /dev/sda9 Sep 16 05:00:05.072378 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 16 05:00:05.072518 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Sep 16 05:00:05.135991 kernel: i915 0000:00:02.0: [drm] Cannot find any crtc or sizes Sep 16 05:00:06.317352 systemd-timesyncd[1835]: Network configuration changed, trying to establish connection. Sep 16 05:00:06.445199 systemd-networkd[1833]: bond0: Gained IPv6LL Sep 16 05:00:06.446135 systemd-timesyncd[1835]: Network configuration changed, trying to establish connection. Sep 16 05:00:06.451303 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 16 05:00:06.461422 systemd[1]: Reached target network-online.target - Network is Online. Sep 16 05:00:06.476321 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:00:06.497444 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 16 05:00:06.525470 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 16 05:00:07.347958 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:00:07.358720 (kubelet)[2031]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 05:00:07.458891 kernel: mlx5_core 0000:02:00.0: lag map: port 1:1 port 2:2 Sep 16 05:00:07.459029 kernel: mlx5_core 0000:02:00.0: shared_fdb:0 mode:queue_affinity Sep 16 05:00:07.562661 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 16 05:00:07.571805 systemd[1]: Started sshd@0-139.178.94.33:22-139.178.89.65:47216.service - OpenSSH per-connection server daemon (139.178.89.65:47216). Sep 16 05:00:07.646987 sshd[2040]: Accepted publickey for core from 139.178.89.65 port 47216 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:07.647713 sshd-session[2040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:07.655295 systemd-logind[1906]: New session 1 of user core. Sep 16 05:00:07.656170 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 16 05:00:07.665739 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 16 05:00:07.692262 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 16 05:00:07.703304 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 16 05:00:07.729274 (systemd)[2049]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 16 05:00:07.730852 systemd-logind[1906]: New session c1 of user core. Sep 16 05:00:07.850746 systemd[2049]: Queued start job for default target default.target. Sep 16 05:00:07.852414 kubelet[2031]: E0916 05:00:07.852395 2031 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 05:00:07.853499 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 05:00:07.853584 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 05:00:07.862292 systemd[1]: kubelet.service: Consumed 613ms CPU time, 268.2M memory peak. Sep 16 05:00:07.862721 systemd[2049]: Created slice app.slice - User Application Slice. Sep 16 05:00:07.862755 systemd[2049]: Reached target paths.target - Paths. Sep 16 05:00:07.862777 systemd[2049]: Reached target timers.target - Timers. 
Sep 16 05:00:07.863483 systemd[2049]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 16 05:00:07.869617 systemd[2049]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 16 05:00:07.869647 systemd[2049]: Reached target sockets.target - Sockets. Sep 16 05:00:07.869672 systemd[2049]: Reached target basic.target - Basic System. Sep 16 05:00:07.869695 systemd[2049]: Reached target default.target - Main User Target. Sep 16 05:00:07.869713 systemd[2049]: Startup finished in 135ms. Sep 16 05:00:07.869760 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 16 05:00:07.879087 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 16 05:00:07.948771 systemd[1]: Started sshd@1-139.178.94.33:22-139.178.89.65:47230.service - OpenSSH per-connection server daemon (139.178.89.65:47230). Sep 16 05:00:07.999915 sshd[2067]: Accepted publickey for core from 139.178.89.65 port 47230 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:08.001096 sshd-session[2067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:08.006386 systemd-logind[1906]: New session 2 of user core. Sep 16 05:00:08.026503 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 16 05:00:08.094807 sshd[2070]: Connection closed by 139.178.89.65 port 47230 Sep 16 05:00:08.094952 sshd-session[2067]: pam_unix(sshd:session): session closed for user core Sep 16 05:00:08.118653 systemd[1]: sshd@1-139.178.94.33:22-139.178.89.65:47230.service: Deactivated successfully. Sep 16 05:00:08.119755 systemd[1]: session-2.scope: Deactivated successfully. Sep 16 05:00:08.120486 systemd-logind[1906]: Session 2 logged out. Waiting for processes to exit. Sep 16 05:00:08.122047 systemd[1]: Started sshd@2-139.178.94.33:22-139.178.89.65:47240.service - OpenSSH per-connection server daemon (139.178.89.65:47240). Sep 16 05:00:08.132851 systemd-logind[1906]: Removed session 2. Sep 16 05:00:08.170395 sshd[2076]: Accepted publickey for core from 139.178.89.65 port 47240 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:08.171160 sshd-session[2076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:08.174654 systemd-logind[1906]: New session 3 of user core. Sep 16 05:00:08.185183 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 16 05:00:08.245992 sshd[2079]: Connection closed by 139.178.89.65 port 47240 Sep 16 05:00:08.246134 sshd-session[2076]: pam_unix(sshd:session): session closed for user core Sep 16 05:00:08.247902 systemd[1]: sshd@2-139.178.94.33:22-139.178.89.65:47240.service: Deactivated successfully. Sep 16 05:00:08.248893 systemd[1]: session-3.scope: Deactivated successfully. Sep 16 05:00:08.249696 systemd-logind[1906]: Session 3 logged out. Waiting for processes to exit. Sep 16 05:00:08.250484 systemd-logind[1906]: Removed session 3. Sep 16 05:00:08.516315 google_oslogin_nss_cache[1881]: oslogin_cache_refresh[1881]: Failure getting users, quitting Sep 16 05:00:08.516315 google_oslogin_nss_cache[1881]: oslogin_cache_refresh[1881]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 16 05:00:08.516182 oslogin_cache_refresh[1881]: Failure getting users, quitting Sep 16 05:00:08.517512 google_oslogin_nss_cache[1881]: oslogin_cache_refresh[1881]: Refreshing group entry cache Sep 16 05:00:08.516230 oslogin_cache_refresh[1881]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Sep 16 05:00:08.516339 oslogin_cache_refresh[1881]: Refreshing group entry cache Sep 16 05:00:08.517825 google_oslogin_nss_cache[1881]: oslogin_cache_refresh[1881]: Failure getting groups, quitting Sep 16 05:00:08.517815 oslogin_cache_refresh[1881]: Failure getting groups, quitting Sep 16 05:00:08.518016 google_oslogin_nss_cache[1881]: oslogin_cache_refresh[1881]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 16 05:00:08.517846 oslogin_cache_refresh[1881]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 16 05:00:08.521243 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 16 05:00:08.521814 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 16 05:00:08.590353 coreos-metadata[1964]: Sep 16 05:00:08.590 INFO Fetch successful Sep 16 05:00:08.607610 coreos-metadata[1873]: Sep 16 05:00:08.607 INFO Fetch successful Sep 16 05:00:08.672241 unknown[1964]: wrote ssh authorized keys file for user: core Sep 16 05:00:08.696466 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 16 05:00:08.702450 update-ssh-keys[2088]: Updated "/home/core/.ssh/authorized_keys" Sep 16 05:00:08.705523 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 16 05:00:08.716780 systemd[1]: Finished sshkeys.service. Sep 16 05:00:08.726321 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Sep 16 05:00:09.193518 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Sep 16 05:00:09.205053 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 16 05:00:09.214674 systemd[1]: Startup finished in 4.409s (kernel) + 24.083s (initrd) + 8.917s (userspace) = 37.409s. Sep 16 05:00:09.234733 login[2001]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 16 05:00:09.237796 systemd-logind[1906]: New session 4 of user core. Sep 16 05:00:09.238431 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 16 05:00:09.245908 login[2000]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 16 05:00:09.248807 systemd-logind[1906]: New session 5 of user core. Sep 16 05:00:09.249363 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 16 05:00:11.071917 systemd-timesyncd[1835]: Network configuration changed, trying to establish connection. Sep 16 05:00:17.875966 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 16 05:00:17.879494 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:00:18.158259 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:00:18.160888 (kubelet)[2129]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 05:00:18.188202 kubelet[2129]: E0916 05:00:18.188124 2129 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 05:00:18.190466 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 05:00:18.190556 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Sep 16 05:00:18.190745 systemd[1]: kubelet.service: Consumed 165ms CPU time, 116.2M memory peak. Sep 16 05:00:18.259559 systemd[1]: Started sshd@3-139.178.94.33:22-139.178.89.65:58562.service - OpenSSH per-connection server daemon (139.178.89.65:58562). Sep 16 05:00:18.332265 sshd[2148]: Accepted publickey for core from 139.178.89.65 port 58562 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:18.332860 sshd-session[2148]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:18.335631 systemd-logind[1906]: New session 6 of user core. Sep 16 05:00:18.347200 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 16 05:00:18.399989 sshd[2151]: Connection closed by 139.178.89.65 port 58562 Sep 16 05:00:18.400472 sshd-session[2148]: pam_unix(sshd:session): session closed for user core Sep 16 05:00:18.417028 systemd[1]: sshd@3-139.178.94.33:22-139.178.89.65:58562.service: Deactivated successfully. Sep 16 05:00:18.417889 systemd[1]: session-6.scope: Deactivated successfully. Sep 16 05:00:18.418491 systemd-logind[1906]: Session 6 logged out. Waiting for processes to exit. Sep 16 05:00:18.419472 systemd[1]: Started sshd@4-139.178.94.33:22-139.178.89.65:58564.service - OpenSSH per-connection server daemon (139.178.89.65:58564). Sep 16 05:00:18.420043 systemd-logind[1906]: Removed session 6. Sep 16 05:00:18.470585 sshd[2157]: Accepted publickey for core from 139.178.89.65 port 58564 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:18.472031 sshd-session[2157]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:18.479151 systemd-logind[1906]: New session 7 of user core. Sep 16 05:00:18.492425 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 16 05:00:18.550958 sshd[2161]: Connection closed by 139.178.89.65 port 58564 Sep 16 05:00:18.551112 sshd-session[2157]: pam_unix(sshd:session): session closed for user core Sep 16 05:00:18.563138 systemd[1]: sshd@4-139.178.94.33:22-139.178.89.65:58564.service: Deactivated successfully. Sep 16 05:00:18.564063 systemd[1]: session-7.scope: Deactivated successfully. Sep 16 05:00:18.564636 systemd-logind[1906]: Session 7 logged out. Waiting for processes to exit. Sep 16 05:00:18.565751 systemd[1]: Started sshd@5-139.178.94.33:22-139.178.89.65:58570.service - OpenSSH per-connection server daemon (139.178.89.65:58570). Sep 16 05:00:18.566477 systemd-logind[1906]: Removed session 7. Sep 16 05:00:18.596954 sshd[2167]: Accepted publickey for core from 139.178.89.65 port 58570 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:18.597731 sshd-session[2167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:18.601375 systemd-logind[1906]: New session 8 of user core. Sep 16 05:00:18.622165 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 16 05:00:18.686589 sshd[2171]: Connection closed by 139.178.89.65 port 58570 Sep 16 05:00:18.687329 sshd-session[2167]: pam_unix(sshd:session): session closed for user core Sep 16 05:00:18.707899 systemd[1]: sshd@5-139.178.94.33:22-139.178.89.65:58570.service: Deactivated successfully. Sep 16 05:00:18.708679 systemd[1]: session-8.scope: Deactivated successfully. Sep 16 05:00:18.709218 systemd-logind[1906]: Session 8 logged out. Waiting for processes to exit. Sep 16 05:00:18.710358 systemd[1]: Started sshd@6-139.178.94.33:22-139.178.89.65:58574.service - OpenSSH per-connection server daemon (139.178.89.65:58574). 
Sep 16 05:00:18.710735 systemd-logind[1906]: Removed session 8. Sep 16 05:00:18.760936 sshd[2177]: Accepted publickey for core from 139.178.89.65 port 58574 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:18.762291 sshd-session[2177]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:18.768887 systemd-logind[1906]: New session 9 of user core. Sep 16 05:00:18.780392 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 16 05:00:18.849450 sudo[2181]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 16 05:00:18.849601 sudo[2181]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 05:00:18.866414 sudo[2181]: pam_unix(sudo:session): session closed for user root Sep 16 05:00:18.867401 sshd[2180]: Connection closed by 139.178.89.65 port 58574 Sep 16 05:00:18.867614 sshd-session[2177]: pam_unix(sshd:session): session closed for user core Sep 16 05:00:18.882791 systemd[1]: sshd@6-139.178.94.33:22-139.178.89.65:58574.service: Deactivated successfully. Sep 16 05:00:18.884044 systemd[1]: session-9.scope: Deactivated successfully. Sep 16 05:00:18.884819 systemd-logind[1906]: Session 9 logged out. Waiting for processes to exit. Sep 16 05:00:18.886463 systemd[1]: Started sshd@7-139.178.94.33:22-139.178.89.65:58590.service - OpenSSH per-connection server daemon (139.178.89.65:58590). Sep 16 05:00:18.887386 systemd-logind[1906]: Removed session 9. Sep 16 05:00:18.932673 sshd[2187]: Accepted publickey for core from 139.178.89.65 port 58590 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:18.934002 sshd-session[2187]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:18.939547 systemd-logind[1906]: New session 10 of user core. Sep 16 05:00:18.954289 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 16 05:00:19.023483 sudo[2192]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 16 05:00:19.024355 sudo[2192]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 05:00:19.034196 sudo[2192]: pam_unix(sudo:session): session closed for user root Sep 16 05:00:19.036905 sudo[2191]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 16 05:00:19.037059 sudo[2191]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 05:00:19.042538 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 05:00:19.077004 augenrules[2214]: No rules Sep 16 05:00:19.077517 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 05:00:19.077696 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 05:00:19.078427 sudo[2191]: pam_unix(sudo:session): session closed for user root Sep 16 05:00:19.079626 sshd[2190]: Connection closed by 139.178.89.65 port 58590 Sep 16 05:00:19.079943 sshd-session[2187]: pam_unix(sshd:session): session closed for user core Sep 16 05:00:19.105850 systemd[1]: sshd@7-139.178.94.33:22-139.178.89.65:58590.service: Deactivated successfully. Sep 16 05:00:19.108657 systemd[1]: session-10.scope: Deactivated successfully. Sep 16 05:00:19.110812 systemd-logind[1906]: Session 10 logged out. Waiting for processes to exit. Sep 16 05:00:19.116473 systemd[1]: Started sshd@8-139.178.94.33:22-139.178.89.65:58598.service - OpenSSH per-connection server daemon (139.178.89.65:58598). 
Sep 16 05:00:19.118239 systemd-logind[1906]: Removed session 10. Sep 16 05:00:19.199051 sshd[2223]: Accepted publickey for core from 139.178.89.65 port 58598 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:00:19.200135 sshd-session[2223]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:00:19.204796 systemd-logind[1906]: New session 11 of user core. Sep 16 05:00:19.217186 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 16 05:00:19.270694 sudo[2227]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 16 05:00:19.270909 sudo[2227]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 05:00:19.547839 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 16 05:00:19.573337 (dockerd)[2254]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 16 05:00:19.780877 dockerd[2254]: time="2025-09-16T05:00:19.780816386Z" level=info msg="Starting up" Sep 16 05:00:19.781309 dockerd[2254]: time="2025-09-16T05:00:19.781276271Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 16 05:00:19.787378 dockerd[2254]: time="2025-09-16T05:00:19.787356110Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 16 05:00:19.810668 dockerd[2254]: time="2025-09-16T05:00:19.810624989Z" level=info msg="Loading containers: start." Sep 16 05:00:19.821953 kernel: Initializing XFRM netlink socket Sep 16 05:00:19.959877 systemd-timesyncd[1835]: Network configuration changed, trying to establish connection. Sep 16 05:00:19.993497 systemd-networkd[1833]: docker0: Link UP Sep 16 05:00:19.995479 dockerd[2254]: time="2025-09-16T05:00:19.995435847Z" level=info msg="Loading containers: done." Sep 16 05:00:20.001924 dockerd[2254]: time="2025-09-16T05:00:20.001874859Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 16 05:00:20.001924 dockerd[2254]: time="2025-09-16T05:00:20.001917049Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 16 05:00:20.002021 dockerd[2254]: time="2025-09-16T05:00:20.001962663Z" level=info msg="Initializing buildkit" Sep 16 05:00:20.013242 dockerd[2254]: time="2025-09-16T05:00:20.013214447Z" level=info msg="Completed buildkit initialization" Sep 16 05:00:20.015719 dockerd[2254]: time="2025-09-16T05:00:20.015674757Z" level=info msg="Daemon has completed initialization" Sep 16 05:00:20.015753 dockerd[2254]: time="2025-09-16T05:00:20.015726844Z" level=info msg="API listen on /run/docker.sock" Sep 16 05:00:20.015791 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 16 05:00:19.268879 systemd-resolved[1834]: Clock change detected. Flushing caches. Sep 16 05:00:19.278667 systemd-journald[1451]: Time jumped backwards, rotating. Sep 16 05:00:19.268953 systemd-timesyncd[1835]: Contacted time server [2607:f710:35::29c:0:7]:123 (2.flatcar.pool.ntp.org). Sep 16 05:00:19.268993 systemd-timesyncd[1835]: Initial clock synchronization to Tue 2025-09-16 05:00:19.268817 UTC. 
Sep 16 05:00:20.070577 containerd[1918]: time="2025-09-16T05:00:20.070545471Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\"" Sep 16 05:00:20.634132 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount581796291.mount: Deactivated successfully. Sep 16 05:00:21.359984 containerd[1918]: time="2025-09-16T05:00:21.359931332Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:21.360223 containerd[1918]: time="2025-09-16T05:00:21.360102741Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117124" Sep 16 05:00:21.360647 containerd[1918]: time="2025-09-16T05:00:21.360606833Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:21.361951 containerd[1918]: time="2025-09-16T05:00:21.361908819Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:21.362540 containerd[1918]: time="2025-09-16T05:00:21.362491788Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 1.29191796s" Sep 16 05:00:21.362540 containerd[1918]: time="2025-09-16T05:00:21.362516436Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\"" Sep 16 05:00:21.362869 containerd[1918]: time="2025-09-16T05:00:21.362815144Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\"" Sep 16 05:00:22.356307 containerd[1918]: time="2025-09-16T05:00:22.356255040Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:22.356495 containerd[1918]: time="2025-09-16T05:00:22.356450773Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716632" Sep 16 05:00:22.356903 containerd[1918]: time="2025-09-16T05:00:22.356864084Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:22.358138 containerd[1918]: time="2025-09-16T05:00:22.358098669Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:22.359044 containerd[1918]: time="2025-09-16T05:00:22.359000245Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 996.167402ms" 
Sep 16 05:00:22.359044 containerd[1918]: time="2025-09-16T05:00:22.359016861Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\"" Sep 16 05:00:22.359319 containerd[1918]: time="2025-09-16T05:00:22.359279166Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\"" Sep 16 05:00:23.132031 containerd[1918]: time="2025-09-16T05:00:23.131978117Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:23.132243 containerd[1918]: time="2025-09-16T05:00:23.132188901Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787698" Sep 16 05:00:23.132552 containerd[1918]: time="2025-09-16T05:00:23.132510600Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:23.133882 containerd[1918]: time="2025-09-16T05:00:23.133842886Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:23.134795 containerd[1918]: time="2025-09-16T05:00:23.134753907Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 775.460804ms" Sep 16 05:00:23.134795 containerd[1918]: time="2025-09-16T05:00:23.134769213Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\"" Sep 16 05:00:23.135016 containerd[1918]: time="2025-09-16T05:00:23.134983292Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\"" Sep 16 05:00:23.848514 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3667075295.mount: Deactivated successfully. 
Sep 16 05:00:24.035844 containerd[1918]: time="2025-09-16T05:00:24.035789149Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:24.035964 containerd[1918]: time="2025-09-16T05:00:24.035951912Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410252" Sep 16 05:00:24.036376 containerd[1918]: time="2025-09-16T05:00:24.036342112Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:24.037155 containerd[1918]: time="2025-09-16T05:00:24.037093958Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:24.037702 containerd[1918]: time="2025-09-16T05:00:24.037661568Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 902.663763ms" Sep 16 05:00:24.037788 containerd[1918]: time="2025-09-16T05:00:24.037776362Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\"" Sep 16 05:00:24.038274 containerd[1918]: time="2025-09-16T05:00:24.038253103Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 16 05:00:24.544504 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2008548004.mount: Deactivated successfully. 
Sep 16 05:00:25.107581 containerd[1918]: time="2025-09-16T05:00:25.107556381Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:25.107818 containerd[1918]: time="2025-09-16T05:00:25.107805569Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 16 05:00:25.108135 containerd[1918]: time="2025-09-16T05:00:25.108122362Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:25.109568 containerd[1918]: time="2025-09-16T05:00:25.109556332Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:25.110159 containerd[1918]: time="2025-09-16T05:00:25.110147035Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.07187216s" Sep 16 05:00:25.110179 containerd[1918]: time="2025-09-16T05:00:25.110163060Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 16 05:00:25.110422 containerd[1918]: time="2025-09-16T05:00:25.110410858Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 16 05:00:25.502314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1941654419.mount: Deactivated successfully. 
Sep 16 05:00:25.503347 containerd[1918]: time="2025-09-16T05:00:25.503304041Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 05:00:25.503575 containerd[1918]: time="2025-09-16T05:00:25.503534436Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 16 05:00:25.503951 containerd[1918]: time="2025-09-16T05:00:25.503910845Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 05:00:25.504831 containerd[1918]: time="2025-09-16T05:00:25.504790594Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 05:00:25.505271 containerd[1918]: time="2025-09-16T05:00:25.505229357Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 394.803859ms" Sep 16 05:00:25.505271 containerd[1918]: time="2025-09-16T05:00:25.505243109Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 16 05:00:25.505608 containerd[1918]: time="2025-09-16T05:00:25.505565116Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 16 05:00:26.013459 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1280606210.mount: Deactivated successfully. 
Sep 16 05:00:27.118415 containerd[1918]: time="2025-09-16T05:00:27.118389970Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:27.118621 containerd[1918]: time="2025-09-16T05:00:27.118606403Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709" Sep 16 05:00:27.119055 containerd[1918]: time="2025-09-16T05:00:27.119014987Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:27.120498 containerd[1918]: time="2025-09-16T05:00:27.120458401Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:27.121530 containerd[1918]: time="2025-09-16T05:00:27.121488657Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 1.615909238s" Sep 16 05:00:27.121530 containerd[1918]: time="2025-09-16T05:00:27.121505372Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 16 05:00:27.559679 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 16 05:00:27.561601 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:00:27.856632 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:00:27.858658 (kubelet)[2728]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 05:00:27.880678 kubelet[2728]: E0916 05:00:27.880602 2728 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 05:00:27.881740 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 05:00:27.881828 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 05:00:27.882000 systemd[1]: kubelet.service: Consumed 127ms CPU time, 116.7M memory peak. Sep 16 05:00:29.137413 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:00:29.137535 systemd[1]: kubelet.service: Consumed 127ms CPU time, 116.7M memory peak. Sep 16 05:00:29.138942 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:00:29.153567 systemd[1]: Reload requested from client PID 2751 ('systemctl') (unit session-11.scope)... Sep 16 05:00:29.153574 systemd[1]: Reloading... Sep 16 05:00:29.189103 zram_generator::config[2795]: No configuration found. Sep 16 05:00:29.347562 systemd[1]: Reloading finished in 193 ms. Sep 16 05:00:29.390393 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 16 05:00:29.390442 systemd[1]: kubelet.service: Failed with result 'signal'. 
Sep 16 05:00:29.390606 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:00:29.391850 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:00:29.668180 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:00:29.673821 (kubelet)[2862]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 05:00:29.694860 kubelet[2862]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 05:00:29.694860 kubelet[2862]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 16 05:00:29.694860 kubelet[2862]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 05:00:29.695123 kubelet[2862]: I0916 05:00:29.694892 2862 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 05:00:29.846292 kubelet[2862]: I0916 05:00:29.846248 2862 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 16 05:00:29.846292 kubelet[2862]: I0916 05:00:29.846262 2862 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 05:00:29.846434 kubelet[2862]: I0916 05:00:29.846394 2862 server.go:934] "Client rotation is on, will bootstrap in background" Sep 16 05:00:29.865025 kubelet[2862]: E0916 05:00:29.864987 2862 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.94.33:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.94.33:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:00:29.867719 kubelet[2862]: I0916 05:00:29.867682 2862 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 05:00:29.874454 kubelet[2862]: I0916 05:00:29.874426 2862 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 05:00:29.884366 kubelet[2862]: I0916 05:00:29.884316 2862 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 16 05:00:29.884971 kubelet[2862]: I0916 05:00:29.884934 2862 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 16 05:00:29.885054 kubelet[2862]: I0916 05:00:29.885006 2862 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 05:00:29.885183 kubelet[2862]: I0916 05:00:29.885021 2862 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.0.0-n-32926c0571","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 05:00:29.885183 kubelet[2862]: I0916 05:00:29.885158 2862 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 05:00:29.885183 kubelet[2862]: I0916 05:00:29.885164 2862 container_manager_linux.go:300] "Creating device plugin manager" Sep 16 05:00:29.885277 kubelet[2862]: I0916 05:00:29.885221 2862 state_mem.go:36] "Initialized new in-memory state store" Sep 16 05:00:29.894222 kubelet[2862]: I0916 05:00:29.894184 2862 kubelet.go:408] "Attempting to sync node with API server" Sep 16 05:00:29.894222 kubelet[2862]: I0916 05:00:29.894198 2862 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 05:00:29.894222 kubelet[2862]: I0916 05:00:29.894217 2862 kubelet.go:314] "Adding apiserver pod source" Sep 16 05:00:29.894283 kubelet[2862]: I0916 05:00:29.894227 2862 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 05:00:29.904145 kubelet[2862]: I0916 05:00:29.904102 2862 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 05:00:29.904399 kubelet[2862]: I0916 05:00:29.904365 2862 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 16 05:00:29.906053 kubelet[2862]: W0916 05:00:29.905986 2862 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://139.178.94.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.0.0-n-32926c0571&limit=500&resourceVersion=0": dial tcp 139.178.94.33:6443: connect: connection refused Sep 16 05:00:29.906053 kubelet[2862]: E0916 05:00:29.906041 2862 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.94.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.0.0-n-32926c0571&limit=500&resourceVersion=0\": dial tcp 139.178.94.33:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:00:29.907281 kubelet[2862]: W0916 05:00:29.906430 2862 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.94.33:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.94.33:6443: connect: connection refused Sep 16 05:00:29.907281 kubelet[2862]: E0916 05:00:29.907251 2862 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.94.33:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.94.33:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:00:29.907451 kubelet[2862]: W0916 05:00:29.907352 2862 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 16 05:00:29.909219 kubelet[2862]: I0916 05:00:29.909206 2862 server.go:1274] "Started kubelet" Sep 16 05:00:29.909274 kubelet[2862]: I0916 05:00:29.909233 2862 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 05:00:29.909319 kubelet[2862]: I0916 05:00:29.909272 2862 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 05:00:29.909588 kubelet[2862]: I0916 05:00:29.909578 2862 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 05:00:29.910463 kubelet[2862]: I0916 05:00:29.910453 2862 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 05:00:29.910463 kubelet[2862]: I0916 05:00:29.910458 2862 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 05:00:29.910524 kubelet[2862]: I0916 05:00:29.910487 2862 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 16 05:00:29.910524 kubelet[2862]: I0916 05:00:29.910503 2862 server.go:449] "Adding debug handlers to kubelet server" Sep 16 05:00:29.910524 kubelet[2862]: E0916 05:00:29.910509 2862 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459.0.0-n-32926c0571\" not found" Sep 16 05:00:29.910601 kubelet[2862]: I0916 05:00:29.910522 2862 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 16 05:00:29.910601 kubelet[2862]: I0916 05:00:29.910550 2862 reconciler.go:26] "Reconciler: start to sync state" Sep 16 05:00:29.910694 kubelet[2862]: E0916 05:00:29.910673 2862 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.0.0-n-32926c0571?timeout=10s\": dial tcp 139.178.94.33:6443: connect: connection refused" interval="200ms" Sep 16 05:00:29.910780 kubelet[2862]: W0916 05:00:29.910751 2862 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.94.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.94.33:6443: connect: connection refused Sep 16 05:00:29.910810 kubelet[2862]: E0916 05:00:29.910792 2862 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.94.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.94.33:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:00:29.910954 kubelet[2862]: E0916 05:00:29.910939 2862 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 05:00:29.910991 kubelet[2862]: I0916 05:00:29.910957 2862 factory.go:221] Registration of the systemd container factory successfully Sep 16 05:00:29.911029 kubelet[2862]: I0916 05:00:29.911017 2862 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 05:00:29.911527 kubelet[2862]: I0916 05:00:29.911519 2862 factory.go:221] Registration of the containerd container factory successfully Sep 16 05:00:29.917827 kubelet[2862]: E0916 05:00:29.916332 2862 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.94.33:6443/api/v1/namespaces/default/events\": dial tcp 139.178.94.33:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.0.0-n-32926c0571.1865aa9f1c4a28e2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.0.0-n-32926c0571,UID:ci-4459.0.0-n-32926c0571,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.0.0-n-32926c0571,},FirstTimestamp:2025-09-16 05:00:29.909190882 +0000 UTC m=+0.233400733,LastTimestamp:2025-09-16 05:00:29.909190882 +0000 UTC m=+0.233400733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.0.0-n-32926c0571,}" Sep 16 05:00:29.921191 kubelet[2862]: I0916 05:00:29.921151 2862 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 16 05:00:29.921191 kubelet[2862]: I0916 05:00:29.921161 2862 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 16 05:00:29.921191 kubelet[2862]: I0916 05:00:29.921172 2862 state_mem.go:36] "Initialized new in-memory state store" Sep 16 05:00:29.921919 kubelet[2862]: I0916 05:00:29.921906 2862 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 16 05:00:29.922146 kubelet[2862]: I0916 05:00:29.922138 2862 policy_none.go:49] "None policy: Start" Sep 16 05:00:29.922385 kubelet[2862]: I0916 05:00:29.922376 2862 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 16 05:00:29.922385 kubelet[2862]: I0916 05:00:29.922387 2862 state_mem.go:35] "Initializing new in-memory state store" Sep 16 05:00:29.922511 kubelet[2862]: I0916 05:00:29.922477 2862 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 16 05:00:29.922511 kubelet[2862]: I0916 05:00:29.922490 2862 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 16 05:00:29.922511 kubelet[2862]: I0916 05:00:29.922503 2862 kubelet.go:2321] "Starting kubelet main sync loop" Sep 16 05:00:29.922558 kubelet[2862]: E0916 05:00:29.922523 2862 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 05:00:29.922769 kubelet[2862]: W0916 05:00:29.922729 2862 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.94.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.94.33:6443: connect: connection refused Sep 16 05:00:29.922769 kubelet[2862]: E0916 05:00:29.922760 2862 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.94.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.94.33:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:00:29.928931 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 16 05:00:29.946025 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 16 05:00:29.948309 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 16 05:00:29.965873 kubelet[2862]: I0916 05:00:29.965827 2862 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 16 05:00:29.966026 kubelet[2862]: I0916 05:00:29.965985 2862 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 05:00:29.966026 kubelet[2862]: I0916 05:00:29.965997 2862 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 05:00:29.966149 kubelet[2862]: I0916 05:00:29.966126 2862 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 05:00:29.966992 kubelet[2862]: E0916 05:00:29.966975 2862 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.0.0-n-32926c0571\" not found" Sep 16 05:00:30.044940 systemd[1]: Created slice kubepods-burstable-podcbd58c232a940d5e9c6692593ef50d57.slice - libcontainer container kubepods-burstable-podcbd58c232a940d5e9c6692593ef50d57.slice. Sep 16 05:00:30.069645 kubelet[2862]: I0916 05:00:30.069543 2862 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.070366 kubelet[2862]: E0916 05:00:30.070269 2862 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.94.33:6443/api/v1/nodes\": dial tcp 139.178.94.33:6443: connect: connection refused" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.084440 systemd[1]: Created slice kubepods-burstable-pod609dcee35be423997775bce859c1c4fc.slice - libcontainer container kubepods-burstable-pod609dcee35be423997775bce859c1c4fc.slice. 
Sep 16 05:00:30.111606 kubelet[2862]: E0916 05:00:30.111480 2862 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.0.0-n-32926c0571?timeout=10s\": dial tcp 139.178.94.33:6443: connect: connection refused" interval="400ms" Sep 16 05:00:30.111781 kubelet[2862]: I0916 05:00:30.111742 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d980e5dbb6a3caf8d690734cff50842c-kubeconfig\") pod \"kube-scheduler-ci-4459.0.0-n-32926c0571\" (UID: \"d980e5dbb6a3caf8d690734cff50842c\") " pod="kube-system/kube-scheduler-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.111898 kubelet[2862]: I0916 05:00:30.111823 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-ca-certs\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.111992 kubelet[2862]: I0916 05:00:30.111894 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-kubeconfig\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.111992 kubelet[2862]: I0916 05:00:30.111956 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.112192 kubelet[2862]: I0916 05:00:30.112010 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cbd58c232a940d5e9c6692593ef50d57-ca-certs\") pod \"kube-apiserver-ci-4459.0.0-n-32926c0571\" (UID: \"cbd58c232a940d5e9c6692593ef50d57\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.112296 kubelet[2862]: I0916 05:00:30.112182 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cbd58c232a940d5e9c6692593ef50d57-k8s-certs\") pod \"kube-apiserver-ci-4459.0.0-n-32926c0571\" (UID: \"cbd58c232a940d5e9c6692593ef50d57\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.112379 kubelet[2862]: I0916 05:00:30.112291 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cbd58c232a940d5e9c6692593ef50d57-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.0.0-n-32926c0571\" (UID: \"cbd58c232a940d5e9c6692593ef50d57\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.112457 kubelet[2862]: I0916 05:00:30.112404 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.112438 systemd[1]: Created slice kubepods-burstable-podd980e5dbb6a3caf8d690734cff50842c.slice - libcontainer container kubepods-burstable-podd980e5dbb6a3caf8d690734cff50842c.slice. Sep 16 05:00:30.112776 kubelet[2862]: I0916 05:00:30.112506 2862 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-k8s-certs\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.274392 kubelet[2862]: I0916 05:00:30.274235 2862 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.274932 kubelet[2862]: E0916 05:00:30.274824 2862 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.94.33:6443/api/v1/nodes\": dial tcp 139.178.94.33:6443: connect: connection refused" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:30.378339 containerd[1918]: time="2025-09-16T05:00:30.378245365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.0.0-n-32926c0571,Uid:cbd58c232a940d5e9c6692593ef50d57,Namespace:kube-system,Attempt:0,}" Sep 16 05:00:30.386670 containerd[1918]: time="2025-09-16T05:00:30.386649815Z" level=info msg="connecting to shim 86c51a70b4c44123bfd560b4cca792b727300f287975258edfc7d83da7053c8c" address="unix:///run/containerd/s/2f364f80b8d7c3c9f3f425ec6a1efa420baaa88c56d5e4d2182103f00c8884e9" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:00:30.405612 containerd[1918]: time="2025-09-16T05:00:30.405587682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.0.0-n-32926c0571,Uid:609dcee35be423997775bce859c1c4fc,Namespace:kube-system,Attempt:0,}" Sep 16 05:00:30.409219 systemd[1]: Started cri-containerd-86c51a70b4c44123bfd560b4cca792b727300f287975258edfc7d83da7053c8c.scope - libcontainer container 86c51a70b4c44123bfd560b4cca792b727300f287975258edfc7d83da7053c8c. Sep 16 05:00:30.413682 containerd[1918]: time="2025-09-16T05:00:30.413649943Z" level=info msg="connecting to shim 6b3658ef6c1cde3ba407227c9bf6ed1380eb4b0867379dc60a4921562ea0a2e4" address="unix:///run/containerd/s/1588fa09274164ebc75c694d4e800370a6e0946b3059fe3ecab7909e5b2d3d2f" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:00:30.418414 containerd[1918]: time="2025-09-16T05:00:30.418390534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.0.0-n-32926c0571,Uid:d980e5dbb6a3caf8d690734cff50842c,Namespace:kube-system,Attempt:0,}" Sep 16 05:00:30.421855 systemd[1]: Started cri-containerd-6b3658ef6c1cde3ba407227c9bf6ed1380eb4b0867379dc60a4921562ea0a2e4.scope - libcontainer container 6b3658ef6c1cde3ba407227c9bf6ed1380eb4b0867379dc60a4921562ea0a2e4. 
Sep 16 05:00:30.425117 containerd[1918]: time="2025-09-16T05:00:30.425089139Z" level=info msg="connecting to shim 52b6dc2f95ea295a05a300b2e452e2af5cb4d6b3cad8ea09c27928ea30d0664a" address="unix:///run/containerd/s/8c0dd3cd14c6e2d280d8e48887c134d810470ec193343a50abcccd218ac25858" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:00:30.433770 systemd[1]: Started cri-containerd-52b6dc2f95ea295a05a300b2e452e2af5cb4d6b3cad8ea09c27928ea30d0664a.scope - libcontainer container 52b6dc2f95ea295a05a300b2e452e2af5cb4d6b3cad8ea09c27928ea30d0664a. Sep 16 05:00:30.439011 containerd[1918]: time="2025-09-16T05:00:30.438987346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.0.0-n-32926c0571,Uid:cbd58c232a940d5e9c6692593ef50d57,Namespace:kube-system,Attempt:0,} returns sandbox id \"86c51a70b4c44123bfd560b4cca792b727300f287975258edfc7d83da7053c8c\"" Sep 16 05:00:30.440416 containerd[1918]: time="2025-09-16T05:00:30.440403943Z" level=info msg="CreateContainer within sandbox \"86c51a70b4c44123bfd560b4cca792b727300f287975258edfc7d83da7053c8c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 16 05:00:30.443913 containerd[1918]: time="2025-09-16T05:00:30.443896050Z" level=info msg="Container a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:00:30.447158 containerd[1918]: time="2025-09-16T05:00:30.447113258Z" level=info msg="CreateContainer within sandbox \"86c51a70b4c44123bfd560b4cca792b727300f287975258edfc7d83da7053c8c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d\"" Sep 16 05:00:30.447589 containerd[1918]: time="2025-09-16T05:00:30.447546615Z" level=info msg="StartContainer for \"a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d\"" Sep 16 05:00:30.448133 containerd[1918]: time="2025-09-16T05:00:30.448120795Z" level=info msg="connecting to shim a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d" address="unix:///run/containerd/s/2f364f80b8d7c3c9f3f425ec6a1efa420baaa88c56d5e4d2182103f00c8884e9" protocol=ttrpc version=3 Sep 16 05:00:30.453238 containerd[1918]: time="2025-09-16T05:00:30.453212458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.0.0-n-32926c0571,Uid:609dcee35be423997775bce859c1c4fc,Namespace:kube-system,Attempt:0,} returns sandbox id \"6b3658ef6c1cde3ba407227c9bf6ed1380eb4b0867379dc60a4921562ea0a2e4\"" Sep 16 05:00:30.454307 containerd[1918]: time="2025-09-16T05:00:30.454293461Z" level=info msg="CreateContainer within sandbox \"6b3658ef6c1cde3ba407227c9bf6ed1380eb4b0867379dc60a4921562ea0a2e4\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 16 05:00:30.457232 containerd[1918]: time="2025-09-16T05:00:30.457212550Z" level=info msg="Container 40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:00:30.459828 containerd[1918]: time="2025-09-16T05:00:30.459806750Z" level=info msg="CreateContainer within sandbox \"6b3658ef6c1cde3ba407227c9bf6ed1380eb4b0867379dc60a4921562ea0a2e4\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723\"" Sep 16 05:00:30.460057 containerd[1918]: time="2025-09-16T05:00:30.460044985Z" level=info msg="StartContainer for \"40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723\"" 
Sep 16 05:00:30.460574 containerd[1918]: time="2025-09-16T05:00:30.460561754Z" level=info msg="connecting to shim 40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723" address="unix:///run/containerd/s/1588fa09274164ebc75c694d4e800370a6e0946b3059fe3ecab7909e5b2d3d2f" protocol=ttrpc version=3 Sep 16 05:00:30.465194 systemd[1]: Started cri-containerd-a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d.scope - libcontainer container a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d. Sep 16 05:00:30.467201 systemd[1]: Started cri-containerd-40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723.scope - libcontainer container 40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723. Sep 16 05:00:30.481117 containerd[1918]: time="2025-09-16T05:00:30.481068405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.0.0-n-32926c0571,Uid:d980e5dbb6a3caf8d690734cff50842c,Namespace:kube-system,Attempt:0,} returns sandbox id \"52b6dc2f95ea295a05a300b2e452e2af5cb4d6b3cad8ea09c27928ea30d0664a\"" Sep 16 05:00:30.482098 containerd[1918]: time="2025-09-16T05:00:30.482083197Z" level=info msg="CreateContainer within sandbox \"52b6dc2f95ea295a05a300b2e452e2af5cb4d6b3cad8ea09c27928ea30d0664a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 16 05:00:30.485393 containerd[1918]: time="2025-09-16T05:00:30.485350993Z" level=info msg="Container f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:00:30.488687 containerd[1918]: time="2025-09-16T05:00:30.488291642Z" level=info msg="CreateContainer within sandbox \"52b6dc2f95ea295a05a300b2e452e2af5cb4d6b3cad8ea09c27928ea30d0664a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89\"" Sep 16 05:00:30.488949 containerd[1918]: time="2025-09-16T05:00:30.488933485Z" level=info msg="StartContainer for \"f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89\"" Sep 16 05:00:30.489525 containerd[1918]: time="2025-09-16T05:00:30.489513272Z" level=info msg="connecting to shim f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89" address="unix:///run/containerd/s/8c0dd3cd14c6e2d280d8e48887c134d810470ec193343a50abcccd218ac25858" protocol=ttrpc version=3 Sep 16 05:00:30.492272 containerd[1918]: time="2025-09-16T05:00:30.492248107Z" level=info msg="StartContainer for \"a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d\" returns successfully" Sep 16 05:00:30.495615 containerd[1918]: time="2025-09-16T05:00:30.495593814Z" level=info msg="StartContainer for \"40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723\" returns successfully" Sep 16 05:00:30.506239 systemd[1]: Started cri-containerd-f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89.scope - libcontainer container f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89. 
Sep 16 05:00:30.512179 kubelet[2862]: E0916 05:00:30.512150 2862 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.0.0-n-32926c0571?timeout=10s\": dial tcp 139.178.94.33:6443: connect: connection refused" interval="800ms" Sep 16 05:00:30.534834 containerd[1918]: time="2025-09-16T05:00:30.534776885Z" level=info msg="StartContainer for \"f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89\" returns successfully" Sep 16 05:00:30.677015 kubelet[2862]: I0916 05:00:30.676997 2862 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:31.176209 kubelet[2862]: I0916 05:00:31.176188 2862 kubelet_node_status.go:75] "Successfully registered node" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:31.176209 kubelet[2862]: E0916 05:00:31.176211 2862 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4459.0.0-n-32926c0571\": node \"ci-4459.0.0-n-32926c0571\" not found" Sep 16 05:00:31.180909 kubelet[2862]: E0916 05:00:31.180892 2862 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459.0.0-n-32926c0571\" not found" Sep 16 05:00:31.895734 kubelet[2862]: I0916 05:00:31.895622 2862 apiserver.go:52] "Watching apiserver" Sep 16 05:00:31.910901 kubelet[2862]: I0916 05:00:31.910819 2862 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 16 05:00:31.929097 kubelet[2862]: E0916 05:00:31.929081 2862 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4459.0.0-n-32926c0571\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" Sep 16 05:00:32.935728 kubelet[2862]: W0916 05:00:32.935659 2862 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:00:33.123678 kubelet[2862]: W0916 05:00:33.123603 2862 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:00:33.616598 systemd[1]: Reload requested from client PID 3180 ('systemctl') (unit session-11.scope)... Sep 16 05:00:33.616606 systemd[1]: Reloading... Sep 16 05:00:33.651097 zram_generator::config[3225]: No configuration found. Sep 16 05:00:33.819306 systemd[1]: Reloading finished in 202 ms. Sep 16 05:00:33.835401 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:00:33.843723 systemd[1]: kubelet.service: Deactivated successfully. Sep 16 05:00:33.843845 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:00:33.843869 systemd[1]: kubelet.service: Consumed 675ms CPU time, 139.6M memory peak. Sep 16 05:00:33.845175 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:00:34.136730 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:00:34.139403 (kubelet)[3290]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 05:00:34.160311 kubelet[3290]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 05:00:34.160311 kubelet[3290]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 16 05:00:34.160311 kubelet[3290]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 05:00:34.160529 kubelet[3290]: I0916 05:00:34.160316 3290 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 05:00:34.163741 kubelet[3290]: I0916 05:00:34.163727 3290 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 16 05:00:34.163741 kubelet[3290]: I0916 05:00:34.163739 3290 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 05:00:34.163893 kubelet[3290]: I0916 05:00:34.163886 3290 server.go:934] "Client rotation is on, will bootstrap in background" Sep 16 05:00:34.164709 kubelet[3290]: I0916 05:00:34.164658 3290 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 16 05:00:34.165837 kubelet[3290]: I0916 05:00:34.165805 3290 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 05:00:34.168031 kubelet[3290]: I0916 05:00:34.168018 3290 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 05:00:34.175193 kubelet[3290]: I0916 05:00:34.175183 3290 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 16 05:00:34.175258 kubelet[3290]: I0916 05:00:34.175249 3290 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 16 05:00:34.175353 kubelet[3290]: I0916 05:00:34.175336 3290 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 05:00:34.175483 kubelet[3290]: I0916 05:00:34.175353 3290 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.0.0-n-32926c0571","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 05:00:34.175565 kubelet[3290]: I0916 05:00:34.175490 3290 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 05:00:34.175565 kubelet[3290]: I0916 05:00:34.175500 3290 container_manager_linux.go:300] "Creating device plugin manager" Sep 16 05:00:34.175565 kubelet[3290]: I0916 05:00:34.175521 3290 state_mem.go:36] "Initialized new in-memory state store" Sep 16 05:00:34.175643 kubelet[3290]: I0916 05:00:34.175596 3290 kubelet.go:408] "Attempting to sync node with API server" Sep 16 05:00:34.175643 kubelet[3290]: I0916 05:00:34.175609 3290 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 05:00:34.175643 kubelet[3290]: I0916 05:00:34.175633 3290 kubelet.go:314] "Adding apiserver pod source" Sep 16 05:00:34.175643 kubelet[3290]: I0916 05:00:34.175642 3290 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 05:00:34.175946 kubelet[3290]: I0916 05:00:34.175933 3290 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 05:00:34.176409 kubelet[3290]: I0916 05:00:34.176399 3290 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 16 05:00:34.176728 kubelet[3290]: I0916 05:00:34.176720 3290 server.go:1274] "Started kubelet" Sep 16 05:00:34.176784 kubelet[3290]: I0916 05:00:34.176768 3290 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 05:00:34.176854 
kubelet[3290]: I0916 05:00:34.176819 3290 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 05:00:34.176984 kubelet[3290]: I0916 05:00:34.176975 3290 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 05:00:34.177826 kubelet[3290]: I0916 05:00:34.177674 3290 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 05:00:34.177826 kubelet[3290]: I0916 05:00:34.177796 3290 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 05:00:34.177925 kubelet[3290]: I0916 05:00:34.177837 3290 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 16 05:00:34.177925 kubelet[3290]: I0916 05:00:34.177847 3290 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 16 05:00:34.177925 kubelet[3290]: E0916 05:00:34.177836 3290 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459.0.0-n-32926c0571\" not found" Sep 16 05:00:34.178323 kubelet[3290]: I0916 05:00:34.178313 3290 reconciler.go:26] "Reconciler: start to sync state" Sep 16 05:00:34.179046 kubelet[3290]: I0916 05:00:34.179027 3290 server.go:449] "Adding debug handlers to kubelet server" Sep 16 05:00:34.179179 kubelet[3290]: I0916 05:00:34.179158 3290 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 05:00:34.179387 kubelet[3290]: E0916 05:00:34.179371 3290 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 05:00:34.179752 kubelet[3290]: I0916 05:00:34.179741 3290 factory.go:221] Registration of the containerd container factory successfully Sep 16 05:00:34.179752 kubelet[3290]: I0916 05:00:34.179752 3290 factory.go:221] Registration of the systemd container factory successfully Sep 16 05:00:34.183937 kubelet[3290]: I0916 05:00:34.183909 3290 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 16 05:00:34.184555 kubelet[3290]: I0916 05:00:34.184547 3290 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 16 05:00:34.184594 kubelet[3290]: I0916 05:00:34.184561 3290 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 16 05:00:34.184594 kubelet[3290]: I0916 05:00:34.184573 3290 kubelet.go:2321] "Starting kubelet main sync loop" Sep 16 05:00:34.184631 kubelet[3290]: E0916 05:00:34.184596 3290 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 05:00:34.194991 kubelet[3290]: I0916 05:00:34.194946 3290 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 16 05:00:34.194991 kubelet[3290]: I0916 05:00:34.194958 3290 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 16 05:00:34.194991 kubelet[3290]: I0916 05:00:34.194972 3290 state_mem.go:36] "Initialized new in-memory state store" Sep 16 05:00:34.195091 kubelet[3290]: I0916 05:00:34.195074 3290 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 16 05:00:34.195091 kubelet[3290]: I0916 05:00:34.195081 3290 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 16 05:00:34.195129 kubelet[3290]: I0916 05:00:34.195094 3290 policy_none.go:49] "None policy: Start" Sep 16 05:00:34.195348 kubelet[3290]: I0916 05:00:34.195307 3290 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 16 05:00:34.195348 kubelet[3290]: I0916 05:00:34.195316 3290 state_mem.go:35] "Initializing new in-memory state store" Sep 16 05:00:34.195398 kubelet[3290]: I0916 05:00:34.195387 3290 state_mem.go:75] "Updated machine memory state" Sep 16 05:00:34.197527 kubelet[3290]: I0916 05:00:34.197517 3290 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 16 05:00:34.197608 kubelet[3290]: I0916 05:00:34.197602 3290 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 05:00:34.197638 kubelet[3290]: I0916 05:00:34.197609 3290 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 05:00:34.197702 kubelet[3290]: I0916 05:00:34.197694 3290 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 05:00:34.293067 kubelet[3290]: W0916 05:00:34.292944 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:00:34.293962 kubelet[3290]: W0916 05:00:34.293913 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:00:34.294152 kubelet[3290]: W0916 05:00:34.293986 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:00:34.294152 kubelet[3290]: E0916 05:00:34.294065 3290 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" already exists" pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.294346 kubelet[3290]: E0916 05:00:34.294150 3290 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4459.0.0-n-32926c0571\" already exists" pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.304172 kubelet[3290]: I0916 05:00:34.304119 3290 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.312662 
kubelet[3290]: I0916 05:00:34.312599 3290 kubelet_node_status.go:111] "Node was previously registered" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.312920 kubelet[3290]: I0916 05:00:34.312781 3290 kubelet_node_status.go:75] "Successfully registered node" node="ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.378802 kubelet[3290]: I0916 05:00:34.378705 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cbd58c232a940d5e9c6692593ef50d57-k8s-certs\") pod \"kube-apiserver-ci-4459.0.0-n-32926c0571\" (UID: \"cbd58c232a940d5e9c6692593ef50d57\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.378802 kubelet[3290]: I0916 05:00:34.378779 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.379121 kubelet[3290]: I0916 05:00:34.378844 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.379121 kubelet[3290]: I0916 05:00:34.378891 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cbd58c232a940d5e9c6692593ef50d57-ca-certs\") pod \"kube-apiserver-ci-4459.0.0-n-32926c0571\" (UID: \"cbd58c232a940d5e9c6692593ef50d57\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.379121 kubelet[3290]: I0916 05:00:34.378949 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cbd58c232a940d5e9c6692593ef50d57-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.0.0-n-32926c0571\" (UID: \"cbd58c232a940d5e9c6692593ef50d57\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.379121 kubelet[3290]: I0916 05:00:34.378993 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-ca-certs\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.379452 kubelet[3290]: I0916 05:00:34.379110 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-k8s-certs\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.379452 kubelet[3290]: I0916 05:00:34.379189 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/609dcee35be423997775bce859c1c4fc-kubeconfig\") pod \"kube-controller-manager-ci-4459.0.0-n-32926c0571\" (UID: \"609dcee35be423997775bce859c1c4fc\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" Sep 16 05:00:34.379452 kubelet[3290]: I0916 05:00:34.379250 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d980e5dbb6a3caf8d690734cff50842c-kubeconfig\") pod \"kube-scheduler-ci-4459.0.0-n-32926c0571\" (UID: \"d980e5dbb6a3caf8d690734cff50842c\") " pod="kube-system/kube-scheduler-ci-4459.0.0-n-32926c0571" Sep 16 05:00:35.176807 kubelet[3290]: I0916 05:00:35.176787 3290 apiserver.go:52] "Watching apiserver" Sep 16 05:00:35.177907 kubelet[3290]: I0916 05:00:35.177895 3290 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 16 05:00:35.204624 kubelet[3290]: I0916 05:00:35.204591 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.0.0-n-32926c0571" podStartSLOduration=2.204580373 podStartE2EDuration="2.204580373s" podCreationTimestamp="2025-09-16 05:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:00:35.20455562 +0000 UTC m=+1.063084867" watchObservedRunningTime="2025-09-16 05:00:35.204580373 +0000 UTC m=+1.063109617" Sep 16 05:00:35.204730 kubelet[3290]: I0916 05:00:35.204649 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.0.0-n-32926c0571" podStartSLOduration=1.204645868 podStartE2EDuration="1.204645868s" podCreationTimestamp="2025-09-16 05:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:00:35.199821846 +0000 UTC m=+1.058351094" watchObservedRunningTime="2025-09-16 05:00:35.204645868 +0000 UTC m=+1.063175112" Sep 16 05:00:35.208097 kubelet[3290]: I0916 05:00:35.208044 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.0.0-n-32926c0571" podStartSLOduration=3.208031846 podStartE2EDuration="3.208031846s" podCreationTimestamp="2025-09-16 05:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:00:35.207980063 +0000 UTC m=+1.066509311" watchObservedRunningTime="2025-09-16 05:00:35.208031846 +0000 UTC m=+1.066561094" Sep 16 05:00:39.021952 kubelet[3290]: I0916 05:00:39.021842 3290 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 16 05:00:39.022764 containerd[1918]: time="2025-09-16T05:00:39.022498486Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 16 05:00:39.023359 kubelet[3290]: I0916 05:00:39.022964 3290 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 16 05:00:39.969888 systemd[1]: Created slice kubepods-besteffort-pod37e28c03_0076_45b9_8040_d5bf35a09f1f.slice - libcontainer container kubepods-besteffort-pod37e28c03_0076_45b9_8040_d5bf35a09f1f.slice. 
Sep 16 05:00:40.020929 kubelet[3290]: I0916 05:00:40.020819 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w6ck\" (UniqueName: \"kubernetes.io/projected/37e28c03-0076-45b9-8040-d5bf35a09f1f-kube-api-access-4w6ck\") pod \"kube-proxy-588w7\" (UID: \"37e28c03-0076-45b9-8040-d5bf35a09f1f\") " pod="kube-system/kube-proxy-588w7" Sep 16 05:00:40.020929 kubelet[3290]: I0916 05:00:40.020912 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/37e28c03-0076-45b9-8040-d5bf35a09f1f-kube-proxy\") pod \"kube-proxy-588w7\" (UID: \"37e28c03-0076-45b9-8040-d5bf35a09f1f\") " pod="kube-system/kube-proxy-588w7" Sep 16 05:00:40.021259 kubelet[3290]: I0916 05:00:40.020969 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/37e28c03-0076-45b9-8040-d5bf35a09f1f-xtables-lock\") pod \"kube-proxy-588w7\" (UID: \"37e28c03-0076-45b9-8040-d5bf35a09f1f\") " pod="kube-system/kube-proxy-588w7" Sep 16 05:00:40.021259 kubelet[3290]: I0916 05:00:40.021014 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37e28c03-0076-45b9-8040-d5bf35a09f1f-lib-modules\") pod \"kube-proxy-588w7\" (UID: \"37e28c03-0076-45b9-8040-d5bf35a09f1f\") " pod="kube-system/kube-proxy-588w7" Sep 16 05:00:40.089923 systemd[1]: Created slice kubepods-besteffort-podf229c984_b7be_4b56_8751_98d09d84766a.slice - libcontainer container kubepods-besteffort-podf229c984_b7be_4b56_8751_98d09d84766a.slice. Sep 16 05:00:40.121291 kubelet[3290]: I0916 05:00:40.121229 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f229c984-b7be-4b56-8751-98d09d84766a-var-lib-calico\") pod \"tigera-operator-58fc44c59b-bcsj8\" (UID: \"f229c984-b7be-4b56-8751-98d09d84766a\") " pod="tigera-operator/tigera-operator-58fc44c59b-bcsj8" Sep 16 05:00:40.121291 kubelet[3290]: I0916 05:00:40.121279 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sz4j\" (UniqueName: \"kubernetes.io/projected/f229c984-b7be-4b56-8751-98d09d84766a-kube-api-access-9sz4j\") pod \"tigera-operator-58fc44c59b-bcsj8\" (UID: \"f229c984-b7be-4b56-8751-98d09d84766a\") " pod="tigera-operator/tigera-operator-58fc44c59b-bcsj8" Sep 16 05:00:40.291687 containerd[1918]: time="2025-09-16T05:00:40.291501854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-588w7,Uid:37e28c03-0076-45b9-8040-d5bf35a09f1f,Namespace:kube-system,Attempt:0,}" Sep 16 05:00:40.300916 containerd[1918]: time="2025-09-16T05:00:40.300846943Z" level=info msg="connecting to shim 7d04b7e6b9ef275999821dbebda144b7788e3f759b6a4b13f159254c27f8487e" address="unix:///run/containerd/s/be0bccaceb557f006916bdc6839edeca59bb7d26a5344908d65c05e7a3a0a77c" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:00:40.322311 systemd[1]: Started cri-containerd-7d04b7e6b9ef275999821dbebda144b7788e3f759b6a4b13f159254c27f8487e.scope - libcontainer container 7d04b7e6b9ef275999821dbebda144b7788e3f759b6a4b13f159254c27f8487e. 
Sep 16 05:00:40.335721 containerd[1918]: time="2025-09-16T05:00:40.335679452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-588w7,Uid:37e28c03-0076-45b9-8040-d5bf35a09f1f,Namespace:kube-system,Attempt:0,} returns sandbox id \"7d04b7e6b9ef275999821dbebda144b7788e3f759b6a4b13f159254c27f8487e\"" Sep 16 05:00:40.336880 containerd[1918]: time="2025-09-16T05:00:40.336864085Z" level=info msg="CreateContainer within sandbox \"7d04b7e6b9ef275999821dbebda144b7788e3f759b6a4b13f159254c27f8487e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 16 05:00:40.341041 containerd[1918]: time="2025-09-16T05:00:40.341017982Z" level=info msg="Container 8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:00:40.345297 containerd[1918]: time="2025-09-16T05:00:40.345254322Z" level=info msg="CreateContainer within sandbox \"7d04b7e6b9ef275999821dbebda144b7788e3f759b6a4b13f159254c27f8487e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13\"" Sep 16 05:00:40.345798 containerd[1918]: time="2025-09-16T05:00:40.345750454Z" level=info msg="StartContainer for \"8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13\"" Sep 16 05:00:40.346574 containerd[1918]: time="2025-09-16T05:00:40.346540824Z" level=info msg="connecting to shim 8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13" address="unix:///run/containerd/s/be0bccaceb557f006916bdc6839edeca59bb7d26a5344908d65c05e7a3a0a77c" protocol=ttrpc version=3 Sep 16 05:00:40.377226 systemd[1]: Started cri-containerd-8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13.scope - libcontainer container 8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13. Sep 16 05:00:40.392892 containerd[1918]: time="2025-09-16T05:00:40.392838773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-bcsj8,Uid:f229c984-b7be-4b56-8751-98d09d84766a,Namespace:tigera-operator,Attempt:0,}" Sep 16 05:00:40.399693 containerd[1918]: time="2025-09-16T05:00:40.399667463Z" level=info msg="connecting to shim 2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d" address="unix:///run/containerd/s/93ffd8ef5957751d11a0411875c5b090d60b062574e081b389feef34e884a974" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:00:40.401095 containerd[1918]: time="2025-09-16T05:00:40.401075264Z" level=info msg="StartContainer for \"8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13\" returns successfully" Sep 16 05:00:40.417152 systemd[1]: Started cri-containerd-2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d.scope - libcontainer container 2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d. 
Sep 16 05:00:40.442532 containerd[1918]: time="2025-09-16T05:00:40.442507091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-bcsj8,Uid:f229c984-b7be-4b56-8751-98d09d84766a,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d\"" Sep 16 05:00:40.443215 containerd[1918]: time="2025-09-16T05:00:40.443202993Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 16 05:00:41.221496 kubelet[3290]: I0916 05:00:41.221466 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-588w7" podStartSLOduration=2.221452317 podStartE2EDuration="2.221452317s" podCreationTimestamp="2025-09-16 05:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:00:41.221414106 +0000 UTC m=+7.079943355" watchObservedRunningTime="2025-09-16 05:00:41.221452317 +0000 UTC m=+7.079981570" Sep 16 05:00:42.063592 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3473720897.mount: Deactivated successfully. Sep 16 05:00:42.322559 containerd[1918]: time="2025-09-16T05:00:42.322471462Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:42.322781 containerd[1918]: time="2025-09-16T05:00:42.322601451Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 16 05:00:42.323083 containerd[1918]: time="2025-09-16T05:00:42.323031011Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:42.323969 containerd[1918]: time="2025-09-16T05:00:42.323925906Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:42.324344 containerd[1918]: time="2025-09-16T05:00:42.324301736Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.881082314s" Sep 16 05:00:42.324344 containerd[1918]: time="2025-09-16T05:00:42.324318006Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 16 05:00:42.325296 containerd[1918]: time="2025-09-16T05:00:42.325244624Z" level=info msg="CreateContainer within sandbox \"2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 16 05:00:42.328301 containerd[1918]: time="2025-09-16T05:00:42.328258416Z" level=info msg="Container a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:00:42.330878 containerd[1918]: time="2025-09-16T05:00:42.330836541Z" level=info msg="CreateContainer within sandbox \"2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id 
\"a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47\"" Sep 16 05:00:42.331093 containerd[1918]: time="2025-09-16T05:00:42.331081090Z" level=info msg="StartContainer for \"a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47\"" Sep 16 05:00:42.331495 containerd[1918]: time="2025-09-16T05:00:42.331481538Z" level=info msg="connecting to shim a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47" address="unix:///run/containerd/s/93ffd8ef5957751d11a0411875c5b090d60b062574e081b389feef34e884a974" protocol=ttrpc version=3 Sep 16 05:00:42.350333 systemd[1]: Started cri-containerd-a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47.scope - libcontainer container a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47. Sep 16 05:00:42.364635 containerd[1918]: time="2025-09-16T05:00:42.364581139Z" level=info msg="StartContainer for \"a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47\" returns successfully" Sep 16 05:00:43.240160 kubelet[3290]: I0916 05:00:43.240100 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-bcsj8" podStartSLOduration=1.358389613 podStartE2EDuration="3.240088751s" podCreationTimestamp="2025-09-16 05:00:40 +0000 UTC" firstStartedPulling="2025-09-16 05:00:40.443015976 +0000 UTC m=+6.301545223" lastFinishedPulling="2025-09-16 05:00:42.324715113 +0000 UTC m=+8.183244361" observedRunningTime="2025-09-16 05:00:43.2400853 +0000 UTC m=+9.098614550" watchObservedRunningTime="2025-09-16 05:00:43.240088751 +0000 UTC m=+9.098617997" Sep 16 05:00:43.772442 systemd[1]: cri-containerd-a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47.scope: Deactivated successfully. Sep 16 05:00:43.773643 containerd[1918]: time="2025-09-16T05:00:43.773617591Z" level=info msg="received exit event container_id:\"a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47\" id:\"a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47\" pid:3657 exit_status:1 exited_at:{seconds:1757998843 nanos:773340500}" Sep 16 05:00:43.773899 containerd[1918]: time="2025-09-16T05:00:43.773721174Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47\" id:\"a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47\" pid:3657 exit_status:1 exited_at:{seconds:1757998843 nanos:773340500}" Sep 16 05:00:43.786871 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47-rootfs.mount: Deactivated successfully. 
Sep 16 05:00:45.221077 kubelet[3290]: I0916 05:00:45.220999 3290 scope.go:117] "RemoveContainer" containerID="a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47" Sep 16 05:00:45.224854 containerd[1918]: time="2025-09-16T05:00:45.224754454Z" level=info msg="CreateContainer within sandbox \"2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 16 05:00:45.230517 containerd[1918]: time="2025-09-16T05:00:45.230480202Z" level=info msg="Container 90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:00:45.233269 containerd[1918]: time="2025-09-16T05:00:45.233251777Z" level=info msg="CreateContainer within sandbox \"2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db\"" Sep 16 05:00:45.233655 containerd[1918]: time="2025-09-16T05:00:45.233593755Z" level=info msg="StartContainer for \"90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db\"" Sep 16 05:00:45.234758 containerd[1918]: time="2025-09-16T05:00:45.234626099Z" level=info msg="connecting to shim 90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db" address="unix:///run/containerd/s/93ffd8ef5957751d11a0411875c5b090d60b062574e081b389feef34e884a974" protocol=ttrpc version=3 Sep 16 05:00:45.256156 systemd[1]: Started cri-containerd-90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db.scope - libcontainer container 90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db. Sep 16 05:00:45.271731 containerd[1918]: time="2025-09-16T05:00:45.271679681Z" level=info msg="StartContainer for \"90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db\" returns successfully" Sep 16 05:00:46.574342 sudo[2227]: pam_unix(sudo:session): session closed for user root Sep 16 05:00:46.575110 sshd[2226]: Connection closed by 139.178.89.65 port 58598 Sep 16 05:00:46.575267 sshd-session[2223]: pam_unix(sshd:session): session closed for user core Sep 16 05:00:46.577014 systemd[1]: sshd@8-139.178.94.33:22-139.178.89.65:58598.service: Deactivated successfully. Sep 16 05:00:46.578102 systemd[1]: session-11.scope: Deactivated successfully. Sep 16 05:00:46.578205 systemd[1]: session-11.scope: Consumed 3.296s CPU time, 222.9M memory peak. Sep 16 05:00:46.579270 systemd-logind[1906]: Session 11 logged out. Waiting for processes to exit. Sep 16 05:00:46.579912 systemd-logind[1906]: Removed session 11. Sep 16 05:00:49.049206 update_engine[1911]: I20250916 05:00:49.049094 1911 update_attempter.cc:509] Updating boot flags... Sep 16 05:00:50.242600 systemd[1]: Created slice kubepods-besteffort-pod7bc9079b_5776_4a66_9707_d7625c0fcc36.slice - libcontainer container kubepods-besteffort-pod7bc9079b_5776_4a66_9707_d7625c0fcc36.slice. 
Sep 16 05:00:50.293147 kubelet[3290]: I0916 05:00:50.293082 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5llpz\" (UniqueName: \"kubernetes.io/projected/7bc9079b-5776-4a66-9707-d7625c0fcc36-kube-api-access-5llpz\") pod \"calico-typha-565644bf4b-nbmql\" (UID: \"7bc9079b-5776-4a66-9707-d7625c0fcc36\") " pod="calico-system/calico-typha-565644bf4b-nbmql" Sep 16 05:00:50.293147 kubelet[3290]: I0916 05:00:50.293129 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/7bc9079b-5776-4a66-9707-d7625c0fcc36-typha-certs\") pod \"calico-typha-565644bf4b-nbmql\" (UID: \"7bc9079b-5776-4a66-9707-d7625c0fcc36\") " pod="calico-system/calico-typha-565644bf4b-nbmql" Sep 16 05:00:50.293147 kubelet[3290]: I0916 05:00:50.293154 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bc9079b-5776-4a66-9707-d7625c0fcc36-tigera-ca-bundle\") pod \"calico-typha-565644bf4b-nbmql\" (UID: \"7bc9079b-5776-4a66-9707-d7625c0fcc36\") " pod="calico-system/calico-typha-565644bf4b-nbmql" Sep 16 05:00:50.546940 containerd[1918]: time="2025-09-16T05:00:50.546751881Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-565644bf4b-nbmql,Uid:7bc9079b-5776-4a66-9707-d7625c0fcc36,Namespace:calico-system,Attempt:0,}" Sep 16 05:00:50.555413 containerd[1918]: time="2025-09-16T05:00:50.555387358Z" level=info msg="connecting to shim 493efbb2f4d7b80e0de96d4d83a42456f24e9383ea677c2b9f8b2528e52a8401" address="unix:///run/containerd/s/e0c074240008f9ab15a8060d39e89437f7e1387691f4dfdf5c9f35f595524647" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:00:50.579269 systemd[1]: Started cri-containerd-493efbb2f4d7b80e0de96d4d83a42456f24e9383ea677c2b9f8b2528e52a8401.scope - libcontainer container 493efbb2f4d7b80e0de96d4d83a42456f24e9383ea677c2b9f8b2528e52a8401. Sep 16 05:00:50.582462 systemd[1]: Created slice kubepods-besteffort-podb857d537_af93_4a8b_b14a_9c5dac0dbcdf.slice - libcontainer container kubepods-besteffort-podb857d537_af93_4a8b_b14a_9c5dac0dbcdf.slice. 
Sep 16 05:00:50.595569 kubelet[3290]: I0916 05:00:50.595522 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-cni-net-dir\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595569 kubelet[3290]: I0916 05:00:50.595545 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjxzt\" (UniqueName: \"kubernetes.io/projected/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-kube-api-access-mjxzt\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595569 kubelet[3290]: I0916 05:00:50.595558 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-var-lib-calico\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595569 kubelet[3290]: I0916 05:00:50.595567 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-xtables-lock\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595687 kubelet[3290]: I0916 05:00:50.595577 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-var-run-calico\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595687 kubelet[3290]: I0916 05:00:50.595588 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-cni-log-dir\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595687 kubelet[3290]: I0916 05:00:50.595597 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-flexvol-driver-host\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595687 kubelet[3290]: I0916 05:00:50.595606 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-tigera-ca-bundle\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595687 kubelet[3290]: I0916 05:00:50.595615 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-cni-bin-dir\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595765 kubelet[3290]: I0916 05:00:50.595625 3290 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-node-certs\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595765 kubelet[3290]: I0916 05:00:50.595634 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-lib-modules\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.595765 kubelet[3290]: I0916 05:00:50.595641 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b857d537-af93-4a8b-b14a-9c5dac0dbcdf-policysync\") pod \"calico-node-dxtrl\" (UID: \"b857d537-af93-4a8b-b14a-9c5dac0dbcdf\") " pod="calico-system/calico-node-dxtrl" Sep 16 05:00:50.606119 containerd[1918]: time="2025-09-16T05:00:50.606097242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-565644bf4b-nbmql,Uid:7bc9079b-5776-4a66-9707-d7625c0fcc36,Namespace:calico-system,Attempt:0,} returns sandbox id \"493efbb2f4d7b80e0de96d4d83a42456f24e9383ea677c2b9f8b2528e52a8401\"" Sep 16 05:00:50.606755 containerd[1918]: time="2025-09-16T05:00:50.606719719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 16 05:00:50.698985 kubelet[3290]: E0916 05:00:50.698929 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.698985 kubelet[3290]: W0916 05:00:50.698975 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.699341 kubelet[3290]: E0916 05:00:50.699098 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.703915 kubelet[3290]: E0916 05:00:50.703819 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.703915 kubelet[3290]: W0916 05:00:50.703868 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.703915 kubelet[3290]: E0916 05:00:50.703904 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.713479 kubelet[3290]: E0916 05:00:50.713393 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.713479 kubelet[3290]: W0916 05:00:50.713431 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.713479 kubelet[3290]: E0916 05:00:50.713464 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.801849 kubelet[3290]: E0916 05:00:50.801711 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:00:50.883759 kubelet[3290]: E0916 05:00:50.883701 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.883759 kubelet[3290]: W0916 05:00:50.883743 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.884074 kubelet[3290]: E0916 05:00:50.883783 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.884498 kubelet[3290]: E0916 05:00:50.884440 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.884498 kubelet[3290]: W0916 05:00:50.884475 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.884698 kubelet[3290]: E0916 05:00:50.884509 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.885083 kubelet[3290]: E0916 05:00:50.885005 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.885083 kubelet[3290]: W0916 05:00:50.885072 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.885283 kubelet[3290]: E0916 05:00:50.885107 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.885658 kubelet[3290]: E0916 05:00:50.885586 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.885658 kubelet[3290]: W0916 05:00:50.885615 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.885658 kubelet[3290]: E0916 05:00:50.885640 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.886210 kubelet[3290]: E0916 05:00:50.886136 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.886210 kubelet[3290]: W0916 05:00:50.886161 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.886210 kubelet[3290]: E0916 05:00:50.886186 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.886698 kubelet[3290]: E0916 05:00:50.886629 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.886698 kubelet[3290]: W0916 05:00:50.886653 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.886698 kubelet[3290]: E0916 05:00:50.886678 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.887187 kubelet[3290]: E0916 05:00:50.887110 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.887187 kubelet[3290]: W0916 05:00:50.887136 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.887187 kubelet[3290]: E0916 05:00:50.887158 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.887636 kubelet[3290]: E0916 05:00:50.887555 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.887636 kubelet[3290]: W0916 05:00:50.887579 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.887636 kubelet[3290]: E0916 05:00:50.887605 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.888289 kubelet[3290]: E0916 05:00:50.888207 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.888289 kubelet[3290]: W0916 05:00:50.888244 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.888289 kubelet[3290]: E0916 05:00:50.888280 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.888768 kubelet[3290]: E0916 05:00:50.888728 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.888768 kubelet[3290]: W0916 05:00:50.888757 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.889009 containerd[1918]: time="2025-09-16T05:00:50.888702224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dxtrl,Uid:b857d537-af93-4a8b-b14a-9c5dac0dbcdf,Namespace:calico-system,Attempt:0,}" Sep 16 05:00:50.889151 kubelet[3290]: E0916 05:00:50.888786 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.889282 kubelet[3290]: E0916 05:00:50.889210 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.889282 kubelet[3290]: W0916 05:00:50.889233 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.889282 kubelet[3290]: E0916 05:00:50.889258 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.889653 kubelet[3290]: E0916 05:00:50.889638 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.889653 kubelet[3290]: W0916 05:00:50.889650 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.889751 kubelet[3290]: E0916 05:00:50.889663 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.889820 kubelet[3290]: E0916 05:00:50.889812 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.889820 kubelet[3290]: W0916 05:00:50.889819 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.889863 kubelet[3290]: E0916 05:00:50.889825 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.889962 kubelet[3290]: E0916 05:00:50.889955 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.889962 kubelet[3290]: W0916 05:00:50.889961 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.890012 kubelet[3290]: E0916 05:00:50.889967 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.890082 kubelet[3290]: E0916 05:00:50.890075 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.890082 kubelet[3290]: W0916 05:00:50.890082 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.890467 kubelet[3290]: E0916 05:00:50.890088 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.890467 kubelet[3290]: E0916 05:00:50.890203 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.890467 kubelet[3290]: W0916 05:00:50.890209 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.890467 kubelet[3290]: E0916 05:00:50.890214 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.890467 kubelet[3290]: E0916 05:00:50.890322 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.890467 kubelet[3290]: W0916 05:00:50.890328 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.890467 kubelet[3290]: E0916 05:00:50.890334 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.890467 kubelet[3290]: E0916 05:00:50.890428 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.890467 kubelet[3290]: W0916 05:00:50.890434 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.890467 kubelet[3290]: E0916 05:00:50.890440 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.890686 kubelet[3290]: E0916 05:00:50.890536 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.890686 kubelet[3290]: W0916 05:00:50.890542 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.890686 kubelet[3290]: E0916 05:00:50.890547 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.890686 kubelet[3290]: E0916 05:00:50.890680 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.890686 kubelet[3290]: W0916 05:00:50.890686 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.890794 kubelet[3290]: E0916 05:00:50.890692 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.896936 kubelet[3290]: E0916 05:00:50.896917 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.896936 kubelet[3290]: W0916 05:00:50.896929 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.896936 kubelet[3290]: E0916 05:00:50.896943 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.897059 kubelet[3290]: I0916 05:00:50.896963 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50bc1259-bb30-4849-97ed-c6fb8e4bcaf9-kubelet-dir\") pod \"csi-node-driver-9dgtk\" (UID: \"50bc1259-bb30-4849-97ed-c6fb8e4bcaf9\") " pod="calico-system/csi-node-driver-9dgtk" Sep 16 05:00:50.897085 kubelet[3290]: E0916 05:00:50.897062 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.897085 kubelet[3290]: W0916 05:00:50.897069 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.897085 kubelet[3290]: E0916 05:00:50.897076 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.897132 containerd[1918]: time="2025-09-16T05:00:50.897037899Z" level=info msg="connecting to shim eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9" address="unix:///run/containerd/s/9bbd8fd931a8a53822eaa45a5f8dd06d454dc09db316bb5c21fd3eafb929b26d" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:00:50.897150 kubelet[3290]: I0916 05:00:50.897084 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw8p4\" (UniqueName: \"kubernetes.io/projected/50bc1259-bb30-4849-97ed-c6fb8e4bcaf9-kube-api-access-bw8p4\") pod \"csi-node-driver-9dgtk\" (UID: \"50bc1259-bb30-4849-97ed-c6fb8e4bcaf9\") " pod="calico-system/csi-node-driver-9dgtk" Sep 16 05:00:50.897235 kubelet[3290]: E0916 05:00:50.897203 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.897235 kubelet[3290]: W0916 05:00:50.897209 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.897235 kubelet[3290]: E0916 05:00:50.897215 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.897235 kubelet[3290]: I0916 05:00:50.897224 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/50bc1259-bb30-4849-97ed-c6fb8e4bcaf9-varrun\") pod \"csi-node-driver-9dgtk\" (UID: \"50bc1259-bb30-4849-97ed-c6fb8e4bcaf9\") " pod="calico-system/csi-node-driver-9dgtk" Sep 16 05:00:50.897367 kubelet[3290]: E0916 05:00:50.897333 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.897367 kubelet[3290]: W0916 05:00:50.897338 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.897367 kubelet[3290]: E0916 05:00:50.897344 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.897367 kubelet[3290]: I0916 05:00:50.897353 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50bc1259-bb30-4849-97ed-c6fb8e4bcaf9-registration-dir\") pod \"csi-node-driver-9dgtk\" (UID: \"50bc1259-bb30-4849-97ed-c6fb8e4bcaf9\") " pod="calico-system/csi-node-driver-9dgtk" Sep 16 05:00:50.897473 kubelet[3290]: E0916 05:00:50.897458 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.897473 kubelet[3290]: W0916 05:00:50.897464 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.897473 kubelet[3290]: E0916 05:00:50.897470 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.897533 kubelet[3290]: I0916 05:00:50.897480 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50bc1259-bb30-4849-97ed-c6fb8e4bcaf9-socket-dir\") pod \"csi-node-driver-9dgtk\" (UID: \"50bc1259-bb30-4849-97ed-c6fb8e4bcaf9\") " pod="calico-system/csi-node-driver-9dgtk" Sep 16 05:00:50.897621 kubelet[3290]: E0916 05:00:50.897598 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.897621 kubelet[3290]: W0916 05:00:50.897607 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.897621 kubelet[3290]: E0916 05:00:50.897616 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.897738 kubelet[3290]: E0916 05:00:50.897732 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.897738 kubelet[3290]: W0916 05:00:50.897737 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.897781 kubelet[3290]: E0916 05:00:50.897744 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.897842 kubelet[3290]: E0916 05:00:50.897837 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.897865 kubelet[3290]: W0916 05:00:50.897841 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.897865 kubelet[3290]: E0916 05:00:50.897851 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.897942 kubelet[3290]: E0916 05:00:50.897935 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.897963 kubelet[3290]: W0916 05:00:50.897942 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.897963 kubelet[3290]: E0916 05:00:50.897952 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.898069 kubelet[3290]: E0916 05:00:50.898063 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.898069 kubelet[3290]: W0916 05:00:50.898068 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.898107 kubelet[3290]: E0916 05:00:50.898074 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.898158 kubelet[3290]: E0916 05:00:50.898153 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.898183 kubelet[3290]: W0916 05:00:50.898159 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.898183 kubelet[3290]: E0916 05:00:50.898168 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.898284 kubelet[3290]: E0916 05:00:50.898279 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.898284 kubelet[3290]: W0916 05:00:50.898284 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.898322 kubelet[3290]: E0916 05:00:50.898290 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.898378 kubelet[3290]: E0916 05:00:50.898373 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.898398 kubelet[3290]: W0916 05:00:50.898378 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.898398 kubelet[3290]: E0916 05:00:50.898384 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.898473 kubelet[3290]: E0916 05:00:50.898468 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.898491 kubelet[3290]: W0916 05:00:50.898473 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.898491 kubelet[3290]: E0916 05:00:50.898478 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.898590 kubelet[3290]: E0916 05:00:50.898584 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.898590 kubelet[3290]: W0916 05:00:50.898589 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.898625 kubelet[3290]: E0916 05:00:50.898594 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.921200 systemd[1]: Started cri-containerd-eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9.scope - libcontainer container eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9. Sep 16 05:00:50.953312 containerd[1918]: time="2025-09-16T05:00:50.953264431Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dxtrl,Uid:b857d537-af93-4a8b-b14a-9c5dac0dbcdf,Namespace:calico-system,Attempt:0,} returns sandbox id \"eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9\"" Sep 16 05:00:50.998498 kubelet[3290]: E0916 05:00:50.998439 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.998498 kubelet[3290]: W0916 05:00:50.998462 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.998498 kubelet[3290]: E0916 05:00:50.998483 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.998845 kubelet[3290]: E0916 05:00:50.998794 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.998845 kubelet[3290]: W0916 05:00:50.998810 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.998845 kubelet[3290]: E0916 05:00:50.998830 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.999154 kubelet[3290]: E0916 05:00:50.999129 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.999154 kubelet[3290]: W0916 05:00:50.999150 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.999316 kubelet[3290]: E0916 05:00:50.999177 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:50.999468 kubelet[3290]: E0916 05:00:50.999446 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.999468 kubelet[3290]: W0916 05:00:50.999462 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.999614 kubelet[3290]: E0916 05:00:50.999486 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:50.999812 kubelet[3290]: E0916 05:00:50.999786 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:50.999812 kubelet[3290]: W0916 05:00:50.999805 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:50.999949 kubelet[3290]: E0916 05:00:50.999832 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.000131 kubelet[3290]: E0916 05:00:51.000084 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.000131 kubelet[3290]: W0916 05:00:51.000101 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.000131 kubelet[3290]: E0916 05:00:51.000120 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.000453 kubelet[3290]: E0916 05:00:51.000424 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.000453 kubelet[3290]: W0916 05:00:51.000444 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.000628 kubelet[3290]: E0916 05:00:51.000473 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.000801 kubelet[3290]: E0916 05:00:51.000780 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.000894 kubelet[3290]: W0916 05:00:51.000800 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.000894 kubelet[3290]: E0916 05:00:51.000829 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:51.001145 kubelet[3290]: E0916 05:00:51.001123 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.001145 kubelet[3290]: W0916 05:00:51.001143 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.001325 kubelet[3290]: E0916 05:00:51.001185 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.001516 kubelet[3290]: E0916 05:00:51.001494 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.001516 kubelet[3290]: W0916 05:00:51.001512 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.001674 kubelet[3290]: E0916 05:00:51.001548 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.001831 kubelet[3290]: E0916 05:00:51.001810 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.001831 kubelet[3290]: W0916 05:00:51.001827 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.001993 kubelet[3290]: E0916 05:00:51.001890 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.002143 kubelet[3290]: E0916 05:00:51.002122 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.002143 kubelet[3290]: W0916 05:00:51.002140 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.002335 kubelet[3290]: E0916 05:00:51.002204 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.002508 kubelet[3290]: E0916 05:00:51.002487 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.002508 kubelet[3290]: W0916 05:00:51.002504 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.002699 kubelet[3290]: E0916 05:00:51.002531 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:51.002851 kubelet[3290]: E0916 05:00:51.002830 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.002851 kubelet[3290]: W0916 05:00:51.002847 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.003042 kubelet[3290]: E0916 05:00:51.002873 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.003233 kubelet[3290]: E0916 05:00:51.003213 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.003233 kubelet[3290]: W0916 05:00:51.003230 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.003417 kubelet[3290]: E0916 05:00:51.003257 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.003581 kubelet[3290]: E0916 05:00:51.003560 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.003581 kubelet[3290]: W0916 05:00:51.003577 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.003745 kubelet[3290]: E0916 05:00:51.003653 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.003866 kubelet[3290]: E0916 05:00:51.003846 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.003866 kubelet[3290]: W0916 05:00:51.003863 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.004044 kubelet[3290]: E0916 05:00:51.003931 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.004159 kubelet[3290]: E0916 05:00:51.004136 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.004255 kubelet[3290]: W0916 05:00:51.004160 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.004255 kubelet[3290]: E0916 05:00:51.004235 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:51.004473 kubelet[3290]: E0916 05:00:51.004452 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.004473 kubelet[3290]: W0916 05:00:51.004468 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.004667 kubelet[3290]: E0916 05:00:51.004495 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.004825 kubelet[3290]: E0916 05:00:51.004804 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.004825 kubelet[3290]: W0916 05:00:51.004821 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.005002 kubelet[3290]: E0916 05:00:51.004849 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.005151 kubelet[3290]: E0916 05:00:51.005130 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.005151 kubelet[3290]: W0916 05:00:51.005148 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.005338 kubelet[3290]: E0916 05:00:51.005175 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.005520 kubelet[3290]: E0916 05:00:51.005493 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.005520 kubelet[3290]: W0916 05:00:51.005515 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.005744 kubelet[3290]: E0916 05:00:51.005548 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.005948 kubelet[3290]: E0916 05:00:51.005921 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.005948 kubelet[3290]: W0916 05:00:51.005942 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.006198 kubelet[3290]: E0916 05:00:51.005991 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:51.006372 kubelet[3290]: E0916 05:00:51.006345 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.006372 kubelet[3290]: W0916 05:00:51.006368 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.006593 kubelet[3290]: E0916 05:00:51.006396 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.006855 kubelet[3290]: E0916 05:00:51.006828 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.006855 kubelet[3290]: W0916 05:00:51.006849 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.007025 kubelet[3290]: E0916 05:00:51.006878 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:51.022421 kubelet[3290]: E0916 05:00:51.022379 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:51.022421 kubelet[3290]: W0916 05:00:51.022416 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:51.022748 kubelet[3290]: E0916 05:00:51.022451 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:52.185210 kubelet[3290]: E0916 05:00:52.185092 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:00:54.186143 kubelet[3290]: E0916 05:00:54.186022 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:00:54.816846 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3278608746.mount: Deactivated successfully. 
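The repeated driver-call failures above come from kubelet probing the FlexVolume plugin directory nodeagent~uds: the expected driver binary /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is not on disk yet, so the init call produces no output and the JSON unmarshal fails with "unexpected end of JSON input". As a purely illustrative sketch of the call/response shape kubelet expects (not the real Calico uds driver, which is normally dropped into that directory by the pod2daemon-flexvol init container whose image pull appears further down), a minimal FlexVolume driver only has to answer init with a small JSON status object on stdout:

    #!/usr/bin/env python3
    # Hypothetical stand-in for .../volume/exec/nodeagent~uds/uds, for illustration only.
    # kubelet invokes FlexVolume drivers as "<driver> <operation> [args...]" and parses a
    # JSON object from stdout; an empty reply is what triggers the unmarshal error above.
    import json
    import sys

    def main() -> int:
        op = sys.argv[1] if len(sys.argv) > 1 else ""
        if op == "init":
            # attach=false: this driver exposes no controller attach/detach phase.
            print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
            return 0
        # Report anything unimplemented explicitly rather than returning empty output.
        print(json.dumps({"status": "Not supported",
                          "message": "operation %r not implemented" % op}))
        return 1

    if __name__ == "__main__":
        sys.exit(main())

Once the real driver binary is installed these probe errors stop on their own; the stub above is only meant to make the failure mode visible in the log legible.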
Sep 16 05:00:55.044268 containerd[1918]: time="2025-09-16T05:00:55.044217338Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:55.044467 containerd[1918]: time="2025-09-16T05:00:55.044413295Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 16 05:00:55.044936 containerd[1918]: time="2025-09-16T05:00:55.044892093Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:55.045822 containerd[1918]: time="2025-09-16T05:00:55.045779392Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:55.046212 containerd[1918]: time="2025-09-16T05:00:55.046165794Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 4.439430398s" Sep 16 05:00:55.046212 containerd[1918]: time="2025-09-16T05:00:55.046183583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 16 05:00:55.046603 containerd[1918]: time="2025-09-16T05:00:55.046579670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 16 05:00:55.049564 containerd[1918]: time="2025-09-16T05:00:55.049542612Z" level=info msg="CreateContainer within sandbox \"493efbb2f4d7b80e0de96d4d83a42456f24e9383ea677c2b9f8b2528e52a8401\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 16 05:00:55.052321 containerd[1918]: time="2025-09-16T05:00:55.052308844Z" level=info msg="Container 0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:00:55.054978 containerd[1918]: time="2025-09-16T05:00:55.054935726Z" level=info msg="CreateContainer within sandbox \"493efbb2f4d7b80e0de96d4d83a42456f24e9383ea677c2b9f8b2528e52a8401\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55\"" Sep 16 05:00:55.055178 containerd[1918]: time="2025-09-16T05:00:55.055141958Z" level=info msg="StartContainer for \"0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55\"" Sep 16 05:00:55.055656 containerd[1918]: time="2025-09-16T05:00:55.055619698Z" level=info msg="connecting to shim 0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55" address="unix:///run/containerd/s/e0c074240008f9ab15a8060d39e89437f7e1387691f4dfdf5c9f35f595524647" protocol=ttrpc version=3 Sep 16 05:00:55.076306 systemd[1]: Started cri-containerd-0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55.scope - libcontainer container 0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55. 
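The "in 4.439430398s" figure reported above for the typha image is simply the wall-clock gap between the earlier PullImage request and the Pulled event. A small, assumption-free sanity check using the two containerd timestamps copied from the log (truncated to microseconds for datetime):

    #!/usr/bin/env python3
    # Recompute the ghcr.io/flatcar/calico/typha:v3.30.3 pull duration from the log timestamps.
    from datetime import datetime

    started = datetime.fromisoformat("2025-09-16T05:00:50.606719719"[:26])   # msg="PullImage ..." (earlier in the log)
    finished = datetime.fromisoformat("2025-09-16T05:00:55.046165794"[:26])  # msg="Pulled image ... in 4.439430398s"
    print(finished - started)  # 0:00:04.439446 -- within tens of microseconds of the reported value

The small residual difference is expected, since containerd starts its internal pull timer slightly after the PullImage message is written.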
Sep 16 05:00:55.112489 containerd[1918]: time="2025-09-16T05:00:55.112459526Z" level=info msg="StartContainer for \"0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55\" returns successfully" Sep 16 05:00:55.271741 kubelet[3290]: I0916 05:00:55.271651 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-565644bf4b-nbmql" podStartSLOduration=0.831711439 podStartE2EDuration="5.27162277s" podCreationTimestamp="2025-09-16 05:00:50 +0000 UTC" firstStartedPulling="2025-09-16 05:00:50.606615693 +0000 UTC m=+16.465144940" lastFinishedPulling="2025-09-16 05:00:55.046527023 +0000 UTC m=+20.905056271" observedRunningTime="2025-09-16 05:00:55.27121864 +0000 UTC m=+21.129747919" watchObservedRunningTime="2025-09-16 05:00:55.27162277 +0000 UTC m=+21.130152036" Sep 16 05:00:55.319150 kubelet[3290]: E0916 05:00:55.319086 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.319150 kubelet[3290]: W0916 05:00:55.319139 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.319472 kubelet[3290]: E0916 05:00:55.319187 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:55.319855 kubelet[3290]: E0916 05:00:55.319820 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.319987 kubelet[3290]: W0916 05:00:55.319856 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.319987 kubelet[3290]: E0916 05:00:55.319889 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:55.320489 kubelet[3290]: E0916 05:00:55.320453 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.320612 kubelet[3290]: W0916 05:00:55.320490 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.320612 kubelet[3290]: E0916 05:00:55.320523 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:55.321134 kubelet[3290]: E0916 05:00:55.321101 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.321134 kubelet[3290]: W0916 05:00:55.321131 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.321344 kubelet[3290]: E0916 05:00:55.321163 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:55.321767 kubelet[3290]: E0916 05:00:55.321710 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.321767 kubelet[3290]: W0916 05:00:55.321745 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.321968 kubelet[3290]: E0916 05:00:55.321778 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:55.322307 kubelet[3290]: E0916 05:00:55.322276 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.322307 kubelet[3290]: W0916 05:00:55.322305 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.322508 kubelet[3290]: E0916 05:00:55.322332 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:55.322863 kubelet[3290]: E0916 05:00:55.322833 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.322961 kubelet[3290]: W0916 05:00:55.322863 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.322961 kubelet[3290]: E0916 05:00:55.322889 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:55.323417 kubelet[3290]: E0916 05:00:55.323385 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.323544 kubelet[3290]: W0916 05:00:55.323416 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.323544 kubelet[3290]: E0916 05:00:55.323444 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:00:55.323966 kubelet[3290]: E0916 05:00:55.323938 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.324065 kubelet[3290]: W0916 05:00:55.323966 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.324065 kubelet[3290]: E0916 05:00:55.323993 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:55.324552 kubelet[3290]: E0916 05:00:55.324522 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:55.324677 kubelet[3290]: W0916 05:00:55.324552 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:55.324677 kubelet[3290]: E0916 05:00:55.324579 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[The same driver-call.go:262 / driver-call.go:149 / plugins.go:691 error triplet repeats, with only the timestamps changing, through Sep 16 05:00:55.346973.]
Sep 16 05:00:56.186110 kubelet[3290]: E0916 05:00:56.186004 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:00:56.254779 kubelet[3290]: I0916 05:00:56.254688 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
[The FlexVolume error triplet resumes at Sep 16 05:00:56.337155 and repeats, again with only the timestamps changing, up to the final occurrence kept below.]
Sep 16 05:00:56.358768 kubelet[3290]: E0916 05:00:56.358680 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:00:56.358768 kubelet[3290]: W0916 05:00:56.358708 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:00:56.358768 kubelet[3290]: E0916 05:00:56.358734 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:00:58.185615 kubelet[3290]: E0916 05:00:58.185520 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:00:59.367441 containerd[1918]: time="2025-09-16T05:00:59.367385936Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:59.367665 containerd[1918]: time="2025-09-16T05:00:59.367571774Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 16 05:00:59.367960 containerd[1918]: time="2025-09-16T05:00:59.367920568Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:59.368734 containerd[1918]: time="2025-09-16T05:00:59.368693643Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:00:59.369104 containerd[1918]: time="2025-09-16T05:00:59.369072400Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 4.322474986s" Sep 16 05:00:59.369132 containerd[1918]: time="2025-09-16T05:00:59.369104148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 16 05:00:59.369991 containerd[1918]: time="2025-09-16T05:00:59.369977818Z" level=info msg="CreateContainer within sandbox \"eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 16 05:00:59.373292 containerd[1918]: time="2025-09-16T05:00:59.373280148Z" level=info msg="Container e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:00:59.377236 containerd[1918]: time="2025-09-16T05:00:59.377222821Z" level=info msg="CreateContainer within sandbox \"eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564\"" Sep 16 05:00:59.377466 containerd[1918]: time="2025-09-16T05:00:59.377453958Z" level=info msg="StartContainer for \"e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564\"" Sep 16 05:00:59.378177 containerd[1918]: time="2025-09-16T05:00:59.378166317Z" level=info msg="connecting to shim e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564" address="unix:///run/containerd/s/9bbd8fd931a8a53822eaa45a5f8dd06d454dc09db316bb5c21fd3eafb929b26d" protocol=ttrpc version=3 Sep 16 05:00:59.402175 systemd[1]: Started 
cri-containerd-e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564.scope - libcontainer container e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564. Sep 16 05:00:59.426132 containerd[1918]: time="2025-09-16T05:00:59.426107571Z" level=info msg="StartContainer for \"e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564\" returns successfully" Sep 16 05:00:59.431109 systemd[1]: cri-containerd-e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564.scope: Deactivated successfully. Sep 16 05:00:59.432378 containerd[1918]: time="2025-09-16T05:00:59.432355427Z" level=info msg="received exit event container_id:\"e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564\" id:\"e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564\" pid:4200 exited_at:{seconds:1757998859 nanos:432168549}" Sep 16 05:00:59.432429 containerd[1918]: time="2025-09-16T05:00:59.432408734Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564\" id:\"e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564\" pid:4200 exited_at:{seconds:1757998859 nanos:432168549}" Sep 16 05:00:59.446023 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564-rootfs.mount: Deactivated successfully. Sep 16 05:01:00.186231 kubelet[3290]: E0916 05:01:00.186132 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:01:00.269109 containerd[1918]: time="2025-09-16T05:01:00.269001700Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 16 05:01:02.185666 kubelet[3290]: E0916 05:01:02.185584 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:01:04.186809 kubelet[3290]: E0916 05:01:04.186687 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:01:06.000861 kubelet[3290]: I0916 05:01:06.000758 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:01:06.185481 kubelet[3290]: E0916 05:01:06.185394 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:01:07.128609 containerd[1918]: time="2025-09-16T05:01:07.128583357Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:07.128866 containerd[1918]: time="2025-09-16T05:01:07.128826346Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 16 05:01:07.129264 containerd[1918]: time="2025-09-16T05:01:07.129250999Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:07.130137 containerd[1918]: time="2025-09-16T05:01:07.130122529Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:07.130497 containerd[1918]: time="2025-09-16T05:01:07.130483167Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 6.861386265s" Sep 16 05:01:07.130538 containerd[1918]: time="2025-09-16T05:01:07.130499508Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 16 05:01:07.131442 containerd[1918]: time="2025-09-16T05:01:07.131427748Z" level=info msg="CreateContainer within sandbox \"eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 16 05:01:07.134851 containerd[1918]: time="2025-09-16T05:01:07.134810390Z" level=info msg="Container 462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:07.138291 containerd[1918]: time="2025-09-16T05:01:07.138278019Z" level=info msg="CreateContainer within sandbox \"eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e\"" Sep 16 05:01:07.138524 containerd[1918]: time="2025-09-16T05:01:07.138482160Z" level=info msg="StartContainer for \"462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e\"" Sep 16 05:01:07.139495 containerd[1918]: time="2025-09-16T05:01:07.139454042Z" level=info msg="connecting to shim 462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e" address="unix:///run/containerd/s/9bbd8fd931a8a53822eaa45a5f8dd06d454dc09db316bb5c21fd3eafb929b26d" protocol=ttrpc version=3 Sep 16 05:01:07.161241 systemd[1]: Started cri-containerd-462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e.scope - libcontainer container 462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e. Sep 16 05:01:07.181616 containerd[1918]: time="2025-09-16T05:01:07.181593746Z" level=info msg="StartContainer for \"462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e\" returns successfully" Sep 16 05:01:07.770520 containerd[1918]: time="2025-09-16T05:01:07.770487822Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 05:01:07.771574 systemd[1]: cri-containerd-462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e.scope: Deactivated successfully. 
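For context on the "no network config found in /etc/cni/net.d: cni plugin not initialized" error above: the install-cni container is what eventually writes the CNI network configuration (and the /etc/cni/net.d/calico-kubeconfig file named in the fs change event) onto the host, and until a conflist exists there, sandbox networking keeps failing with "cni plugin not initialized". A rough, illustrative sketch of the kind of conflist Calico writes; the file name and field values are assumptions, not taken from this host:

    /etc/cni/net.d/10-calico.conflist (illustrative):
    {
      "name": "k8s-pod-network",
      "cniVersion": "0.3.1",
      "plugins": [
        {
          "type": "calico",
          "datastore_type": "kubernetes",
          "ipam": { "type": "calico-ipam" },
          "policy": { "type": "k8s" },
          "kubernetes": { "kubeconfig": "/etc/cni/net.d/calico-kubeconfig" }
        },
        { "type": "portmap", "snat": true, "capabilities": { "portMappings": true } }
      ]
    }
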
Sep 16 05:01:07.771758 systemd[1]: cri-containerd-462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e.scope: Consumed 372ms CPU time, 193.4M memory peak, 171.3M written to disk. Sep 16 05:01:07.772347 containerd[1918]: time="2025-09-16T05:01:07.772328877Z" level=info msg="received exit event container_id:\"462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e\" id:\"462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e\" pid:4264 exited_at:{seconds:1757998867 nanos:772150177}" Sep 16 05:01:07.772414 containerd[1918]: time="2025-09-16T05:01:07.772335756Z" level=info msg="TaskExit event in podsandbox handler container_id:\"462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e\" id:\"462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e\" pid:4264 exited_at:{seconds:1757998867 nanos:772150177}" Sep 16 05:01:07.787145 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e-rootfs.mount: Deactivated successfully. Sep 16 05:01:07.875256 kubelet[3290]: I0916 05:01:07.875183 3290 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 16 05:01:07.928204 systemd[1]: Created slice kubepods-burstable-pod8b8410d9_6f65_4188_9142_9e6482b4999b.slice - libcontainer container kubepods-burstable-pod8b8410d9_6f65_4188_9142_9e6482b4999b.slice. Sep 16 05:01:07.935013 systemd[1]: Created slice kubepods-besteffort-pod767fa079_891d_454b_9bcf_96567c6c98e4.slice - libcontainer container kubepods-besteffort-pod767fa079_891d_454b_9bcf_96567c6c98e4.slice. Sep 16 05:01:07.940329 systemd[1]: Created slice kubepods-burstable-podb48014a9_760b_4104_b099_afc0fb5f5cd5.slice - libcontainer container kubepods-burstable-podb48014a9_760b_4104_b099_afc0fb5f5cd5.slice. 
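The long run of driver-call.go "unexpected end of JSON input" errors earlier in this log comes from the kubelet probing the FlexVolume directory nodeagent~uds before the flexvol-driver container above has installed an executable there: the kubelet runs "<driver> init" and expects a small JSON status object on stdout, so an empty reply fails to unmarshal. A minimal sketch of that calling convention, written in Python purely for illustration (the real driver installed by pod2daemon-flexvol is a different binary):

    #!/usr/bin/env python3
    # Illustrative FlexVolume driver skeleton -- NOT the real Calico "uds" binary.
    # The kubelet invokes the driver as "<driver> init" (and later mount/unmount)
    # and parses a JSON object from stdout; an empty stdout is exactly what
    # produces the "unexpected end of JSON input" unmarshal errors seen above.
    import json
    import sys

    def main() -> int:
        op = sys.argv[1] if len(sys.argv) > 1 else ""
        if op == "init":
            # Report success; attach/detach support is not needed for this driver.
            print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
            return 0
        # Every other operation is reported as unsupported in this sketch.
        print(json.dumps({"status": "Not supported", "message": "operation '%s' not implemented" % op}))
        return 1

    if __name__ == "__main__":
        sys.exit(main())
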
Sep 16 05:01:07.944110 kubelet[3290]: I0916 05:01:07.944067 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ac107799-03db-4410-b2fb-2b6d2c05b1d8-calico-apiserver-certs\") pod \"calico-apiserver-75499dcbcd-phsx9\" (UID: \"ac107799-03db-4410-b2fb-2b6d2c05b1d8\") " pod="calico-apiserver/calico-apiserver-75499dcbcd-phsx9" Sep 16 05:01:07.944110 kubelet[3290]: I0916 05:01:07.944098 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmftl\" (UniqueName: \"kubernetes.io/projected/bece35df-f07e-4c73-80a2-0b5b3228348f-kube-api-access-cmftl\") pod \"calico-kube-controllers-89f5b8df-cq9w2\" (UID: \"bece35df-f07e-4c73-80a2-0b5b3228348f\") " pod="calico-system/calico-kube-controllers-89f5b8df-cq9w2" Sep 16 05:01:07.958103 kubelet[3290]: I0916 05:01:07.944118 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b48014a9-760b-4104-b099-afc0fb5f5cd5-config-volume\") pod \"coredns-7c65d6cfc9-tgnt6\" (UID: \"b48014a9-760b-4104-b099-afc0fb5f5cd5\") " pod="kube-system/coredns-7c65d6cfc9-tgnt6" Sep 16 05:01:07.958103 kubelet[3290]: I0916 05:01:07.944366 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/767fa079-891d-454b-9bcf-96567c6c98e4-config\") pod \"goldmane-7988f88666-7wtzg\" (UID: \"767fa079-891d-454b-9bcf-96567c6c98e4\") " pod="calico-system/goldmane-7988f88666-7wtzg" Sep 16 05:01:07.958103 kubelet[3290]: I0916 05:01:07.944403 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9m9d\" (UniqueName: \"kubernetes.io/projected/2aca797e-97f5-4c40-8082-71bf715e6951-kube-api-access-j9m9d\") pod \"calico-apiserver-75499dcbcd-cz8mk\" (UID: \"2aca797e-97f5-4c40-8082-71bf715e6951\") " pod="calico-apiserver/calico-apiserver-75499dcbcd-cz8mk" Sep 16 05:01:07.958103 kubelet[3290]: I0916 05:01:07.944431 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slxp5\" (UniqueName: \"kubernetes.io/projected/9bee220d-c0ad-4dc5-b18f-54982efd83d3-kube-api-access-slxp5\") pod \"whisker-5fbf7c7878-rvlwq\" (UID: \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\") " pod="calico-system/whisker-5fbf7c7878-rvlwq" Sep 16 05:01:07.958103 kubelet[3290]: I0916 05:01:07.944539 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl44n\" (UniqueName: \"kubernetes.io/projected/b48014a9-760b-4104-b099-afc0fb5f5cd5-kube-api-access-cl44n\") pod \"coredns-7c65d6cfc9-tgnt6\" (UID: \"b48014a9-760b-4104-b099-afc0fb5f5cd5\") " pod="kube-system/coredns-7c65d6cfc9-tgnt6" Sep 16 05:01:07.944894 systemd[1]: Created slice kubepods-besteffort-podbece35df_f07e_4c73_80a2_0b5b3228348f.slice - libcontainer container kubepods-besteffort-podbece35df_f07e_4c73_80a2_0b5b3228348f.slice. 
Sep 16 05:01:07.958360 kubelet[3290]: I0916 05:01:07.944600 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qztlb\" (UniqueName: \"kubernetes.io/projected/8b8410d9-6f65-4188-9142-9e6482b4999b-kube-api-access-qztlb\") pod \"coredns-7c65d6cfc9-xdnnv\" (UID: \"8b8410d9-6f65-4188-9142-9e6482b4999b\") " pod="kube-system/coredns-7c65d6cfc9-xdnnv" Sep 16 05:01:07.958360 kubelet[3290]: I0916 05:01:07.944632 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bece35df-f07e-4c73-80a2-0b5b3228348f-tigera-ca-bundle\") pod \"calico-kube-controllers-89f5b8df-cq9w2\" (UID: \"bece35df-f07e-4c73-80a2-0b5b3228348f\") " pod="calico-system/calico-kube-controllers-89f5b8df-cq9w2" Sep 16 05:01:07.958360 kubelet[3290]: I0916 05:01:07.944651 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/767fa079-891d-454b-9bcf-96567c6c98e4-goldmane-ca-bundle\") pod \"goldmane-7988f88666-7wtzg\" (UID: \"767fa079-891d-454b-9bcf-96567c6c98e4\") " pod="calico-system/goldmane-7988f88666-7wtzg" Sep 16 05:01:07.958360 kubelet[3290]: I0916 05:01:07.944671 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/767fa079-891d-454b-9bcf-96567c6c98e4-goldmane-key-pair\") pod \"goldmane-7988f88666-7wtzg\" (UID: \"767fa079-891d-454b-9bcf-96567c6c98e4\") " pod="calico-system/goldmane-7988f88666-7wtzg" Sep 16 05:01:07.958360 kubelet[3290]: I0916 05:01:07.944726 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxpnf\" (UniqueName: \"kubernetes.io/projected/767fa079-891d-454b-9bcf-96567c6c98e4-kube-api-access-gxpnf\") pod \"goldmane-7988f88666-7wtzg\" (UID: \"767fa079-891d-454b-9bcf-96567c6c98e4\") " pod="calico-system/goldmane-7988f88666-7wtzg" Sep 16 05:01:07.948939 systemd[1]: Created slice kubepods-besteffort-pod2aca797e_97f5_4c40_8082_71bf715e6951.slice - libcontainer container kubepods-besteffort-pod2aca797e_97f5_4c40_8082_71bf715e6951.slice. 
Sep 16 05:01:07.958542 kubelet[3290]: I0916 05:01:07.944800 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9qqq\" (UniqueName: \"kubernetes.io/projected/ac107799-03db-4410-b2fb-2b6d2c05b1d8-kube-api-access-g9qqq\") pod \"calico-apiserver-75499dcbcd-phsx9\" (UID: \"ac107799-03db-4410-b2fb-2b6d2c05b1d8\") " pod="calico-apiserver/calico-apiserver-75499dcbcd-phsx9" Sep 16 05:01:07.958542 kubelet[3290]: I0916 05:01:07.944865 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2aca797e-97f5-4c40-8082-71bf715e6951-calico-apiserver-certs\") pod \"calico-apiserver-75499dcbcd-cz8mk\" (UID: \"2aca797e-97f5-4c40-8082-71bf715e6951\") " pod="calico-apiserver/calico-apiserver-75499dcbcd-cz8mk" Sep 16 05:01:07.958542 kubelet[3290]: I0916 05:01:07.944906 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9bee220d-c0ad-4dc5-b18f-54982efd83d3-whisker-backend-key-pair\") pod \"whisker-5fbf7c7878-rvlwq\" (UID: \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\") " pod="calico-system/whisker-5fbf7c7878-rvlwq" Sep 16 05:01:07.958542 kubelet[3290]: I0916 05:01:07.944943 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bee220d-c0ad-4dc5-b18f-54982efd83d3-whisker-ca-bundle\") pod \"whisker-5fbf7c7878-rvlwq\" (UID: \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\") " pod="calico-system/whisker-5fbf7c7878-rvlwq" Sep 16 05:01:07.958542 kubelet[3290]: I0916 05:01:07.944982 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b8410d9-6f65-4188-9142-9e6482b4999b-config-volume\") pod \"coredns-7c65d6cfc9-xdnnv\" (UID: \"8b8410d9-6f65-4188-9142-9e6482b4999b\") " pod="kube-system/coredns-7c65d6cfc9-xdnnv" Sep 16 05:01:07.952584 systemd[1]: Created slice kubepods-besteffort-podac107799_03db_4410_b2fb_2b6d2c05b1d8.slice - libcontainer container kubepods-besteffort-podac107799_03db_4410_b2fb_2b6d2c05b1d8.slice. Sep 16 05:01:07.955302 systemd[1]: Created slice kubepods-besteffort-pod9bee220d_c0ad_4dc5_b18f_54982efd83d3.slice - libcontainer container kubepods-besteffort-pod9bee220d_c0ad_4dc5_b18f_54982efd83d3.slice. Sep 16 05:01:08.200422 systemd[1]: Created slice kubepods-besteffort-pod50bc1259_bb30_4849_97ed_c6fb8e4bcaf9.slice - libcontainer container kubepods-besteffort-pod50bc1259_bb30_4849_97ed_c6fb8e4bcaf9.slice. 
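A brief aside on the kubepods-*.slice units being created above: with the systemd cgroup driver, each pod gets a slice whose name encodes its QoS class and UID, with the dashes in the UID replaced by underscores. A small illustrative check in Python, using the csi-node-driver pod UID that appears throughout this log:

    # Illustrative: how the kubepods-*.slice names above map to pod UIDs under the
    # systemd cgroup driver. The QoS class is encoded in the prefix ("besteffort",
    # "burstable"; Guaranteed pods use the bare "kubepods" prefix) and dashes in
    # the pod UID become underscores.
    def pod_slice_name(pod_uid: str, qos: str = "besteffort") -> str:
        prefix = "kubepods" if qos == "guaranteed" else "kubepods-" + qos
        return prefix + "-pod" + pod_uid.replace("-", "_") + ".slice"

    # The csi-node-driver pod UID seen throughout this log:
    print(pod_slice_name("50bc1259-bb30-4849-97ed-c6fb8e4bcaf9"))
    # -> kubepods-besteffort-pod50bc1259_bb30_4849_97ed_c6fb8e4bcaf9.slice
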
Sep 16 05:01:08.206289 containerd[1918]: time="2025-09-16T05:01:08.206177035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9dgtk,Uid:50bc1259-bb30-4849-97ed-c6fb8e4bcaf9,Namespace:calico-system,Attempt:0,}" Sep 16 05:01:08.231948 containerd[1918]: time="2025-09-16T05:01:08.231918890Z" level=error msg="Failed to destroy network for sandbox \"484ccaeef5f855a85b224c916736166d3b894e505b73fde3adb9867b9534cc34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.232492 containerd[1918]: time="2025-09-16T05:01:08.232450162Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9dgtk,Uid:50bc1259-bb30-4849-97ed-c6fb8e4bcaf9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"484ccaeef5f855a85b224c916736166d3b894e505b73fde3adb9867b9534cc34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.232655 kubelet[3290]: E0916 05:01:08.232606 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"484ccaeef5f855a85b224c916736166d3b894e505b73fde3adb9867b9534cc34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.232691 kubelet[3290]: E0916 05:01:08.232652 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"484ccaeef5f855a85b224c916736166d3b894e505b73fde3adb9867b9534cc34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9dgtk" Sep 16 05:01:08.232691 kubelet[3290]: E0916 05:01:08.232670 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"484ccaeef5f855a85b224c916736166d3b894e505b73fde3adb9867b9534cc34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9dgtk" Sep 16 05:01:08.232728 kubelet[3290]: E0916 05:01:08.232700 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9dgtk_calico-system(50bc1259-bb30-4849-97ed-c6fb8e4bcaf9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9dgtk_calico-system(50bc1259-bb30-4849-97ed-c6fb8e4bcaf9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"484ccaeef5f855a85b224c916736166d3b894e505b73fde3adb9867b9534cc34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9dgtk" podUID="50bc1259-bb30-4849-97ed-c6fb8e4bcaf9" Sep 16 05:01:08.232960 containerd[1918]: time="2025-09-16T05:01:08.232945629Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7c65d6cfc9-xdnnv,Uid:8b8410d9-6f65-4188-9142-9e6482b4999b,Namespace:kube-system,Attempt:0,}" Sep 16 05:01:08.233382 systemd[1]: run-netns-cni\x2dd38eb74a\x2d1a89\x2d8d1c\x2dba4d\x2db87eb9efb082.mount: Deactivated successfully. Sep 16 05:01:08.256805 containerd[1918]: time="2025-09-16T05:01:08.256774814Z" level=error msg="Failed to destroy network for sandbox \"481c86bc1ab7fb256e6f3151bcea60e0b21c8eaed3e49051572a40bebeb0ee81\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.257353 containerd[1918]: time="2025-09-16T05:01:08.257304977Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xdnnv,Uid:8b8410d9-6f65-4188-9142-9e6482b4999b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"481c86bc1ab7fb256e6f3151bcea60e0b21c8eaed3e49051572a40bebeb0ee81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.257477 kubelet[3290]: E0916 05:01:08.257453 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"481c86bc1ab7fb256e6f3151bcea60e0b21c8eaed3e49051572a40bebeb0ee81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.257525 kubelet[3290]: E0916 05:01:08.257490 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"481c86bc1ab7fb256e6f3151bcea60e0b21c8eaed3e49051572a40bebeb0ee81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xdnnv" Sep 16 05:01:08.257525 kubelet[3290]: E0916 05:01:08.257504 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"481c86bc1ab7fb256e6f3151bcea60e0b21c8eaed3e49051572a40bebeb0ee81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xdnnv" Sep 16 05:01:08.257561 kubelet[3290]: E0916 05:01:08.257529 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-xdnnv_kube-system(8b8410d9-6f65-4188-9142-9e6482b4999b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-xdnnv_kube-system(8b8410d9-6f65-4188-9142-9e6482b4999b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"481c86bc1ab7fb256e6f3151bcea60e0b21c8eaed3e49051572a40bebeb0ee81\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-xdnnv" podUID="8b8410d9-6f65-4188-9142-9e6482b4999b" Sep 16 05:01:08.258382 containerd[1918]: time="2025-09-16T05:01:08.258368550Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-tgnt6,Uid:b48014a9-760b-4104-b099-afc0fb5f5cd5,Namespace:kube-system,Attempt:0,}" Sep 16 05:01:08.258443 containerd[1918]: time="2025-09-16T05:01:08.258371691Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-89f5b8df-cq9w2,Uid:bece35df-f07e-4c73-80a2-0b5b3228348f,Namespace:calico-system,Attempt:0,}" Sep 16 05:01:08.258476 containerd[1918]: time="2025-09-16T05:01:08.258451928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fbf7c7878-rvlwq,Uid:9bee220d-c0ad-4dc5-b18f-54982efd83d3,Namespace:calico-system,Attempt:0,}" Sep 16 05:01:08.258509 containerd[1918]: time="2025-09-16T05:01:08.258495901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-7wtzg,Uid:767fa079-891d-454b-9bcf-96567c6c98e4,Namespace:calico-system,Attempt:0,}" Sep 16 05:01:08.258532 containerd[1918]: time="2025-09-16T05:01:08.258520320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75499dcbcd-phsx9,Uid:ac107799-03db-4410-b2fb-2b6d2c05b1d8,Namespace:calico-apiserver,Attempt:0,}" Sep 16 05:01:08.258554 containerd[1918]: time="2025-09-16T05:01:08.258502917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75499dcbcd-cz8mk,Uid:2aca797e-97f5-4c40-8082-71bf715e6951,Namespace:calico-apiserver,Attempt:0,}" Sep 16 05:01:08.287069 containerd[1918]: time="2025-09-16T05:01:08.287023780Z" level=error msg="Failed to destroy network for sandbox \"dd9570994166e46ee2478026aae02e34ee3b77134330484b825e4900db2e0954\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.287452 containerd[1918]: time="2025-09-16T05:01:08.287435662Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-tgnt6,Uid:b48014a9-760b-4104-b099-afc0fb5f5cd5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd9570994166e46ee2478026aae02e34ee3b77134330484b825e4900db2e0954\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.287552 containerd[1918]: time="2025-09-16T05:01:08.287532678Z" level=error msg="Failed to destroy network for sandbox \"8e298e3a04d7f045658d57974e9aebb841c29742d1c784e9805aa8cb5fb84975\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.287608 kubelet[3290]: E0916 05:01:08.287582 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd9570994166e46ee2478026aae02e34ee3b77134330484b825e4900db2e0954\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.287651 kubelet[3290]: E0916 05:01:08.287627 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd9570994166e46ee2478026aae02e34ee3b77134330484b825e4900db2e0954\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-tgnt6" Sep 16 05:01:08.287651 kubelet[3290]: E0916 05:01:08.287641 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd9570994166e46ee2478026aae02e34ee3b77134330484b825e4900db2e0954\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-tgnt6" Sep 16 05:01:08.287705 kubelet[3290]: E0916 05:01:08.287668 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-tgnt6_kube-system(b48014a9-760b-4104-b099-afc0fb5f5cd5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-tgnt6_kube-system(b48014a9-760b-4104-b099-afc0fb5f5cd5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd9570994166e46ee2478026aae02e34ee3b77134330484b825e4900db2e0954\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-tgnt6" podUID="b48014a9-760b-4104-b099-afc0fb5f5cd5" Sep 16 05:01:08.287930 containerd[1918]: time="2025-09-16T05:01:08.287912966Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fbf7c7878-rvlwq,Uid:9bee220d-c0ad-4dc5-b18f-54982efd83d3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e298e3a04d7f045658d57974e9aebb841c29742d1c784e9805aa8cb5fb84975\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.288006 kubelet[3290]: E0916 05:01:08.287993 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e298e3a04d7f045658d57974e9aebb841c29742d1c784e9805aa8cb5fb84975\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.288028 kubelet[3290]: E0916 05:01:08.288013 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e298e3a04d7f045658d57974e9aebb841c29742d1c784e9805aa8cb5fb84975\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5fbf7c7878-rvlwq" Sep 16 05:01:08.288028 kubelet[3290]: E0916 05:01:08.288025 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e298e3a04d7f045658d57974e9aebb841c29742d1c784e9805aa8cb5fb84975\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5fbf7c7878-rvlwq" Sep 16 05:01:08.288075 kubelet[3290]: E0916 05:01:08.288054 3290 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"whisker-5fbf7c7878-rvlwq_calico-system(9bee220d-c0ad-4dc5-b18f-54982efd83d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5fbf7c7878-rvlwq_calico-system(9bee220d-c0ad-4dc5-b18f-54982efd83d3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e298e3a04d7f045658d57974e9aebb841c29742d1c784e9805aa8cb5fb84975\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5fbf7c7878-rvlwq" podUID="9bee220d-c0ad-4dc5-b18f-54982efd83d3" Sep 16 05:01:08.288739 containerd[1918]: time="2025-09-16T05:01:08.288723344Z" level=error msg="Failed to destroy network for sandbox \"583def49f1434935f64d7c8cd06c02de71b65a730e385130a3fda8edacf0fceb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.289096 containerd[1918]: time="2025-09-16T05:01:08.289078850Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-7wtzg,Uid:767fa079-891d-454b-9bcf-96567c6c98e4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"583def49f1434935f64d7c8cd06c02de71b65a730e385130a3fda8edacf0fceb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.289189 kubelet[3290]: E0916 05:01:08.289173 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"583def49f1434935f64d7c8cd06c02de71b65a730e385130a3fda8edacf0fceb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.289231 kubelet[3290]: E0916 05:01:08.289195 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"583def49f1434935f64d7c8cd06c02de71b65a730e385130a3fda8edacf0fceb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-7wtzg" Sep 16 05:01:08.289231 kubelet[3290]: E0916 05:01:08.289208 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"583def49f1434935f64d7c8cd06c02de71b65a730e385130a3fda8edacf0fceb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-7wtzg" Sep 16 05:01:08.289294 kubelet[3290]: E0916 05:01:08.289227 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-7wtzg_calico-system(767fa079-891d-454b-9bcf-96567c6c98e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-7wtzg_calico-system(767fa079-891d-454b-9bcf-96567c6c98e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"583def49f1434935f64d7c8cd06c02de71b65a730e385130a3fda8edacf0fceb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-7wtzg" podUID="767fa079-891d-454b-9bcf-96567c6c98e4" Sep 16 05:01:08.289682 containerd[1918]: time="2025-09-16T05:01:08.289666788Z" level=error msg="Failed to destroy network for sandbox \"c39983d4adf5db0d724716100d3be4e2d194e9cf26ecb32ceb6ca71c86f006a8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.290059 containerd[1918]: time="2025-09-16T05:01:08.290032632Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75499dcbcd-phsx9,Uid:ac107799-03db-4410-b2fb-2b6d2c05b1d8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c39983d4adf5db0d724716100d3be4e2d194e9cf26ecb32ceb6ca71c86f006a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.290123 kubelet[3290]: E0916 05:01:08.290110 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c39983d4adf5db0d724716100d3be4e2d194e9cf26ecb32ceb6ca71c86f006a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.290153 kubelet[3290]: E0916 05:01:08.290134 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c39983d4adf5db0d724716100d3be4e2d194e9cf26ecb32ceb6ca71c86f006a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75499dcbcd-phsx9" Sep 16 05:01:08.290153 kubelet[3290]: E0916 05:01:08.290144 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c39983d4adf5db0d724716100d3be4e2d194e9cf26ecb32ceb6ca71c86f006a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75499dcbcd-phsx9" Sep 16 05:01:08.290192 kubelet[3290]: E0916 05:01:08.290163 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75499dcbcd-phsx9_calico-apiserver(ac107799-03db-4410-b2fb-2b6d2c05b1d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75499dcbcd-phsx9_calico-apiserver(ac107799-03db-4410-b2fb-2b6d2c05b1d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c39983d4adf5db0d724716100d3be4e2d194e9cf26ecb32ceb6ca71c86f006a8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-75499dcbcd-phsx9" podUID="ac107799-03db-4410-b2fb-2b6d2c05b1d8" Sep 16 05:01:08.290612 containerd[1918]: time="2025-09-16T05:01:08.290595085Z" level=error msg="Failed to destroy network for sandbox \"a82a0462f3e31b270b5d8e37267d1e2258bdf02fbb57849eb06d00df84cbe1ea\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.290992 containerd[1918]: time="2025-09-16T05:01:08.290973908Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-89f5b8df-cq9w2,Uid:bece35df-f07e-4c73-80a2-0b5b3228348f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a82a0462f3e31b270b5d8e37267d1e2258bdf02fbb57849eb06d00df84cbe1ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.291072 kubelet[3290]: E0916 05:01:08.291055 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a82a0462f3e31b270b5d8e37267d1e2258bdf02fbb57849eb06d00df84cbe1ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.291109 kubelet[3290]: E0916 05:01:08.291079 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a82a0462f3e31b270b5d8e37267d1e2258bdf02fbb57849eb06d00df84cbe1ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-89f5b8df-cq9w2" Sep 16 05:01:08.291109 kubelet[3290]: E0916 05:01:08.291090 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a82a0462f3e31b270b5d8e37267d1e2258bdf02fbb57849eb06d00df84cbe1ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-89f5b8df-cq9w2" Sep 16 05:01:08.291152 kubelet[3290]: E0916 05:01:08.291114 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-89f5b8df-cq9w2_calico-system(bece35df-f07e-4c73-80a2-0b5b3228348f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-89f5b8df-cq9w2_calico-system(bece35df-f07e-4c73-80a2-0b5b3228348f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a82a0462f3e31b270b5d8e37267d1e2258bdf02fbb57849eb06d00df84cbe1ea\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-89f5b8df-cq9w2" podUID="bece35df-f07e-4c73-80a2-0b5b3228348f" Sep 16 05:01:08.293210 containerd[1918]: time="2025-09-16T05:01:08.293191797Z" level=error msg="Failed to destroy network for sandbox 
\"0734f62edca9f8661c34e483eef785646d5561067736c19af8d3c876d7aa37c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.293591 containerd[1918]: time="2025-09-16T05:01:08.293572099Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75499dcbcd-cz8mk,Uid:2aca797e-97f5-4c40-8082-71bf715e6951,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0734f62edca9f8661c34e483eef785646d5561067736c19af8d3c876d7aa37c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.293656 kubelet[3290]: E0916 05:01:08.293643 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0734f62edca9f8661c34e483eef785646d5561067736c19af8d3c876d7aa37c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:01:08.293685 kubelet[3290]: E0916 05:01:08.293665 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0734f62edca9f8661c34e483eef785646d5561067736c19af8d3c876d7aa37c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75499dcbcd-cz8mk" Sep 16 05:01:08.293685 kubelet[3290]: E0916 05:01:08.293677 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0734f62edca9f8661c34e483eef785646d5561067736c19af8d3c876d7aa37c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75499dcbcd-cz8mk" Sep 16 05:01:08.293723 kubelet[3290]: E0916 05:01:08.293703 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75499dcbcd-cz8mk_calico-apiserver(2aca797e-97f5-4c40-8082-71bf715e6951)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75499dcbcd-cz8mk_calico-apiserver(2aca797e-97f5-4c40-8082-71bf715e6951)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0734f62edca9f8661c34e483eef785646d5561067736c19af8d3c876d7aa37c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-75499dcbcd-cz8mk" podUID="2aca797e-97f5-4c40-8082-71bf715e6951" Sep 16 05:01:08.293812 containerd[1918]: time="2025-09-16T05:01:08.293803349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 16 05:01:09.151209 systemd[1]: run-netns-cni\x2deb236b32\x2d0271\x2df75a\x2d3b7e\x2dc74ad0b61f32.mount: Deactivated successfully. Sep 16 05:01:14.168038 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1058109356.mount: Deactivated successfully. 
Sep 16 05:01:14.185421 containerd[1918]: time="2025-09-16T05:01:14.185373254Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:14.185587 containerd[1918]: time="2025-09-16T05:01:14.185558954Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 16 05:01:14.185970 containerd[1918]: time="2025-09-16T05:01:14.185928256Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:14.186738 containerd[1918]: time="2025-09-16T05:01:14.186697298Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:14.187118 containerd[1918]: time="2025-09-16T05:01:14.187067982Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.893250448s" Sep 16 05:01:14.187118 containerd[1918]: time="2025-09-16T05:01:14.187082273Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 16 05:01:14.190624 containerd[1918]: time="2025-09-16T05:01:14.190578691Z" level=info msg="CreateContainer within sandbox \"eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 16 05:01:14.194950 containerd[1918]: time="2025-09-16T05:01:14.194905938Z" level=info msg="Container 8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:14.199228 containerd[1918]: time="2025-09-16T05:01:14.199187129Z" level=info msg="CreateContainer within sandbox \"eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\"" Sep 16 05:01:14.199499 containerd[1918]: time="2025-09-16T05:01:14.199480294Z" level=info msg="StartContainer for \"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\"" Sep 16 05:01:14.200255 containerd[1918]: time="2025-09-16T05:01:14.200218698Z" level=info msg="connecting to shim 8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247" address="unix:///run/containerd/s/9bbd8fd931a8a53822eaa45a5f8dd06d454dc09db316bb5c21fd3eafb929b26d" protocol=ttrpc version=3 Sep 16 05:01:14.220308 systemd[1]: Started cri-containerd-8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247.scope - libcontainer container 8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247. Sep 16 05:01:14.245128 containerd[1918]: time="2025-09-16T05:01:14.245076670Z" level=info msg="StartContainer for \"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" returns successfully" Sep 16 05:01:14.307568 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 16 05:01:14.307627 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Sep 16 05:01:14.326060 kubelet[3290]: I0916 05:01:14.325929 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-dxtrl" podStartSLOduration=1.092307596 podStartE2EDuration="24.325915398s" podCreationTimestamp="2025-09-16 05:00:50 +0000 UTC" firstStartedPulling="2025-09-16 05:00:50.953826958 +0000 UTC m=+16.812356205" lastFinishedPulling="2025-09-16 05:01:14.187434761 +0000 UTC m=+40.045964007" observedRunningTime="2025-09-16 05:01:14.325632848 +0000 UTC m=+40.184162098" watchObservedRunningTime="2025-09-16 05:01:14.325915398 +0000 UTC m=+40.184444642" Sep 16 05:01:14.366178 containerd[1918]: time="2025-09-16T05:01:14.366150520Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"f0141ea8bcf059b278d30f44791f9202336894f2bbece4265175a49a57032b88\" pid:4738 exit_status:1 exited_at:{seconds:1757998874 nanos:365942637}" Sep 16 05:01:14.386340 kubelet[3290]: I0916 05:01:14.386314 3290 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9bee220d-c0ad-4dc5-b18f-54982efd83d3-whisker-backend-key-pair\") pod \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\" (UID: \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\") " Sep 16 05:01:14.386340 kubelet[3290]: I0916 05:01:14.386342 3290 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bee220d-c0ad-4dc5-b18f-54982efd83d3-whisker-ca-bundle\") pod \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\" (UID: \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\") " Sep 16 05:01:14.386431 kubelet[3290]: I0916 05:01:14.386355 3290 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slxp5\" (UniqueName: \"kubernetes.io/projected/9bee220d-c0ad-4dc5-b18f-54982efd83d3-kube-api-access-slxp5\") pod \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\" (UID: \"9bee220d-c0ad-4dc5-b18f-54982efd83d3\") " Sep 16 05:01:14.386591 kubelet[3290]: I0916 05:01:14.386571 3290 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bee220d-c0ad-4dc5-b18f-54982efd83d3-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "9bee220d-c0ad-4dc5-b18f-54982efd83d3" (UID: "9bee220d-c0ad-4dc5-b18f-54982efd83d3"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 16 05:01:14.387871 kubelet[3290]: I0916 05:01:14.387856 3290 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bee220d-c0ad-4dc5-b18f-54982efd83d3-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "9bee220d-c0ad-4dc5-b18f-54982efd83d3" (UID: "9bee220d-c0ad-4dc5-b18f-54982efd83d3"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 16 05:01:14.387902 kubelet[3290]: I0916 05:01:14.387869 3290 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bee220d-c0ad-4dc5-b18f-54982efd83d3-kube-api-access-slxp5" (OuterVolumeSpecName: "kube-api-access-slxp5") pod "9bee220d-c0ad-4dc5-b18f-54982efd83d3" (UID: "9bee220d-c0ad-4dc5-b18f-54982efd83d3"). InnerVolumeSpecName "kube-api-access-slxp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 16 05:01:14.487683 kubelet[3290]: I0916 05:01:14.487480 3290 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9bee220d-c0ad-4dc5-b18f-54982efd83d3-whisker-backend-key-pair\") on node \"ci-4459.0.0-n-32926c0571\" DevicePath \"\"" Sep 16 05:01:14.487683 kubelet[3290]: I0916 05:01:14.487559 3290 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bee220d-c0ad-4dc5-b18f-54982efd83d3-whisker-ca-bundle\") on node \"ci-4459.0.0-n-32926c0571\" DevicePath \"\"" Sep 16 05:01:14.487683 kubelet[3290]: I0916 05:01:14.487593 3290 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slxp5\" (UniqueName: \"kubernetes.io/projected/9bee220d-c0ad-4dc5-b18f-54982efd83d3-kube-api-access-slxp5\") on node \"ci-4459.0.0-n-32926c0571\" DevicePath \"\"" Sep 16 05:01:15.173237 systemd[1]: var-lib-kubelet-pods-9bee220d\x2dc0ad\x2d4dc5\x2db18f\x2d54982efd83d3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dslxp5.mount: Deactivated successfully. Sep 16 05:01:15.173311 systemd[1]: var-lib-kubelet-pods-9bee220d\x2dc0ad\x2d4dc5\x2db18f\x2d54982efd83d3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 16 05:01:15.326255 systemd[1]: Removed slice kubepods-besteffort-pod9bee220d_c0ad_4dc5_b18f_54982efd83d3.slice - libcontainer container kubepods-besteffort-pod9bee220d_c0ad_4dc5_b18f_54982efd83d3.slice. Sep 16 05:01:15.350659 systemd[1]: Created slice kubepods-besteffort-podb9383846_fcb6_40c1_80aa_bb985253ce8e.slice - libcontainer container kubepods-besteffort-podb9383846_fcb6_40c1_80aa_bb985253ce8e.slice. Sep 16 05:01:15.373141 containerd[1918]: time="2025-09-16T05:01:15.373115406Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"2dc91b48d46634f70126fd43a708ccb89f306fec74ace932235b32e20b2033e2\" pid:4803 exit_status:1 exited_at:{seconds:1757998875 nanos:372951383}" Sep 16 05:01:15.392493 kubelet[3290]: I0916 05:01:15.392476 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5nvr\" (UniqueName: \"kubernetes.io/projected/b9383846-fcb6-40c1-80aa-bb985253ce8e-kube-api-access-v5nvr\") pod \"whisker-7d5456bb9-rkbdg\" (UID: \"b9383846-fcb6-40c1-80aa-bb985253ce8e\") " pod="calico-system/whisker-7d5456bb9-rkbdg" Sep 16 05:01:15.392671 kubelet[3290]: I0916 05:01:15.392500 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9383846-fcb6-40c1-80aa-bb985253ce8e-whisker-ca-bundle\") pod \"whisker-7d5456bb9-rkbdg\" (UID: \"b9383846-fcb6-40c1-80aa-bb985253ce8e\") " pod="calico-system/whisker-7d5456bb9-rkbdg" Sep 16 05:01:15.392671 kubelet[3290]: I0916 05:01:15.392543 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b9383846-fcb6-40c1-80aa-bb985253ce8e-whisker-backend-key-pair\") pod \"whisker-7d5456bb9-rkbdg\" (UID: \"b9383846-fcb6-40c1-80aa-bb985253ce8e\") " pod="calico-system/whisker-7d5456bb9-rkbdg" Sep 16 05:01:15.652227 containerd[1918]: time="2025-09-16T05:01:15.652202934Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-7d5456bb9-rkbdg,Uid:b9383846-fcb6-40c1-80aa-bb985253ce8e,Namespace:calico-system,Attempt:0,}" Sep 16 05:01:15.664482 systemd-networkd[1833]: vxlan.calico: Link UP Sep 16 05:01:15.664486 systemd-networkd[1833]: vxlan.calico: Gained carrier Sep 16 05:01:15.706518 systemd-networkd[1833]: cali2015a67e77c: Link UP Sep 16 05:01:15.706703 systemd-networkd[1833]: cali2015a67e77c: Gained carrier Sep 16 05:01:15.713934 containerd[1918]: 2025-09-16 05:01:15.675 [INFO][5017] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0 whisker-7d5456bb9- calico-system b9383846-fcb6-40c1-80aa-bb985253ce8e 888 0 2025-09-16 05:01:15 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7d5456bb9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.0.0-n-32926c0571 whisker-7d5456bb9-rkbdg eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2015a67e77c [] [] }} ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Namespace="calico-system" Pod="whisker-7d5456bb9-rkbdg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-" Sep 16 05:01:15.713934 containerd[1918]: 2025-09-16 05:01:15.675 [INFO][5017] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Namespace="calico-system" Pod="whisker-7d5456bb9-rkbdg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" Sep 16 05:01:15.713934 containerd[1918]: 2025-09-16 05:01:15.688 [INFO][5087] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" HandleID="k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Workload="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.688 [INFO][5087] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" HandleID="k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Workload="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000433490), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-n-32926c0571", "pod":"whisker-7d5456bb9-rkbdg", "timestamp":"2025-09-16 05:01:15.688121488 +0000 UTC"}, Hostname:"ci-4459.0.0-n-32926c0571", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.688 [INFO][5087] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.688 [INFO][5087] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.688 [INFO][5087] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-32926c0571' Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.692 [INFO][5087] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.694 [INFO][5087] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.696 [INFO][5087] ipam/ipam.go 511: Trying affinity for 192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.697 [INFO][5087] ipam/ipam.go 158: Attempting to load block cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714102 containerd[1918]: 2025-09-16 05:01:15.698 [INFO][5087] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714241 containerd[1918]: 2025-09-16 05:01:15.698 [INFO][5087] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.23.192/26 handle="k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714241 containerd[1918]: 2025-09-16 05:01:15.699 [INFO][5087] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e Sep 16 05:01:15.714241 containerd[1918]: 2025-09-16 05:01:15.701 [INFO][5087] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.23.192/26 handle="k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714241 containerd[1918]: 2025-09-16 05:01:15.703 [INFO][5087] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.23.193/26] block=192.168.23.192/26 handle="k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714241 containerd[1918]: 2025-09-16 05:01:15.703 [INFO][5087] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.23.193/26] handle="k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:15.714241 containerd[1918]: 2025-09-16 05:01:15.703 [INFO][5087] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
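The ipam/ipam.go lines above trace Calico's block-based assignment: take the host-wide IPAM lock, confirm the node's affinity for the 192.168.23.192/26 block, claim a free address (192.168.23.193 here), and release the lock. A toy in-memory Go sketch of that flow, standing in for the real datastore-backed implementation; the starting point is only illustrative (the logged run starts at .193, presumably because an earlier allocation such as the node's tunnel address already took .192):

// Illustrative stand-in for Calico's block-based IPAM, mirroring the logged
// steps: lock, affinity lookup, claim next free address, unlock.
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

type block struct {
	cidr      netip.Prefix
	allocated map[netip.Addr]string // addr -> handle
}

type ipam struct {
	mu       sync.Mutex // the "host-wide IPAM lock" in the log
	affinity map[string]*block
}

// autoAssign claims one free IPv4 address from the block affine to host.
func (p *ipam) autoAssign(host, handle string) (netip.Addr, error) {
	p.mu.Lock()         // "About to acquire host-wide IPAM lock."
	defer p.mu.Unlock() // "Released host-wide IPAM lock."

	b, ok := p.affinity[host]
	if !ok {
		return netip.Addr{}, fmt.Errorf("no block affinity for host %s", host)
	}
	for a := b.cidr.Addr(); b.cidr.Contains(a); a = a.Next() {
		if _, used := b.allocated[a]; !used {
			b.allocated[a] = handle
			return a, nil
		}
	}
	return netip.Addr{}, fmt.Errorf("block %s exhausted", b.cidr)
}

func main() {
	host := "ci-4459.0.0-n-32926c0571"
	p := &ipam{affinity: map[string]*block{
		host: {
			cidr:      netip.MustParsePrefix("192.168.23.192/26"),
			allocated: map[netip.Addr]string{},
		},
	}}
	for _, pod := range []string{"whisker-7d5456bb9-rkbdg", "calico-apiserver-75499dcbcd-cz8mk", "coredns-7c65d6cfc9-tgnt6"} {
		ip, err := p.autoAssign(host, pod)
		if err != nil {
			fmt.Println(err)
			return
		}
		fmt.Printf("%s -> %s\n", pod, ip) // toy output starts at .192; the logged run starts at .193
	}
}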
Sep 16 05:01:15.714241 containerd[1918]: 2025-09-16 05:01:15.703 [INFO][5087] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.23.193/26] IPv6=[] ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" HandleID="k8s-pod-network.69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Workload="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" Sep 16 05:01:15.714390 containerd[1918]: 2025-09-16 05:01:15.705 [INFO][5017] cni-plugin/k8s.go 418: Populated endpoint ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Namespace="calico-system" Pod="whisker-7d5456bb9-rkbdg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0", GenerateName:"whisker-7d5456bb9-", Namespace:"calico-system", SelfLink:"", UID:"b9383846-fcb6-40c1-80aa-bb985253ce8e", ResourceVersion:"888", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 1, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7d5456bb9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"", Pod:"whisker-7d5456bb9-rkbdg", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.23.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2015a67e77c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:15.714390 containerd[1918]: 2025-09-16 05:01:15.705 [INFO][5017] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.23.193/32] ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Namespace="calico-system" Pod="whisker-7d5456bb9-rkbdg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" Sep 16 05:01:15.714467 containerd[1918]: 2025-09-16 05:01:15.705 [INFO][5017] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2015a67e77c ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Namespace="calico-system" Pod="whisker-7d5456bb9-rkbdg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" Sep 16 05:01:15.714467 containerd[1918]: 2025-09-16 05:01:15.706 [INFO][5017] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Namespace="calico-system" Pod="whisker-7d5456bb9-rkbdg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" Sep 16 05:01:15.714519 containerd[1918]: 2025-09-16 05:01:15.707 [INFO][5017] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Namespace="calico-system" 
Pod="whisker-7d5456bb9-rkbdg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0", GenerateName:"whisker-7d5456bb9-", Namespace:"calico-system", SelfLink:"", UID:"b9383846-fcb6-40c1-80aa-bb985253ce8e", ResourceVersion:"888", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 1, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7d5456bb9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e", Pod:"whisker-7d5456bb9-rkbdg", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.23.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2015a67e77c", MAC:"e6:e1:35:c4:7c:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:15.714576 containerd[1918]: 2025-09-16 05:01:15.712 [INFO][5017] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" Namespace="calico-system" Pod="whisker-7d5456bb9-rkbdg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-whisker--7d5456bb9--rkbdg-eth0" Sep 16 05:01:15.722552 containerd[1918]: time="2025-09-16T05:01:15.722526738Z" level=info msg="connecting to shim 69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e" address="unix:///run/containerd/s/e8482c9d9d249da8b6d8a51f39b4da7d52a1b5586a4f5fb8a06af7f6dec12a3e" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:01:15.747208 systemd[1]: Started cri-containerd-69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e.scope - libcontainer container 69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e. 
Sep 16 05:01:15.772391 containerd[1918]: time="2025-09-16T05:01:15.772367082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d5456bb9-rkbdg,Uid:b9383846-fcb6-40c1-80aa-bb985253ce8e,Namespace:calico-system,Attempt:0,} returns sandbox id \"69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e\"" Sep 16 05:01:15.773038 containerd[1918]: time="2025-09-16T05:01:15.773025533Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 16 05:01:16.187194 kubelet[3290]: I0916 05:01:16.187144 3290 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bee220d-c0ad-4dc5-b18f-54982efd83d3" path="/var/lib/kubelet/pods/9bee220d-c0ad-4dc5-b18f-54982efd83d3/volumes" Sep 16 05:01:16.855382 systemd-networkd[1833]: vxlan.calico: Gained IPv6LL Sep 16 05:01:17.751336 systemd-networkd[1833]: cali2015a67e77c: Gained IPv6LL Sep 16 05:01:19.187222 containerd[1918]: time="2025-09-16T05:01:19.187098085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75499dcbcd-cz8mk,Uid:2aca797e-97f5-4c40-8082-71bf715e6951,Namespace:calico-apiserver,Attempt:0,}" Sep 16 05:01:19.246206 systemd-networkd[1833]: calia8c6b69d5ac: Link UP Sep 16 05:01:19.246476 systemd-networkd[1833]: calia8c6b69d5ac: Gained carrier Sep 16 05:01:19.253658 containerd[1918]: 2025-09-16 05:01:19.207 [INFO][5198] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0 calico-apiserver-75499dcbcd- calico-apiserver 2aca797e-97f5-4c40-8082-71bf715e6951 822 0 2025-09-16 05:00:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75499dcbcd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.0.0-n-32926c0571 calico-apiserver-75499dcbcd-cz8mk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia8c6b69d5ac [] [] }} ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-cz8mk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-" Sep 16 05:01:19.253658 containerd[1918]: 2025-09-16 05:01:19.207 [INFO][5198] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-cz8mk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" Sep 16 05:01:19.253658 containerd[1918]: 2025-09-16 05:01:19.221 [INFO][5218] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" HandleID="k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.221 [INFO][5218] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" HandleID="k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc00004eab0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.0.0-n-32926c0571", "pod":"calico-apiserver-75499dcbcd-cz8mk", "timestamp":"2025-09-16 05:01:19.221897767 +0000 UTC"}, Hostname:"ci-4459.0.0-n-32926c0571", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.222 [INFO][5218] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.222 [INFO][5218] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.222 [INFO][5218] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-32926c0571' Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.226 [INFO][5218] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.229 [INFO][5218] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.233 [INFO][5218] ipam/ipam.go 511: Trying affinity for 192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.234 [INFO][5218] ipam/ipam.go 158: Attempting to load block cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.253918 containerd[1918]: 2025-09-16 05:01:19.236 [INFO][5218] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.254365 containerd[1918]: 2025-09-16 05:01:19.236 [INFO][5218] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.23.192/26 handle="k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.254365 containerd[1918]: 2025-09-16 05:01:19.237 [INFO][5218] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0 Sep 16 05:01:19.254365 containerd[1918]: 2025-09-16 05:01:19.239 [INFO][5218] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.23.192/26 handle="k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.254365 containerd[1918]: 2025-09-16 05:01:19.243 [INFO][5218] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.23.194/26] block=192.168.23.192/26 handle="k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.254365 containerd[1918]: 2025-09-16 05:01:19.243 [INFO][5218] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.23.194/26] handle="k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:19.254365 containerd[1918]: 2025-09-16 05:01:19.243 [INFO][5218] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
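The second assignment comes out of the same 192.168.23.192/26 affinity and yields 192.168.23.194. A /26 gives this node 64 addresses (192.168.23.192 through 192.168.23.255), so the per-node block easily covers the pods scheduled here; a quick net/netip check, assuming only the block printed in the log:

// Quick arithmetic on the per-node IPAM block shown in the log.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.23.192/26")
	size := 1 << (32 - block.Bits()) // 2^(32-26) = 64 addresses

	first := block.Addr()
	last := first
	for i := 0; i < size-1; i++ {
		last = last.Next()
	}
	fmt.Printf("block %s: %d addresses, %s - %s\n", block, size, first, last)

	for _, ip := range []string{"192.168.23.193", "192.168.23.194", "192.168.23.195"} {
		fmt.Printf("  %s in block: %v\n", ip, block.Contains(netip.MustParseAddr(ip)))
	}
}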
Sep 16 05:01:19.254365 containerd[1918]: 2025-09-16 05:01:19.243 [INFO][5218] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.23.194/26] IPv6=[] ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" HandleID="k8s-pod-network.b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" Sep 16 05:01:19.254658 containerd[1918]: 2025-09-16 05:01:19.244 [INFO][5198] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-cz8mk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0", GenerateName:"calico-apiserver-75499dcbcd-", Namespace:"calico-apiserver", SelfLink:"", UID:"2aca797e-97f5-4c40-8082-71bf715e6951", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75499dcbcd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"", Pod:"calico-apiserver-75499dcbcd-cz8mk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.23.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia8c6b69d5ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:19.254766 containerd[1918]: 2025-09-16 05:01:19.244 [INFO][5198] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.23.194/32] ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-cz8mk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" Sep 16 05:01:19.254766 containerd[1918]: 2025-09-16 05:01:19.244 [INFO][5198] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia8c6b69d5ac ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-cz8mk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" Sep 16 05:01:19.254766 containerd[1918]: 2025-09-16 05:01:19.246 [INFO][5198] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-cz8mk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" Sep 16 05:01:19.254903 containerd[1918]: 2025-09-16 05:01:19.246 
[INFO][5198] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-cz8mk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0", GenerateName:"calico-apiserver-75499dcbcd-", Namespace:"calico-apiserver", SelfLink:"", UID:"2aca797e-97f5-4c40-8082-71bf715e6951", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75499dcbcd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0", Pod:"calico-apiserver-75499dcbcd-cz8mk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.23.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia8c6b69d5ac", MAC:"9e:b0:e6:95:76:5b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:19.255011 containerd[1918]: 2025-09-16 05:01:19.252 [INFO][5198] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-cz8mk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--cz8mk-eth0" Sep 16 05:01:19.262978 containerd[1918]: time="2025-09-16T05:01:19.262944118Z" level=info msg="connecting to shim b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0" address="unix:///run/containerd/s/8dee291abe8507205abbee8d98f03790d380d6b841902bfc0df6ed73d51d9dd6" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:01:19.284190 systemd[1]: Started cri-containerd-b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0.scope - libcontainer container b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0. 
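Each "connecting to shim" line names the per-sandbox ttrpc endpoint under /run/containerd/s/ that the runtime uses to drive the container. A rough connectivity probe against such a socket could look like the sketch below; it only dials the unix socket (it does not speak ttrpc), and the path is simply the one from the log above, which would differ on any other host:

// Rough connectivity probe for a containerd shim socket; this only checks
// that the unix socket accepts a connection.
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	sock := "/run/containerd/s/8dee291abe8507205abbee8d98f03790d380d6b841902bfc0df6ed73d51d9dd6"
	if len(os.Args) > 1 {
		sock = os.Args[1] // pass a different shim socket path if needed
	}
	conn, err := net.DialTimeout("unix", sock, 2*time.Second)
	if err != nil {
		fmt.Fprintln(os.Stderr, "shim socket not reachable:", err)
		os.Exit(1)
	}
	defer conn.Close()
	fmt.Println("shim socket reachable:", sock)
}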
Sep 16 05:01:19.311174 containerd[1918]: time="2025-09-16T05:01:19.311152674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75499dcbcd-cz8mk,Uid:2aca797e-97f5-4c40-8082-71bf715e6951,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0\"" Sep 16 05:01:20.186705 containerd[1918]: time="2025-09-16T05:01:20.186616094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-tgnt6,Uid:b48014a9-760b-4104-b099-afc0fb5f5cd5,Namespace:kube-system,Attempt:0,}" Sep 16 05:01:20.187155 containerd[1918]: time="2025-09-16T05:01:20.187013714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75499dcbcd-phsx9,Uid:ac107799-03db-4410-b2fb-2b6d2c05b1d8,Namespace:calico-apiserver,Attempt:0,}" Sep 16 05:01:20.245075 systemd-networkd[1833]: calib8697f33f27: Link UP Sep 16 05:01:20.245289 systemd-networkd[1833]: calib8697f33f27: Gained carrier Sep 16 05:01:20.252060 containerd[1918]: 2025-09-16 05:01:20.207 [INFO][5285] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0 coredns-7c65d6cfc9- kube-system b48014a9-760b-4104-b099-afc0fb5f5cd5 824 0 2025-09-16 05:00:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.0.0-n-32926c0571 coredns-7c65d6cfc9-tgnt6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib8697f33f27 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tgnt6" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-" Sep 16 05:01:20.252060 containerd[1918]: 2025-09-16 05:01:20.207 [INFO][5285] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tgnt6" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" Sep 16 05:01:20.252060 containerd[1918]: 2025-09-16 05:01:20.220 [INFO][5333] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" HandleID="k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Workload="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" Sep 16 05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.220 [INFO][5333] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" HandleID="k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Workload="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000345ab0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.0.0-n-32926c0571", "pod":"coredns-7c65d6cfc9-tgnt6", "timestamp":"2025-09-16 05:01:20.220769518 +0000 UTC"}, Hostname:"ci-4459.0.0-n-32926c0571", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 
05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.220 [INFO][5333] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.220 [INFO][5333] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.220 [INFO][5333] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-32926c0571' Sep 16 05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.225 [INFO][5333] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.229 [INFO][5333] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.233 [INFO][5333] ipam/ipam.go 511: Trying affinity for 192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.235 [INFO][5333] ipam/ipam.go 158: Attempting to load block cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252497 containerd[1918]: 2025-09-16 05:01:20.236 [INFO][5333] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252822 containerd[1918]: 2025-09-16 05:01:20.236 [INFO][5333] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.23.192/26 handle="k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252822 containerd[1918]: 2025-09-16 05:01:20.237 [INFO][5333] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e Sep 16 05:01:20.252822 containerd[1918]: 2025-09-16 05:01:20.240 [INFO][5333] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.23.192/26 handle="k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252822 containerd[1918]: 2025-09-16 05:01:20.242 [INFO][5333] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.23.195/26] block=192.168.23.192/26 handle="k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252822 containerd[1918]: 2025-09-16 05:01:20.242 [INFO][5333] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.23.195/26] handle="k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.252822 containerd[1918]: 2025-09-16 05:01:20.242 [INFO][5333] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
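The bracketed Calico timestamps above put the host-wide IPAM lock acquisition at 05:01:20.220 and its release at 05:01:20.242, so the lock was held for roughly 22 ms while the coredns address was claimed. A small sketch of that arithmetic (timestamps copied from the trace; the layout string is Go's reference time format):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the coredns IPAM trace above.
	const layout = "2006-01-02 15:04:05.000"
	acquired, _ := time.Parse(layout, "2025-09-16 05:01:20.220")
	released, _ := time.Parse(layout, "2025-09-16 05:01:20.242")

	fmt.Println("host-wide IPAM lock held for", released.Sub(acquired)) // 22ms
}
```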
Sep 16 05:01:20.252822 containerd[1918]: 2025-09-16 05:01:20.242 [INFO][5333] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.23.195/26] IPv6=[] ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" HandleID="k8s-pod-network.5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Workload="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" Sep 16 05:01:20.253056 containerd[1918]: 2025-09-16 05:01:20.243 [INFO][5285] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tgnt6" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b48014a9-760b-4104-b099-afc0fb5f5cd5", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"", Pod:"coredns-7c65d6cfc9-tgnt6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.23.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8697f33f27", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:20.253056 containerd[1918]: 2025-09-16 05:01:20.244 [INFO][5285] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.23.195/32] ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tgnt6" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" Sep 16 05:01:20.253056 containerd[1918]: 2025-09-16 05:01:20.244 [INFO][5285] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib8697f33f27 ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tgnt6" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" Sep 16 05:01:20.253056 containerd[1918]: 2025-09-16 05:01:20.245 [INFO][5285] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-tgnt6" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" Sep 16 05:01:20.253056 containerd[1918]: 2025-09-16 05:01:20.245 [INFO][5285] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tgnt6" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b48014a9-760b-4104-b099-afc0fb5f5cd5", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e", Pod:"coredns-7c65d6cfc9-tgnt6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.23.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8697f33f27", MAC:"da:a4:d7:eb:59:8a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:20.253056 containerd[1918]: 2025-09-16 05:01:20.251 [INFO][5285] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tgnt6" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--tgnt6-eth0" Sep 16 05:01:20.261162 containerd[1918]: time="2025-09-16T05:01:20.261138432Z" level=info msg="connecting to shim 5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e" address="unix:///run/containerd/s/f1043315a2d736433d82b94cf1cace869463e6b0f9ba329f47dcf302e8ca15bf" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:01:20.283382 systemd[1]: Started cri-containerd-5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e.scope - libcontainer container 5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e. 
Sep 16 05:01:20.356481 systemd-networkd[1833]: cali53e3a499d74: Link UP Sep 16 05:01:20.356733 systemd-networkd[1833]: cali53e3a499d74: Gained carrier Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.208 [INFO][5291] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0 calico-apiserver-75499dcbcd- calico-apiserver ac107799-03db-4410-b2fb-2b6d2c05b1d8 825 0 2025-09-16 05:00:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75499dcbcd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.0.0-n-32926c0571 calico-apiserver-75499dcbcd-phsx9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali53e3a499d74 [] [] }} ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-phsx9" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.208 [INFO][5291] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-phsx9" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.220 [INFO][5335] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" HandleID="k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.221 [INFO][5335] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" HandleID="k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001396c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.0.0-n-32926c0571", "pod":"calico-apiserver-75499dcbcd-phsx9", "timestamp":"2025-09-16 05:01:20.220952148 +0000 UTC"}, Hostname:"ci-4459.0.0-n-32926c0571", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.221 [INFO][5335] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.242 [INFO][5335] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.242 [INFO][5335] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-32926c0571' Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.327 [INFO][5335] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.336 [INFO][5335] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.341 [INFO][5335] ipam/ipam.go 511: Trying affinity for 192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.342 [INFO][5335] ipam/ipam.go 158: Attempting to load block cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.345 [INFO][5335] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.345 [INFO][5335] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.23.192/26 handle="k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.346 [INFO][5335] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7 Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.350 [INFO][5335] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.23.192/26 handle="k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.353 [INFO][5335] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.23.196/26] block=192.168.23.192/26 handle="k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.353 [INFO][5335] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.23.196/26] handle="k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.353 [INFO][5335] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 05:01:20.364310 containerd[1918]: 2025-09-16 05:01:20.353 [INFO][5335] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.23.196/26] IPv6=[] ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" HandleID="k8s-pod-network.bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" Sep 16 05:01:20.364826 containerd[1918]: 2025-09-16 05:01:20.355 [INFO][5291] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-phsx9" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0", GenerateName:"calico-apiserver-75499dcbcd-", Namespace:"calico-apiserver", SelfLink:"", UID:"ac107799-03db-4410-b2fb-2b6d2c05b1d8", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75499dcbcd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"", Pod:"calico-apiserver-75499dcbcd-phsx9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.23.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali53e3a499d74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:20.364826 containerd[1918]: 2025-09-16 05:01:20.355 [INFO][5291] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.23.196/32] ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-phsx9" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" Sep 16 05:01:20.364826 containerd[1918]: 2025-09-16 05:01:20.355 [INFO][5291] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali53e3a499d74 ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-phsx9" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" Sep 16 05:01:20.364826 containerd[1918]: 2025-09-16 05:01:20.356 [INFO][5291] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-phsx9" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" Sep 16 05:01:20.364826 containerd[1918]: 2025-09-16 05:01:20.357 
[INFO][5291] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-phsx9" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0", GenerateName:"calico-apiserver-75499dcbcd-", Namespace:"calico-apiserver", SelfLink:"", UID:"ac107799-03db-4410-b2fb-2b6d2c05b1d8", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75499dcbcd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7", Pod:"calico-apiserver-75499dcbcd-phsx9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.23.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali53e3a499d74", MAC:"06:89:8f:ee:ab:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:20.364826 containerd[1918]: 2025-09-16 05:01:20.362 [INFO][5291] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" Namespace="calico-apiserver" Pod="calico-apiserver-75499dcbcd-phsx9" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--apiserver--75499dcbcd--phsx9-eth0" Sep 16 05:01:20.365208 containerd[1918]: time="2025-09-16T05:01:20.365186326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-tgnt6,Uid:b48014a9-760b-4104-b099-afc0fb5f5cd5,Namespace:kube-system,Attempt:0,} returns sandbox id \"5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e\"" Sep 16 05:01:20.366498 containerd[1918]: time="2025-09-16T05:01:20.366475493Z" level=info msg="CreateContainer within sandbox \"5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 05:01:20.370629 containerd[1918]: time="2025-09-16T05:01:20.370602479Z" level=info msg="Container 5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:20.373221 containerd[1918]: time="2025-09-16T05:01:20.373193841Z" level=info msg="connecting to shim bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7" address="unix:///run/containerd/s/cdee89c03d788af5a42af953e2e2d81ac775dfcb89b1d883e227a54dba35303b" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:01:20.373292 containerd[1918]: time="2025-09-16T05:01:20.373264817Z" level=info msg="CreateContainer 
within sandbox \"5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c\"" Sep 16 05:01:20.373552 containerd[1918]: time="2025-09-16T05:01:20.373538513Z" level=info msg="StartContainer for \"5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c\"" Sep 16 05:01:20.374071 containerd[1918]: time="2025-09-16T05:01:20.374021348Z" level=info msg="connecting to shim 5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c" address="unix:///run/containerd/s/f1043315a2d736433d82b94cf1cace869463e6b0f9ba329f47dcf302e8ca15bf" protocol=ttrpc version=3 Sep 16 05:01:20.395338 systemd[1]: Started cri-containerd-5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c.scope - libcontainer container 5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c. Sep 16 05:01:20.397149 systemd[1]: Started cri-containerd-bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7.scope - libcontainer container bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7. Sep 16 05:01:20.408508 containerd[1918]: time="2025-09-16T05:01:20.408483712Z" level=info msg="StartContainer for \"5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c\" returns successfully" Sep 16 05:01:20.434885 containerd[1918]: time="2025-09-16T05:01:20.434831682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75499dcbcd-phsx9,Uid:ac107799-03db-4410-b2fb-2b6d2c05b1d8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7\"" Sep 16 05:01:20.889206 containerd[1918]: time="2025-09-16T05:01:20.889158048Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:20.889295 containerd[1918]: time="2025-09-16T05:01:20.889271544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 16 05:01:20.889732 containerd[1918]: time="2025-09-16T05:01:20.889695404Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:20.890600 containerd[1918]: time="2025-09-16T05:01:20.890588084Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:20.891043 containerd[1918]: time="2025-09-16T05:01:20.891019434Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 5.117973302s" Sep 16 05:01:20.891043 containerd[1918]: time="2025-09-16T05:01:20.891039237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 16 05:01:20.891603 containerd[1918]: time="2025-09-16T05:01:20.891592111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 05:01:20.892119 containerd[1918]: 
time="2025-09-16T05:01:20.892106958Z" level=info msg="CreateContainer within sandbox \"69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 16 05:01:20.894586 containerd[1918]: time="2025-09-16T05:01:20.894550232Z" level=info msg="Container 6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:20.897346 containerd[1918]: time="2025-09-16T05:01:20.897309960Z" level=info msg="CreateContainer within sandbox \"69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9\"" Sep 16 05:01:20.897498 containerd[1918]: time="2025-09-16T05:01:20.897483826Z" level=info msg="StartContainer for \"6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9\"" Sep 16 05:01:20.898105 containerd[1918]: time="2025-09-16T05:01:20.898048454Z" level=info msg="connecting to shim 6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9" address="unix:///run/containerd/s/e8482c9d9d249da8b6d8a51f39b4da7d52a1b5586a4f5fb8a06af7f6dec12a3e" protocol=ttrpc version=3 Sep 16 05:01:20.915342 systemd[1]: Started cri-containerd-6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9.scope - libcontainer container 6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9. Sep 16 05:01:20.948688 containerd[1918]: time="2025-09-16T05:01:20.948662539Z" level=info msg="StartContainer for \"6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9\" returns successfully" Sep 16 05:01:21.015216 systemd-networkd[1833]: calia8c6b69d5ac: Gained IPv6LL Sep 16 05:01:21.186619 containerd[1918]: time="2025-09-16T05:01:21.186411924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-7wtzg,Uid:767fa079-891d-454b-9bcf-96567c6c98e4,Namespace:calico-system,Attempt:0,}" Sep 16 05:01:21.186842 containerd[1918]: time="2025-09-16T05:01:21.186428593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-89f5b8df-cq9w2,Uid:bece35df-f07e-4c73-80a2-0b5b3228348f,Namespace:calico-system,Attempt:0,}" Sep 16 05:01:21.262418 systemd-networkd[1833]: cali27a45bee03b: Link UP Sep 16 05:01:21.262576 systemd-networkd[1833]: cali27a45bee03b: Gained carrier Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.223 [INFO][5587] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0 calico-kube-controllers-89f5b8df- calico-system bece35df-f07e-4c73-80a2-0b5b3228348f 819 0 2025-09-16 05:00:50 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:89f5b8df projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.0.0-n-32926c0571 calico-kube-controllers-89f5b8df-cq9w2 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali27a45bee03b [] [] }} ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Namespace="calico-system" Pod="calico-kube-controllers-89f5b8df-cq9w2" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.223 
[INFO][5587] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Namespace="calico-system" Pod="calico-kube-controllers-89f5b8df-cq9w2" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.238 [INFO][5628] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" HandleID="k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.238 [INFO][5628] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" HandleID="k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00043a080), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-n-32926c0571", "pod":"calico-kube-controllers-89f5b8df-cq9w2", "timestamp":"2025-09-16 05:01:21.238020057 +0000 UTC"}, Hostname:"ci-4459.0.0-n-32926c0571", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.238 [INFO][5628] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.238 [INFO][5628] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.238 [INFO][5628] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-32926c0571' Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.243 [INFO][5628] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.247 [INFO][5628] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.250 [INFO][5628] ipam/ipam.go 511: Trying affinity for 192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.251 [INFO][5628] ipam/ipam.go 158: Attempting to load block cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.253 [INFO][5628] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.253 [INFO][5628] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.23.192/26 handle="k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.254 [INFO][5628] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5 Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.257 [INFO][5628] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.23.192/26 handle="k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.260 [INFO][5628] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.23.197/26] block=192.168.23.192/26 handle="k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.260 [INFO][5628] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.23.197/26] handle="k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.260 [INFO][5628] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
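A few entries above, containerd reports pulling ghcr.io/flatcar/calico/whisker:v3.30.3 with a reported size of 6,153,986 bytes in 5.117973302 s, i.e. roughly 1.15 MiB/s. A quick sketch of that arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Figures from the "Pulled image .../whisker:v3.30.3" entry above.
	const sizeBytes = 6153986
	dur, _ := time.ParseDuration("5.117973302s")

	mib := float64(sizeBytes) / (1 << 20)
	fmt.Printf("pulled %.2f MiB in %v (%.2f MiB/s)\n", mib, dur, mib/dur.Seconds())
}
```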
Sep 16 05:01:21.267635 containerd[1918]: 2025-09-16 05:01:21.260 [INFO][5628] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.23.197/26] IPv6=[] ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" HandleID="k8s-pod-network.f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Workload="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" Sep 16 05:01:21.268175 containerd[1918]: 2025-09-16 05:01:21.261 [INFO][5587] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Namespace="calico-system" Pod="calico-kube-controllers-89f5b8df-cq9w2" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0", GenerateName:"calico-kube-controllers-89f5b8df-", Namespace:"calico-system", SelfLink:"", UID:"bece35df-f07e-4c73-80a2-0b5b3228348f", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"89f5b8df", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"", Pod:"calico-kube-controllers-89f5b8df-cq9w2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.23.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali27a45bee03b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:21.268175 containerd[1918]: 2025-09-16 05:01:21.261 [INFO][5587] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.23.197/32] ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Namespace="calico-system" Pod="calico-kube-controllers-89f5b8df-cq9w2" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" Sep 16 05:01:21.268175 containerd[1918]: 2025-09-16 05:01:21.261 [INFO][5587] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali27a45bee03b ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Namespace="calico-system" Pod="calico-kube-controllers-89f5b8df-cq9w2" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" Sep 16 05:01:21.268175 containerd[1918]: 2025-09-16 05:01:21.262 [INFO][5587] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Namespace="calico-system" Pod="calico-kube-controllers-89f5b8df-cq9w2" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" Sep 16 
05:01:21.268175 containerd[1918]: 2025-09-16 05:01:21.262 [INFO][5587] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Namespace="calico-system" Pod="calico-kube-controllers-89f5b8df-cq9w2" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0", GenerateName:"calico-kube-controllers-89f5b8df-", Namespace:"calico-system", SelfLink:"", UID:"bece35df-f07e-4c73-80a2-0b5b3228348f", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"89f5b8df", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5", Pod:"calico-kube-controllers-89f5b8df-cq9w2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.23.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali27a45bee03b", MAC:"ea:d9:7e:d1:d6:8a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:21.268175 containerd[1918]: 2025-09-16 05:01:21.266 [INFO][5587] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" Namespace="calico-system" Pod="calico-kube-controllers-89f5b8df-cq9w2" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-calico--kube--controllers--89f5b8df--cq9w2-eth0" Sep 16 05:01:21.275729 containerd[1918]: time="2025-09-16T05:01:21.275700673Z" level=info msg="connecting to shim f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5" address="unix:///run/containerd/s/6fcf1e0c1b2af769fc65c577c1543f708b44ce962620d562b8d621f4c41b914d" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:01:21.305216 systemd[1]: Started cri-containerd-f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5.scope - libcontainer container f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5. 
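Each "Populated endpoint" / "Added Mac, interface name, and active container ID" entry dumps the entire projectcalico.org/v3 WorkloadEndpoint inline, which is hard to scan. For annotation purposes, here is an illustrative struct holding only the fields these dumps actually vary on, filled with the calico-kube-controllers values from the entry above (a trimmed sketch, not the real Calico API type):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// endpoint is an illustrative subset of the fields seen in the
// WorkloadEndpoint dumps above; it is not the real Calico v3 type.
type endpoint struct {
	Node          string   `json:"node"`
	Pod           string   `json:"pod"`
	ContainerID   string   `json:"containerID"`
	InterfaceName string   `json:"interfaceName"`
	MAC           string   `json:"mac"`
	IPNetworks    []string `json:"ipNetworks"`
	Profiles      []string `json:"profiles"`
}

func main() {
	// Values copied from the calico-kube-controllers endpoint above.
	ep := endpoint{
		Node:          "ci-4459.0.0-n-32926c0571",
		Pod:           "calico-kube-controllers-89f5b8df-cq9w2",
		ContainerID:   "f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5",
		InterfaceName: "cali27a45bee03b",
		MAC:           "ea:d9:7e:d1:d6:8a",
		IPNetworks:    []string{"192.168.23.197/32"},
		Profiles:      []string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"},
	}
	out, _ := json.MarshalIndent(ep, "", "  ")
	fmt.Println(string(out))
}
```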
Sep 16 05:01:21.338219 containerd[1918]: time="2025-09-16T05:01:21.338197814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-89f5b8df-cq9w2,Uid:bece35df-f07e-4c73-80a2-0b5b3228348f,Namespace:calico-system,Attempt:0,} returns sandbox id \"f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5\"" Sep 16 05:01:21.345117 kubelet[3290]: I0916 05:01:21.345072 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-tgnt6" podStartSLOduration=41.34505595 podStartE2EDuration="41.34505595s" podCreationTimestamp="2025-09-16 05:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:01:21.344762086 +0000 UTC m=+47.203291333" watchObservedRunningTime="2025-09-16 05:01:21.34505595 +0000 UTC m=+47.203585195" Sep 16 05:01:21.358900 systemd-networkd[1833]: cali12ced012fb5: Link UP Sep 16 05:01:21.359053 systemd-networkd[1833]: cali12ced012fb5: Gained carrier Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.223 [INFO][5582] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0 goldmane-7988f88666- calico-system 767fa079-891d-454b-9bcf-96567c6c98e4 821 0 2025-09-16 05:00:50 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.0.0-n-32926c0571 goldmane-7988f88666-7wtzg eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali12ced012fb5 [] [] }} ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Namespace="calico-system" Pod="goldmane-7988f88666-7wtzg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.223 [INFO][5582] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Namespace="calico-system" Pod="goldmane-7988f88666-7wtzg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.238 [INFO][5626] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" HandleID="k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Workload="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.238 [INFO][5626] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" HandleID="k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Workload="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef680), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-n-32926c0571", "pod":"goldmane-7988f88666-7wtzg", "timestamp":"2025-09-16 05:01:21.238018193 +0000 UTC"}, Hostname:"ci-4459.0.0-n-32926c0571", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.238 [INFO][5626] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.260 [INFO][5626] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.260 [INFO][5626] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-32926c0571' Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.343 [INFO][5626] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.347 [INFO][5626] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.350 [INFO][5626] ipam/ipam.go 511: Trying affinity for 192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.350 [INFO][5626] ipam/ipam.go 158: Attempting to load block cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.352 [INFO][5626] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.352 [INFO][5626] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.23.192/26 handle="k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.353 [INFO][5626] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7 Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.354 [INFO][5626] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.23.192/26 handle="k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.356 [INFO][5626] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.23.198/26] block=192.168.23.192/26 handle="k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.356 [INFO][5626] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.23.198/26] handle="k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.356 [INFO][5626] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 05:01:21.364422 containerd[1918]: 2025-09-16 05:01:21.356 [INFO][5626] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.23.198/26] IPv6=[] ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" HandleID="k8s-pod-network.cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Workload="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" Sep 16 05:01:21.364961 containerd[1918]: 2025-09-16 05:01:21.357 [INFO][5582] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Namespace="calico-system" Pod="goldmane-7988f88666-7wtzg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"767fa079-891d-454b-9bcf-96567c6c98e4", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"", Pod:"goldmane-7988f88666-7wtzg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.23.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali12ced012fb5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:21.364961 containerd[1918]: 2025-09-16 05:01:21.357 [INFO][5582] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.23.198/32] ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Namespace="calico-system" Pod="goldmane-7988f88666-7wtzg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" Sep 16 05:01:21.364961 containerd[1918]: 2025-09-16 05:01:21.358 [INFO][5582] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali12ced012fb5 ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Namespace="calico-system" Pod="goldmane-7988f88666-7wtzg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" Sep 16 05:01:21.364961 containerd[1918]: 2025-09-16 05:01:21.359 [INFO][5582] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Namespace="calico-system" Pod="goldmane-7988f88666-7wtzg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" Sep 16 05:01:21.364961 containerd[1918]: 2025-09-16 05:01:21.359 [INFO][5582] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" 
Namespace="calico-system" Pod="goldmane-7988f88666-7wtzg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"767fa079-891d-454b-9bcf-96567c6c98e4", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7", Pod:"goldmane-7988f88666-7wtzg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.23.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali12ced012fb5", MAC:"66:59:1c:ee:42:01", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:21.364961 containerd[1918]: 2025-09-16 05:01:21.363 [INFO][5582] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" Namespace="calico-system" Pod="goldmane-7988f88666-7wtzg" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-goldmane--7988f88666--7wtzg-eth0" Sep 16 05:01:21.372275 containerd[1918]: time="2025-09-16T05:01:21.372241024Z" level=info msg="connecting to shim cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7" address="unix:///run/containerd/s/6354cb98fdeb4b61cdb6467e11933d23d4e79512bcb6ba7e4cd4e3b0da300cdf" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:01:21.395211 systemd[1]: Started cri-containerd-cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7.scope - libcontainer container cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7. 
Sep 16 05:01:21.422297 containerd[1918]: time="2025-09-16T05:01:21.422243610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-7wtzg,Uid:767fa079-891d-454b-9bcf-96567c6c98e4,Namespace:calico-system,Attempt:0,} returns sandbox id \"cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7\"" Sep 16 05:01:21.655310 systemd-networkd[1833]: calib8697f33f27: Gained IPv6LL Sep 16 05:01:21.911352 systemd-networkd[1833]: cali53e3a499d74: Gained IPv6LL Sep 16 05:01:22.187291 containerd[1918]: time="2025-09-16T05:01:22.187111482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9dgtk,Uid:50bc1259-bb30-4849-97ed-c6fb8e4bcaf9,Namespace:calico-system,Attempt:0,}" Sep 16 05:01:22.274143 systemd-networkd[1833]: calidb9c06c3034: Link UP Sep 16 05:01:22.275028 systemd-networkd[1833]: calidb9c06c3034: Gained carrier Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.206 [INFO][5769] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0 csi-node-driver- calico-system 50bc1259-bb30-4849-97ed-c6fb8e4bcaf9 680 0 2025-09-16 05:00:50 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.0.0-n-32926c0571 csi-node-driver-9dgtk eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calidb9c06c3034 [] [] }} ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Namespace="calico-system" Pod="csi-node-driver-9dgtk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.206 [INFO][5769] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Namespace="calico-system" Pod="csi-node-driver-9dgtk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.218 [INFO][5793] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" HandleID="k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Workload="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.218 [INFO][5793] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" HandleID="k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Workload="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f660), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-n-32926c0571", "pod":"csi-node-driver-9dgtk", "timestamp":"2025-09-16 05:01:22.21867133 +0000 UTC"}, Hostname:"ci-4459.0.0-n-32926c0571", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.218 
[INFO][5793] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.218 [INFO][5793] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.218 [INFO][5793] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-32926c0571' Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.224 [INFO][5793] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.227 [INFO][5793] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.232 [INFO][5793] ipam/ipam.go 511: Trying affinity for 192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.236 [INFO][5793] ipam/ipam.go 158: Attempting to load block cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.241 [INFO][5793] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.241 [INFO][5793] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.23.192/26 handle="k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.244 [INFO][5793] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.254 [INFO][5793] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.23.192/26 handle="k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.265 [INFO][5793] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.23.199/26] block=192.168.23.192/26 handle="k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.265 [INFO][5793] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.23.199/26] handle="k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.266 [INFO][5793] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 05:01:22.297778 containerd[1918]: 2025-09-16 05:01:22.266 [INFO][5793] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.23.199/26] IPv6=[] ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" HandleID="k8s-pod-network.f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Workload="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" Sep 16 05:01:22.298297 containerd[1918]: 2025-09-16 05:01:22.269 [INFO][5769] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Namespace="calico-system" Pod="csi-node-driver-9dgtk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"50bc1259-bb30-4849-97ed-c6fb8e4bcaf9", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"", Pod:"csi-node-driver-9dgtk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.23.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidb9c06c3034", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:22.298297 containerd[1918]: 2025-09-16 05:01:22.270 [INFO][5769] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.23.199/32] ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Namespace="calico-system" Pod="csi-node-driver-9dgtk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" Sep 16 05:01:22.298297 containerd[1918]: 2025-09-16 05:01:22.270 [INFO][5769] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidb9c06c3034 ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Namespace="calico-system" Pod="csi-node-driver-9dgtk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" Sep 16 05:01:22.298297 containerd[1918]: 2025-09-16 05:01:22.275 [INFO][5769] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Namespace="calico-system" Pod="csi-node-driver-9dgtk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" Sep 16 05:01:22.298297 containerd[1918]: 2025-09-16 05:01:22.277 [INFO][5769] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Namespace="calico-system" Pod="csi-node-driver-9dgtk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"50bc1259-bb30-4849-97ed-c6fb8e4bcaf9", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e", Pod:"csi-node-driver-9dgtk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.23.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidb9c06c3034", MAC:"e6:0a:85:03:15:0f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:22.298297 containerd[1918]: 2025-09-16 05:01:22.293 [INFO][5769] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" Namespace="calico-system" Pod="csi-node-driver-9dgtk" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-csi--node--driver--9dgtk-eth0" Sep 16 05:01:22.304867 containerd[1918]: time="2025-09-16T05:01:22.304842104Z" level=info msg="connecting to shim f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e" address="unix:///run/containerd/s/b7f37b36a01852f1da97487ac7824a42302034cfdb99fabba95b6e5b68a726ea" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:01:22.320234 systemd[1]: Started cri-containerd-f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e.scope - libcontainer container f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e. 
Sep 16 05:01:22.337077 containerd[1918]: time="2025-09-16T05:01:22.337055533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9dgtk,Uid:50bc1259-bb30-4849-97ed-c6fb8e4bcaf9,Namespace:calico-system,Attempt:0,} returns sandbox id \"f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e\"" Sep 16 05:01:22.551370 systemd-networkd[1833]: cali27a45bee03b: Gained IPv6LL Sep 16 05:01:22.999349 systemd-networkd[1833]: cali12ced012fb5: Gained IPv6LL Sep 16 05:01:23.186315 containerd[1918]: time="2025-09-16T05:01:23.186193404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xdnnv,Uid:8b8410d9-6f65-4188-9142-9e6482b4999b,Namespace:kube-system,Attempt:0,}" Sep 16 05:01:23.276370 systemd-networkd[1833]: cali4b29a66cac6: Link UP Sep 16 05:01:23.277093 systemd-networkd[1833]: cali4b29a66cac6: Gained carrier Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.206 [INFO][5864] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0 coredns-7c65d6cfc9- kube-system 8b8410d9-6f65-4188-9142-9e6482b4999b 813 0 2025-09-16 05:00:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.0.0-n-32926c0571 coredns-7c65d6cfc9-xdnnv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4b29a66cac6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xdnnv" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.206 [INFO][5864] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xdnnv" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.222 [INFO][5885] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" HandleID="k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Workload="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.222 [INFO][5885] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" HandleID="k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Workload="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000345490), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.0.0-n-32926c0571", "pod":"coredns-7c65d6cfc9-xdnnv", "timestamp":"2025-09-16 05:01:23.222851385 +0000 UTC"}, Hostname:"ci-4459.0.0-n-32926c0571", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.222 [INFO][5885] ipam/ipam_plugin.go 353: About to acquire host-wide 
IPAM lock. Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.223 [INFO][5885] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.223 [INFO][5885] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-32926c0571' Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.227 [INFO][5885] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.231 [INFO][5885] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.234 [INFO][5885] ipam/ipam.go 511: Trying affinity for 192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.235 [INFO][5885] ipam/ipam.go 158: Attempting to load block cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.237 [INFO][5885] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.23.192/26 host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.237 [INFO][5885] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.23.192/26 handle="k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.238 [INFO][5885] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6 Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.256 [INFO][5885] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.23.192/26 handle="k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.267 [INFO][5885] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.23.200/26] block=192.168.23.192/26 handle="k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.268 [INFO][5885] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.23.200/26] handle="k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" host="ci-4459.0.0-n-32926c0571" Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.268 [INFO][5885] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 05:01:23.297171 containerd[1918]: 2025-09-16 05:01:23.268 [INFO][5885] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.23.200/26] IPv6=[] ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" HandleID="k8s-pod-network.f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Workload="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" Sep 16 05:01:23.299585 containerd[1918]: 2025-09-16 05:01:23.272 [INFO][5864] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xdnnv" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"8b8410d9-6f65-4188-9142-9e6482b4999b", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"", Pod:"coredns-7c65d6cfc9-xdnnv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.23.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4b29a66cac6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:23.299585 containerd[1918]: 2025-09-16 05:01:23.272 [INFO][5864] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.23.200/32] ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xdnnv" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" Sep 16 05:01:23.299585 containerd[1918]: 2025-09-16 05:01:23.272 [INFO][5864] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4b29a66cac6 ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xdnnv" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" Sep 16 05:01:23.299585 containerd[1918]: 2025-09-16 05:01:23.277 [INFO][5864] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-xdnnv" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" Sep 16 05:01:23.299585 containerd[1918]: 2025-09-16 05:01:23.278 [INFO][5864] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xdnnv" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"8b8410d9-6f65-4188-9142-9e6482b4999b", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 0, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-32926c0571", ContainerID:"f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6", Pod:"coredns-7c65d6cfc9-xdnnv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.23.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4b29a66cac6", MAC:"86:68:1c:32:e2:e0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:01:23.299585 containerd[1918]: 2025-09-16 05:01:23.293 [INFO][5864] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xdnnv" WorkloadEndpoint="ci--4459.0.0--n--32926c0571-k8s-coredns--7c65d6cfc9--xdnnv-eth0" Sep 16 05:01:23.310248 containerd[1918]: time="2025-09-16T05:01:23.310221388Z" level=info msg="connecting to shim f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6" address="unix:///run/containerd/s/4c1b99a03b700ffaf47088a52831b36933fa3e5eab6bad4dd2f24d761d871e02" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:01:23.336317 systemd[1]: Started cri-containerd-f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6.scope - libcontainer container f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6. 
Sep 16 05:01:23.364863 containerd[1918]: time="2025-09-16T05:01:23.364842621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xdnnv,Uid:8b8410d9-6f65-4188-9142-9e6482b4999b,Namespace:kube-system,Attempt:0,} returns sandbox id \"f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6\"" Sep 16 05:01:23.366068 containerd[1918]: time="2025-09-16T05:01:23.366055480Z" level=info msg="CreateContainer within sandbox \"f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 05:01:23.369298 containerd[1918]: time="2025-09-16T05:01:23.369257426Z" level=info msg="Container be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:23.371631 containerd[1918]: time="2025-09-16T05:01:23.371618004Z" level=info msg="CreateContainer within sandbox \"f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6\"" Sep 16 05:01:23.371863 containerd[1918]: time="2025-09-16T05:01:23.371850263Z" level=info msg="StartContainer for \"be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6\"" Sep 16 05:01:23.372290 containerd[1918]: time="2025-09-16T05:01:23.372278052Z" level=info msg="connecting to shim be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6" address="unix:///run/containerd/s/4c1b99a03b700ffaf47088a52831b36933fa3e5eab6bad4dd2f24d761d871e02" protocol=ttrpc version=3 Sep 16 05:01:23.390337 systemd[1]: Started cri-containerd-be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6.scope - libcontainer container be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6. 
Sep 16 05:01:23.403883 containerd[1918]: time="2025-09-16T05:01:23.403860970Z" level=info msg="StartContainer for \"be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6\" returns successfully" Sep 16 05:01:23.511207 systemd-networkd[1833]: calidb9c06c3034: Gained IPv6LL Sep 16 05:01:24.359257 kubelet[3290]: I0916 05:01:24.359125 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-xdnnv" podStartSLOduration=44.359090185 podStartE2EDuration="44.359090185s" podCreationTimestamp="2025-09-16 05:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:01:24.357856118 +0000 UTC m=+50.216385401" watchObservedRunningTime="2025-09-16 05:01:24.359090185 +0000 UTC m=+50.217619453" Sep 16 05:01:25.304477 systemd-networkd[1833]: cali4b29a66cac6: Gained IPv6LL Sep 16 05:01:28.552312 containerd[1918]: time="2025-09-16T05:01:28.552260281Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:28.552553 containerd[1918]: time="2025-09-16T05:01:28.552470487Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 16 05:01:28.552867 containerd[1918]: time="2025-09-16T05:01:28.552823810Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:28.553687 containerd[1918]: time="2025-09-16T05:01:28.553645721Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:28.554330 containerd[1918]: time="2025-09-16T05:01:28.554288638Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 7.662681578s" Sep 16 05:01:28.554330 containerd[1918]: time="2025-09-16T05:01:28.554305284Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 16 05:01:28.554759 containerd[1918]: time="2025-09-16T05:01:28.554721392Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 05:01:28.555246 containerd[1918]: time="2025-09-16T05:01:28.555201742Z" level=info msg="CreateContainer within sandbox \"b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 05:01:28.557856 containerd[1918]: time="2025-09-16T05:01:28.557813650Z" level=info msg="Container f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:28.560539 containerd[1918]: time="2025-09-16T05:01:28.560493782Z" level=info msg="CreateContainer within sandbox \"b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id 
\"f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2\"" Sep 16 05:01:28.560744 containerd[1918]: time="2025-09-16T05:01:28.560703806Z" level=info msg="StartContainer for \"f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2\"" Sep 16 05:01:28.561433 containerd[1918]: time="2025-09-16T05:01:28.561416658Z" level=info msg="connecting to shim f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2" address="unix:///run/containerd/s/8dee291abe8507205abbee8d98f03790d380d6b841902bfc0df6ed73d51d9dd6" protocol=ttrpc version=3 Sep 16 05:01:28.576328 systemd[1]: Started cri-containerd-f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2.scope - libcontainer container f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2. Sep 16 05:01:28.603516 containerd[1918]: time="2025-09-16T05:01:28.603493179Z" level=info msg="StartContainer for \"f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2\" returns successfully" Sep 16 05:01:29.060275 containerd[1918]: time="2025-09-16T05:01:29.060229354Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:29.060390 containerd[1918]: time="2025-09-16T05:01:29.060375448Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 16 05:01:29.061543 containerd[1918]: time="2025-09-16T05:01:29.061528590Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 506.794872ms" Sep 16 05:01:29.061575 containerd[1918]: time="2025-09-16T05:01:29.061544793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 16 05:01:29.062157 containerd[1918]: time="2025-09-16T05:01:29.062146699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 16 05:01:29.062665 containerd[1918]: time="2025-09-16T05:01:29.062645920Z" level=info msg="CreateContainer within sandbox \"bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 05:01:29.066290 containerd[1918]: time="2025-09-16T05:01:29.066272141Z" level=info msg="Container 9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:29.069783 containerd[1918]: time="2025-09-16T05:01:29.069733226Z" level=info msg="CreateContainer within sandbox \"bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5\"" Sep 16 05:01:29.070097 containerd[1918]: time="2025-09-16T05:01:29.070026658Z" level=info msg="StartContainer for \"9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5\"" Sep 16 05:01:29.070774 containerd[1918]: time="2025-09-16T05:01:29.070743506Z" level=info msg="connecting to shim 9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5" address="unix:///run/containerd/s/cdee89c03d788af5a42af953e2e2d81ac775dfcb89b1d883e227a54dba35303b" 
protocol=ttrpc version=3 Sep 16 05:01:29.087210 systemd[1]: Started cri-containerd-9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5.scope - libcontainer container 9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5. Sep 16 05:01:29.117431 containerd[1918]: time="2025-09-16T05:01:29.117410089Z" level=info msg="StartContainer for \"9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5\" returns successfully" Sep 16 05:01:29.363909 kubelet[3290]: I0916 05:01:29.363811 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-75499dcbcd-cz8mk" podStartSLOduration=32.120827588 podStartE2EDuration="41.36379663s" podCreationTimestamp="2025-09-16 05:00:48 +0000 UTC" firstStartedPulling="2025-09-16 05:01:19.311702814 +0000 UTC m=+45.170232060" lastFinishedPulling="2025-09-16 05:01:28.554671846 +0000 UTC m=+54.413201102" observedRunningTime="2025-09-16 05:01:29.363605165 +0000 UTC m=+55.222134413" watchObservedRunningTime="2025-09-16 05:01:29.36379663 +0000 UTC m=+55.222325874" Sep 16 05:01:29.368197 kubelet[3290]: I0916 05:01:29.368153 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-75499dcbcd-phsx9" podStartSLOduration=32.741553826 podStartE2EDuration="41.368137889s" podCreationTimestamp="2025-09-16 05:00:48 +0000 UTC" firstStartedPulling="2025-09-16 05:01:20.435421744 +0000 UTC m=+46.293950995" lastFinishedPulling="2025-09-16 05:01:29.062005812 +0000 UTC m=+54.920535058" observedRunningTime="2025-09-16 05:01:29.367991934 +0000 UTC m=+55.226521186" watchObservedRunningTime="2025-09-16 05:01:29.368137889 +0000 UTC m=+55.226667134" Sep 16 05:01:30.361688 kubelet[3290]: I0916 05:01:30.361609 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:01:30.361688 kubelet[3290]: I0916 05:01:30.361645 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:01:33.887917 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount675261322.mount: Deactivated successfully. 
Sep 16 05:01:33.891907 containerd[1918]: time="2025-09-16T05:01:33.891861510Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:33.892070 containerd[1918]: time="2025-09-16T05:01:33.892041753Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 16 05:01:33.892535 containerd[1918]: time="2025-09-16T05:01:33.892495306Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:33.893497 containerd[1918]: time="2025-09-16T05:01:33.893457360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:33.893913 containerd[1918]: time="2025-09-16T05:01:33.893873516Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 4.831713446s" Sep 16 05:01:33.893913 containerd[1918]: time="2025-09-16T05:01:33.893888519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 16 05:01:33.894355 containerd[1918]: time="2025-09-16T05:01:33.894315946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 16 05:01:33.894816 containerd[1918]: time="2025-09-16T05:01:33.894798091Z" level=info msg="CreateContainer within sandbox \"69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 16 05:01:33.897404 containerd[1918]: time="2025-09-16T05:01:33.897360348Z" level=info msg="Container 8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:33.900796 containerd[1918]: time="2025-09-16T05:01:33.900752698Z" level=info msg="CreateContainer within sandbox \"69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694\"" Sep 16 05:01:33.901039 containerd[1918]: time="2025-09-16T05:01:33.901020841Z" level=info msg="StartContainer for \"8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694\"" Sep 16 05:01:33.901565 containerd[1918]: time="2025-09-16T05:01:33.901553333Z" level=info msg="connecting to shim 8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694" address="unix:///run/containerd/s/e8482c9d9d249da8b6d8a51f39b4da7d52a1b5586a4f5fb8a06af7f6dec12a3e" protocol=ttrpc version=3 Sep 16 05:01:33.918322 systemd[1]: Started cri-containerd-8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694.scope - libcontainer container 8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694. 
Sep 16 05:01:33.946925 containerd[1918]: time="2025-09-16T05:01:33.946902627Z" level=info msg="StartContainer for \"8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694\" returns successfully" Sep 16 05:01:34.394793 kubelet[3290]: I0916 05:01:34.394705 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7d5456bb9-rkbdg" podStartSLOduration=1.27333394 podStartE2EDuration="19.394678012s" podCreationTimestamp="2025-09-16 05:01:15 +0000 UTC" firstStartedPulling="2025-09-16 05:01:15.772915067 +0000 UTC m=+41.631444314" lastFinishedPulling="2025-09-16 05:01:33.89425913 +0000 UTC m=+59.752788386" observedRunningTime="2025-09-16 05:01:34.39411172 +0000 UTC m=+60.252640995" watchObservedRunningTime="2025-09-16 05:01:34.394678012 +0000 UTC m=+60.253207274" Sep 16 05:01:36.228557 containerd[1918]: time="2025-09-16T05:01:36.228535116Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"d308437d549159933eb92e36f79e4869792c5df2d97d7f4f80ee369cc6226902\" pid:6203 exited_at:{seconds:1757998896 nanos:228342622}" Sep 16 05:01:37.122432 containerd[1918]: time="2025-09-16T05:01:37.122397974Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:37.122627 containerd[1918]: time="2025-09-16T05:01:37.122612043Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 16 05:01:37.122994 containerd[1918]: time="2025-09-16T05:01:37.122981633Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:37.123796 containerd[1918]: time="2025-09-16T05:01:37.123784999Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:37.124452 containerd[1918]: time="2025-09-16T05:01:37.124439915Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.230110918s" Sep 16 05:01:37.124481 containerd[1918]: time="2025-09-16T05:01:37.124456264Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 16 05:01:37.125014 containerd[1918]: time="2025-09-16T05:01:37.125004658Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 16 05:01:37.128289 containerd[1918]: time="2025-09-16T05:01:37.128273283Z" level=info msg="CreateContainer within sandbox \"f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 16 05:01:37.131153 containerd[1918]: time="2025-09-16T05:01:37.131112828Z" level=info msg="Container 0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:37.134233 containerd[1918]: 
time="2025-09-16T05:01:37.134191768Z" level=info msg="CreateContainer within sandbox \"f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\"" Sep 16 05:01:37.134429 containerd[1918]: time="2025-09-16T05:01:37.134414785Z" level=info msg="StartContainer for \"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\"" Sep 16 05:01:37.134943 containerd[1918]: time="2025-09-16T05:01:37.134928981Z" level=info msg="connecting to shim 0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3" address="unix:///run/containerd/s/6fcf1e0c1b2af769fc65c577c1543f708b44ce962620d562b8d621f4c41b914d" protocol=ttrpc version=3 Sep 16 05:01:37.151206 systemd[1]: Started cri-containerd-0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3.scope - libcontainer container 0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3. Sep 16 05:01:37.178936 containerd[1918]: time="2025-09-16T05:01:37.178914641Z" level=info msg="StartContainer for \"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" returns successfully" Sep 16 05:01:37.412011 kubelet[3290]: I0916 05:01:37.411721 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-89f5b8df-cq9w2" podStartSLOduration=31.625481888 podStartE2EDuration="47.411671376s" podCreationTimestamp="2025-09-16 05:00:50 +0000 UTC" firstStartedPulling="2025-09-16 05:01:21.338624886 +0000 UTC m=+47.197154133" lastFinishedPulling="2025-09-16 05:01:37.124814375 +0000 UTC m=+62.983343621" observedRunningTime="2025-09-16 05:01:37.409583291 +0000 UTC m=+63.268112604" watchObservedRunningTime="2025-09-16 05:01:37.411671376 +0000 UTC m=+63.270200677" Sep 16 05:01:37.490713 containerd[1918]: time="2025-09-16T05:01:37.490687140Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"371b6a044fd6cb2574cd1add3b7ed4eb19801dc9a4e7ef546642566be9118128\" pid:6300 exited_at:{seconds:1757998897 nanos:490339176}" Sep 16 05:01:38.300266 containerd[1918]: time="2025-09-16T05:01:38.300216495Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"b8739012294b32a266b00cbf75ce45b2ebc1c4d89562e67999995d4eae5dc0f0\" pid:6322 exited_at:{seconds:1757998898 nanos:300120456}" Sep 16 05:01:40.472418 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3924748262.mount: Deactivated successfully. 
Sep 16 05:01:40.677523 containerd[1918]: time="2025-09-16T05:01:40.677499502Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:40.677781 containerd[1918]: time="2025-09-16T05:01:40.677768910Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 16 05:01:40.678129 containerd[1918]: time="2025-09-16T05:01:40.678119248Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:40.679032 containerd[1918]: time="2025-09-16T05:01:40.679020547Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:40.679471 containerd[1918]: time="2025-09-16T05:01:40.679459586Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.554440402s" Sep 16 05:01:40.679499 containerd[1918]: time="2025-09-16T05:01:40.679475378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 16 05:01:40.679966 containerd[1918]: time="2025-09-16T05:01:40.679933707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 16 05:01:40.680540 containerd[1918]: time="2025-09-16T05:01:40.680523228Z" level=info msg="CreateContainer within sandbox \"cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 16 05:01:40.683278 containerd[1918]: time="2025-09-16T05:01:40.683241031Z" level=info msg="Container c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:40.685831 containerd[1918]: time="2025-09-16T05:01:40.685817012Z" level=info msg="CreateContainer within sandbox \"cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\"" Sep 16 05:01:40.686060 containerd[1918]: time="2025-09-16T05:01:40.686047941Z" level=info msg="StartContainer for \"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\"" Sep 16 05:01:40.686564 containerd[1918]: time="2025-09-16T05:01:40.686552986Z" level=info msg="connecting to shim c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd" address="unix:///run/containerd/s/6354cb98fdeb4b61cdb6467e11933d23d4e79512bcb6ba7e4cd4e3b0da300cdf" protocol=ttrpc version=3 Sep 16 05:01:40.705282 systemd[1]: Started cri-containerd-c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd.scope - libcontainer container c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd. 
Sep 16 05:01:40.734915 containerd[1918]: time="2025-09-16T05:01:40.734861196Z" level=info msg="StartContainer for \"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" returns successfully" Sep 16 05:01:41.408029 kubelet[3290]: I0916 05:01:41.407985 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-7wtzg" podStartSLOduration=32.150849966 podStartE2EDuration="51.407970395s" podCreationTimestamp="2025-09-16 05:00:50 +0000 UTC" firstStartedPulling="2025-09-16 05:01:21.422768833 +0000 UTC m=+47.281298081" lastFinishedPulling="2025-09-16 05:01:40.679889262 +0000 UTC m=+66.538418510" observedRunningTime="2025-09-16 05:01:41.407594812 +0000 UTC m=+67.266124059" watchObservedRunningTime="2025-09-16 05:01:41.407970395 +0000 UTC m=+67.266499641" Sep 16 05:01:41.451852 containerd[1918]: time="2025-09-16T05:01:41.451824917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"43b7e85c8111fe74d86305193a1fe75c42e6409fa3e338ccf7b32f9927e5c112\" pid:6402 exit_status:1 exited_at:{seconds:1757998901 nanos:451254032}" Sep 16 05:01:42.463039 containerd[1918]: time="2025-09-16T05:01:42.462985715Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"a79605ca27471d9288ca0fb1b60ef6f469ddcf69bc92d44734554fd071696d3e\" pid:6441 exit_status:1 exited_at:{seconds:1757998902 nanos:462825269}" Sep 16 05:01:43.773778 containerd[1918]: time="2025-09-16T05:01:43.773752108Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:43.774004 containerd[1918]: time="2025-09-16T05:01:43.773973950Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 16 05:01:43.774371 containerd[1918]: time="2025-09-16T05:01:43.774361056Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:43.775250 containerd[1918]: time="2025-09-16T05:01:43.775239201Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:43.775634 containerd[1918]: time="2025-09-16T05:01:43.775620346Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 3.095673055s" Sep 16 05:01:43.775677 containerd[1918]: time="2025-09-16T05:01:43.775637868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 16 05:01:43.776562 containerd[1918]: time="2025-09-16T05:01:43.776523159Z" level=info msg="CreateContainer within sandbox \"f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 16 05:01:43.780402 containerd[1918]: time="2025-09-16T05:01:43.780359636Z" level=info msg="Container 
b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:43.783585 containerd[1918]: time="2025-09-16T05:01:43.783571615Z" level=info msg="CreateContainer within sandbox \"f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6\"" Sep 16 05:01:43.783756 containerd[1918]: time="2025-09-16T05:01:43.783744400Z" level=info msg="StartContainer for \"b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6\"" Sep 16 05:01:43.784524 containerd[1918]: time="2025-09-16T05:01:43.784510775Z" level=info msg="connecting to shim b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6" address="unix:///run/containerd/s/b7f37b36a01852f1da97487ac7824a42302034cfdb99fabba95b6e5b68a726ea" protocol=ttrpc version=3 Sep 16 05:01:43.806260 systemd[1]: Started cri-containerd-b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6.scope - libcontainer container b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6. Sep 16 05:01:43.825891 containerd[1918]: time="2025-09-16T05:01:43.825868539Z" level=info msg="StartContainer for \"b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6\" returns successfully" Sep 16 05:01:43.826427 containerd[1918]: time="2025-09-16T05:01:43.826417088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 16 05:01:46.861986 containerd[1918]: time="2025-09-16T05:01:46.861933566Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:46.862245 containerd[1918]: time="2025-09-16T05:01:46.862177697Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 16 05:01:46.862583 containerd[1918]: time="2025-09-16T05:01:46.862541464Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:46.863382 containerd[1918]: time="2025-09-16T05:01:46.863343012Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:01:46.863995 containerd[1918]: time="2025-09-16T05:01:46.863980847Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 3.037549313s" Sep 16 05:01:46.864046 containerd[1918]: time="2025-09-16T05:01:46.863996107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 16 05:01:46.864976 containerd[1918]: time="2025-09-16T05:01:46.864961968Z" level=info msg="CreateContainer within sandbox \"f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 16 
05:01:46.867883 containerd[1918]: time="2025-09-16T05:01:46.867841353Z" level=info msg="Container 60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:01:46.871841 containerd[1918]: time="2025-09-16T05:01:46.871800960Z" level=info msg="CreateContainer within sandbox \"f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4\"" Sep 16 05:01:46.872039 containerd[1918]: time="2025-09-16T05:01:46.872023812Z" level=info msg="StartContainer for \"60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4\"" Sep 16 05:01:46.872764 containerd[1918]: time="2025-09-16T05:01:46.872751281Z" level=info msg="connecting to shim 60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4" address="unix:///run/containerd/s/b7f37b36a01852f1da97487ac7824a42302034cfdb99fabba95b6e5b68a726ea" protocol=ttrpc version=3 Sep 16 05:01:46.892182 systemd[1]: Started cri-containerd-60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4.scope - libcontainer container 60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4. Sep 16 05:01:46.912152 containerd[1918]: time="2025-09-16T05:01:46.912130594Z" level=info msg="StartContainer for \"60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4\" returns successfully" Sep 16 05:01:47.243583 kubelet[3290]: I0916 05:01:47.243560 3290 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 16 05:01:47.243583 kubelet[3290]: I0916 05:01:47.243587 3290 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 16 05:01:47.427607 kubelet[3290]: I0916 05:01:47.427568 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9dgtk" podStartSLOduration=32.900836369 podStartE2EDuration="57.427554812s" podCreationTimestamp="2025-09-16 05:00:50 +0000 UTC" firstStartedPulling="2025-09-16 05:01:22.337581932 +0000 UTC m=+48.196111180" lastFinishedPulling="2025-09-16 05:01:46.864300375 +0000 UTC m=+72.722829623" observedRunningTime="2025-09-16 05:01:47.42741003 +0000 UTC m=+73.285939278" watchObservedRunningTime="2025-09-16 05:01:47.427554812 +0000 UTC m=+73.286084057" Sep 16 05:01:50.734651 kubelet[3290]: I0916 05:01:50.734593 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:01:56.771241 kubelet[3290]: I0916 05:01:56.771216 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:02:06.176326 containerd[1918]: time="2025-09-16T05:02:06.176257090Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"bf4640d0e62ec5ef3c02e9f2dbd134f8e210a6fe12a5de1a18840c1c2d0de357\" pid:6566 exited_at:{seconds:1757998926 nanos:176007289}" Sep 16 05:02:08.298802 containerd[1918]: time="2025-09-16T05:02:08.298772600Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"b632f24ea05303ba66c177928d87c2d20b85e17d494e3be43d6de42e7d5d02cc\" pid:6613 exited_at:{seconds:1757998928 nanos:298603247}" Sep 16 05:02:08.315114 containerd[1918]: 
time="2025-09-16T05:02:08.315086698Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"9dc5f37f7073ae7ecac8bcd2fe4d3528a8248bef29a11cef4b9e02172b24bf7d\" pid:6614 exited_at:{seconds:1757998928 nanos:314821962}" Sep 16 05:02:35.573915 containerd[1918]: time="2025-09-16T05:02:35.573883619Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"55f9959c4f9c3a465ab439b353d11716b5ddd55a9e41e541fbf45a2ec281c52a\" pid:6665 exited_at:{seconds:1757998955 nanos:573543140}" Sep 16 05:02:36.215835 containerd[1918]: time="2025-09-16T05:02:36.215805526Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"b69e63adab680e18cfce0b2f354c1117ab8dd2cdf0fefb5fd57eaa80fbf0ffe5\" pid:6706 exited_at:{seconds:1757998956 nanos:215585909}" Sep 16 05:02:38.306604 containerd[1918]: time="2025-09-16T05:02:38.306580688Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"f2148e887725fb31e5e9a02ac5b1f7bc560f71aac3e180d73c7c712211ea424c\" pid:6753 exited_at:{seconds:1757998958 nanos:306481284}" Sep 16 05:02:38.324056 containerd[1918]: time="2025-09-16T05:02:38.324025290Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"fc8a67ec0b6da6296bea1b32175b787f9e80708dc80cfc9ed2aa13b227caa429\" pid:6752 exited_at:{seconds:1757998958 nanos:323862308}" Sep 16 05:02:39.478396 containerd[1918]: time="2025-09-16T05:02:39.478370184Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"ce710a13d005ad1275d315dbceeae18422530f3bb37a03a335c74e713981be4b\" pid:6797 exited_at:{seconds:1757998959 nanos:478277854}" Sep 16 05:03:06.189294 containerd[1918]: time="2025-09-16T05:03:06.189240292Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"9804679480d185a89202be491209fc31e52454204ef1a6e278b90d54cfd8780c\" pid:6847 exited_at:{seconds:1757998986 nanos:189052907}" Sep 16 05:03:08.299364 containerd[1918]: time="2025-09-16T05:03:08.299338348Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"250c30fd91db483ee81fdd9dcd3d7fdcd223cfb81f4f01ba5a131bdd21fa3388\" pid:6896 exited_at:{seconds:1757998988 nanos:299225078}" Sep 16 05:03:08.318605 containerd[1918]: time="2025-09-16T05:03:08.318551929Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"24e0d17eb65484bd870fc6303324e045dc1e825200c4440b9394961ca4d93e58\" pid:6895 exited_at:{seconds:1757998988 nanos:318350266}" Sep 16 05:03:35.560079 containerd[1918]: time="2025-09-16T05:03:35.560054774Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"28893ffb4e89f5d9daf0b3faf0b32dde9a24be4f2e67ce75ec011225ddac9b45\" pid:6940 exited_at:{seconds:1757999015 nanos:559839977}" Sep 16 05:03:36.215786 containerd[1918]: time="2025-09-16T05:03:36.215728482Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"18790bc9370f4206fa58a2e358a6b99ed52b17e1fe87551690e54bb8a1a788c7\" pid:6973 exited_at:{seconds:1757999016 nanos:215539587}" Sep 16 05:03:38.302090 containerd[1918]: time="2025-09-16T05:03:38.302064977Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"2ef27a48c44890fa4ea1ae7ac6d3d6638ba15a68ac8eafa7c2574d33d78d4ed8\" pid:7021 exited_at:{seconds:1757999018 nanos:301949961}" Sep 16 05:03:38.323128 containerd[1918]: time="2025-09-16T05:03:38.323095698Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"e3d73a2af9776cbeaa16433c3faa15dcaf85de365962bcbeb5987e36a377e418\" pid:7022 exited_at:{seconds:1757999018 nanos:322901773}" Sep 16 05:03:39.498138 containerd[1918]: time="2025-09-16T05:03:39.498096498Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"0009578654f7c1863c2ffa9d5207fbdc081200501be256dba985ce7eec8afdea\" pid:7063 exited_at:{seconds:1757999019 nanos:497891914}" Sep 16 05:04:06.183534 containerd[1918]: time="2025-09-16T05:04:06.183502267Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"fdf48ccaec07d6121ef09a8d6194d4eff064533a1f07e12c20faa0e044657532\" pid:7095 exited_at:{seconds:1757999046 nanos:183175816}" Sep 16 05:04:08.351224 containerd[1918]: time="2025-09-16T05:04:08.351191793Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"565baa3e40495d511c12ac06b747b0ed63e2ba45825dc4d0078064d68167f163\" pid:7141 exited_at:{seconds:1757999048 nanos:351065477}" Sep 16 05:04:08.372422 containerd[1918]: time="2025-09-16T05:04:08.372388084Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"a31ca41bf09e848b52c6f1c762af9407dcc2ba9c33f2681a08b124ee3f1ca29c\" pid:7140 exited_at:{seconds:1757999048 nanos:372200697}" Sep 16 05:04:35.570679 containerd[1918]: time="2025-09-16T05:04:35.570616064Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"a49bf2bf7a424caefb5bf5df134cfdf4c98a042904246df0c56d0ca55591ed4e\" pid:7212 exited_at:{seconds:1757999075 nanos:570399684}" Sep 16 05:04:36.220253 containerd[1918]: time="2025-09-16T05:04:36.220228959Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"c13a22a35f183aab7a7c795f01cc789dcdadc10a6a1a64959a8d6764933b426a\" pid:7243 exited_at:{seconds:1757999076 nanos:220027966}" Sep 16 05:04:38.307286 containerd[1918]: time="2025-09-16T05:04:38.307258540Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"3610556e4591d070e590e7abde2ae78f5e811a85ccd04c0e3ecbaae5d5563ac3\" pid:7291 exited_at:{seconds:1757999078 nanos:307032677}" Sep 16 05:04:38.326222 containerd[1918]: time="2025-09-16T05:04:38.326177249Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" 
id:\"56037b7457ffc0488cf69c2cf333c8860ab1e4b8fb810281cf41771950c3ebf0\" pid:7290 exited_at:{seconds:1757999078 nanos:325924312}" Sep 16 05:04:39.526502 containerd[1918]: time="2025-09-16T05:04:39.526457045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"2d576d10e9d65196ecb2912dcf23cf4a71c8e543e263dc007c63f3ecd0088254\" pid:7332 exited_at:{seconds:1757999079 nanos:526251176}" Sep 16 05:05:06.179005 containerd[1918]: time="2025-09-16T05:05:06.178945443Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"79ee7692cf3ed783170240c30ae6e042810eeefaa3044c93c8eb2e5bb08e815d\" pid:7358 exited_at:{seconds:1757999106 nanos:178742021}" Sep 16 05:05:08.300949 containerd[1918]: time="2025-09-16T05:05:08.300920949Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"4b3c34de781a1d06bda97d6f5ef6211540ef5b8b4e1c2b21ef3e21064d1e09ab\" pid:7407 exited_at:{seconds:1757999108 nanos:300695598}" Sep 16 05:05:08.319151 containerd[1918]: time="2025-09-16T05:05:08.319124867Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"eaec7124e629fd48bc4d0dfd56f6596760e9a71ab5450bd04c6872f99477c33c\" pid:7406 exited_at:{seconds:1757999108 nanos:318953673}" Sep 16 05:05:30.439862 containerd[1918]: time="2025-09-16T05:05:30.439644768Z" level=warning msg="container event discarded" container=86c51a70b4c44123bfd560b4cca792b727300f287975258edfc7d83da7053c8c type=CONTAINER_CREATED_EVENT Sep 16 05:05:30.439862 containerd[1918]: time="2025-09-16T05:05:30.439843553Z" level=warning msg="container event discarded" container=86c51a70b4c44123bfd560b4cca792b727300f287975258edfc7d83da7053c8c type=CONTAINER_STARTED_EVENT Sep 16 05:05:30.457331 containerd[1918]: time="2025-09-16T05:05:30.457220054Z" level=warning msg="container event discarded" container=a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d type=CONTAINER_CREATED_EVENT Sep 16 05:05:30.457331 containerd[1918]: time="2025-09-16T05:05:30.457308914Z" level=warning msg="container event discarded" container=6b3658ef6c1cde3ba407227c9bf6ed1380eb4b0867379dc60a4921562ea0a2e4 type=CONTAINER_CREATED_EVENT Sep 16 05:05:30.457662 containerd[1918]: time="2025-09-16T05:05:30.457338292Z" level=warning msg="container event discarded" container=6b3658ef6c1cde3ba407227c9bf6ed1380eb4b0867379dc60a4921562ea0a2e4 type=CONTAINER_STARTED_EVENT Sep 16 05:05:30.469806 containerd[1918]: time="2025-09-16T05:05:30.469733807Z" level=warning msg="container event discarded" container=40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723 type=CONTAINER_CREATED_EVENT Sep 16 05:05:30.492205 containerd[1918]: time="2025-09-16T05:05:30.492123525Z" level=warning msg="container event discarded" container=52b6dc2f95ea295a05a300b2e452e2af5cb4d6b3cad8ea09c27928ea30d0664a type=CONTAINER_CREATED_EVENT Sep 16 05:05:30.492205 containerd[1918]: time="2025-09-16T05:05:30.492190762Z" level=warning msg="container event discarded" container=52b6dc2f95ea295a05a300b2e452e2af5cb4d6b3cad8ea09c27928ea30d0664a type=CONTAINER_STARTED_EVENT Sep 16 05:05:30.492475 containerd[1918]: time="2025-09-16T05:05:30.492218113Z" level=warning msg="container event discarded" container=f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89 
type=CONTAINER_CREATED_EVENT Sep 16 05:05:30.503713 containerd[1918]: time="2025-09-16T05:05:30.503596394Z" level=warning msg="container event discarded" container=a4ef41c99f0c23b34e4b55874de9b6b68907c43ba1a78015214b36b9890f7b9d type=CONTAINER_STARTED_EVENT Sep 16 05:05:30.503713 containerd[1918]: time="2025-09-16T05:05:30.503651508Z" level=warning msg="container event discarded" container=40fec3a257bbe01a64f87c227cf385ea6d2f50d35344c99fcf238ef095084723 type=CONTAINER_STARTED_EVENT Sep 16 05:05:30.545120 containerd[1918]: time="2025-09-16T05:05:30.544992455Z" level=warning msg="container event discarded" container=f96fc3414739472b1d19800eb161464d3f580bcb4bd643332482d7df6a239e89 type=CONTAINER_STARTED_EVENT Sep 16 05:05:35.562273 containerd[1918]: time="2025-09-16T05:05:35.562244581Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"297a2faef86ed19fe3f935b7f5f0ed97c0716b8ff97eef8a1e5b61aae7784ab2\" pid:7457 exited_at:{seconds:1757999135 nanos:562028813}" Sep 16 05:05:36.183136 containerd[1918]: time="2025-09-16T05:05:36.183076165Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"96ed4d3a2ffb9966007dc48d0f819b96247dca8415d086379ea9151b4270019d\" pid:7488 exited_at:{seconds:1757999136 nanos:182875269}" Sep 16 05:05:38.351734 containerd[1918]: time="2025-09-16T05:05:38.351703489Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"c3e6a5b198c21a1a8820b24ba443646d19bc95dfa501e4a81dbfe29140929335\" pid:7535 exited_at:{seconds:1757999138 nanos:351587222}" Sep 16 05:05:38.371563 containerd[1918]: time="2025-09-16T05:05:38.371513951Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"705e76075ce064103fcf11c61a759a95782404c00904a6259be2cb9b61bb2a7e\" pid:7536 exited_at:{seconds:1757999138 nanos:371343930}" Sep 16 05:05:39.525893 containerd[1918]: time="2025-09-16T05:05:39.525862947Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"42d5061723a2e4fe45ababecb54e4f652d96f924541e93fcf431ece02b119e9c\" pid:7577 exited_at:{seconds:1757999139 nanos:525685414}" Sep 16 05:05:40.346852 containerd[1918]: time="2025-09-16T05:05:40.346656827Z" level=warning msg="container event discarded" container=7d04b7e6b9ef275999821dbebda144b7788e3f759b6a4b13f159254c27f8487e type=CONTAINER_CREATED_EVENT Sep 16 05:05:40.346852 containerd[1918]: time="2025-09-16T05:05:40.346801973Z" level=warning msg="container event discarded" container=7d04b7e6b9ef275999821dbebda144b7788e3f759b6a4b13f159254c27f8487e type=CONTAINER_STARTED_EVENT Sep 16 05:05:40.346852 containerd[1918]: time="2025-09-16T05:05:40.346834172Z" level=warning msg="container event discarded" container=8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13 type=CONTAINER_CREATED_EVENT Sep 16 05:05:40.411209 containerd[1918]: time="2025-09-16T05:05:40.411089102Z" level=warning msg="container event discarded" container=8e02ae9483f6a2c72307e5152c37f3857e6e5621d9aaaef68716ac4ee3893b13 type=CONTAINER_STARTED_EVENT Sep 16 05:05:40.452725 containerd[1918]: time="2025-09-16T05:05:40.452617184Z" level=warning msg="container event discarded" container=2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d 
type=CONTAINER_CREATED_EVENT Sep 16 05:05:40.452725 containerd[1918]: time="2025-09-16T05:05:40.452677149Z" level=warning msg="container event discarded" container=2d84a938525c0af1a2ba86f64b977e6b07e842c3d1e212e44b03754c1fa7cd8d type=CONTAINER_STARTED_EVENT Sep 16 05:05:42.341380 containerd[1918]: time="2025-09-16T05:05:42.341279221Z" level=warning msg="container event discarded" container=a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47 type=CONTAINER_CREATED_EVENT Sep 16 05:05:42.374535 containerd[1918]: time="2025-09-16T05:05:42.374484185Z" level=warning msg="container event discarded" container=a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47 type=CONTAINER_STARTED_EVENT Sep 16 05:05:44.403846 containerd[1918]: time="2025-09-16T05:05:44.403677138Z" level=warning msg="container event discarded" container=a5c42b9755a5eca76b137c957cb0a959cfa08d3b6cfd86131c41dbfab2dfcc47 type=CONTAINER_STOPPED_EVENT Sep 16 05:05:45.244098 containerd[1918]: time="2025-09-16T05:05:45.243965914Z" level=warning msg="container event discarded" container=90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db type=CONTAINER_CREATED_EVENT Sep 16 05:05:45.281404 containerd[1918]: time="2025-09-16T05:05:45.281268233Z" level=warning msg="container event discarded" container=90aaf0ca370057ac548b3cf9fc06ae4103faf02546a38af3c703efaae209a1db type=CONTAINER_STARTED_EVENT Sep 16 05:05:50.617186 containerd[1918]: time="2025-09-16T05:05:50.617078501Z" level=warning msg="container event discarded" container=493efbb2f4d7b80e0de96d4d83a42456f24e9383ea677c2b9f8b2528e52a8401 type=CONTAINER_CREATED_EVENT Sep 16 05:05:50.617186 containerd[1918]: time="2025-09-16T05:05:50.617162577Z" level=warning msg="container event discarded" container=493efbb2f4d7b80e0de96d4d83a42456f24e9383ea677c2b9f8b2528e52a8401 type=CONTAINER_STARTED_EVENT Sep 16 05:05:50.964458 containerd[1918]: time="2025-09-16T05:05:50.964264523Z" level=warning msg="container event discarded" container=eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9 type=CONTAINER_CREATED_EVENT Sep 16 05:05:50.964458 containerd[1918]: time="2025-09-16T05:05:50.964351458Z" level=warning msg="container event discarded" container=eade86b7dcdd496a4f1ec4c4ea8bc7c70df1f835f7998909c85c682656ac5dd9 type=CONTAINER_STARTED_EVENT Sep 16 05:05:55.065382 containerd[1918]: time="2025-09-16T05:05:55.065229763Z" level=warning msg="container event discarded" container=0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55 type=CONTAINER_CREATED_EVENT Sep 16 05:05:55.122764 containerd[1918]: time="2025-09-16T05:05:55.122618595Z" level=warning msg="container event discarded" container=0d7666c1f886882f9b6c72323f0ca65cad85558292adab4fb59c5238cdc9bd55 type=CONTAINER_STARTED_EVENT Sep 16 05:05:59.387213 containerd[1918]: time="2025-09-16T05:05:59.387154999Z" level=warning msg="container event discarded" container=e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564 type=CONTAINER_CREATED_EVENT Sep 16 05:05:59.435308 containerd[1918]: time="2025-09-16T05:05:59.435254949Z" level=warning msg="container event discarded" container=e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564 type=CONTAINER_STARTED_EVENT Sep 16 05:05:59.698293 containerd[1918]: time="2025-09-16T05:05:59.698091703Z" level=warning msg="container event discarded" container=e5ba4da09de66d8cbaf2a9837affa8b36c8e1bb105d462a3ade69be3ff1ab564 type=CONTAINER_STOPPED_EVENT Sep 16 05:06:06.221692 containerd[1918]: time="2025-09-16T05:06:06.221592588Z" level=info 
msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"09b888e52bc87ac1b8e950fc9763126532810927106cca2a03288b3298a8cd34\" pid:7626 exited_at:{seconds:1757999166 nanos:221312563}" Sep 16 05:06:07.148966 containerd[1918]: time="2025-09-16T05:06:07.148763036Z" level=warning msg="container event discarded" container=462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e type=CONTAINER_CREATED_EVENT Sep 16 05:06:07.191982 containerd[1918]: time="2025-09-16T05:06:07.191870029Z" level=warning msg="container event discarded" container=462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e type=CONTAINER_STARTED_EVENT Sep 16 05:06:08.189109 containerd[1918]: time="2025-09-16T05:06:08.188971298Z" level=warning msg="container event discarded" container=462018dea393f78c8e20949bdbd879c1a120cece2c5742f5aaab66cf19e53d1e type=CONTAINER_STOPPED_EVENT Sep 16 05:06:08.304333 containerd[1918]: time="2025-09-16T05:06:08.304308550Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"8102b1b563fd232bd8fc33be55532975d8092f02975829db06da4e748467fe10\" pid:7674 exited_at:{seconds:1757999168 nanos:304189952}" Sep 16 05:06:08.320946 containerd[1918]: time="2025-09-16T05:06:08.320922682Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"fb5674f3bd2a94ba33c883f677670bb81bc261885f365c70e146930b469752eb\" pid:7675 exited_at:{seconds:1757999168 nanos:320709030}" Sep 16 05:06:14.209030 containerd[1918]: time="2025-09-16T05:06:14.208857194Z" level=warning msg="container event discarded" container=8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247 type=CONTAINER_CREATED_EVENT Sep 16 05:06:14.255291 containerd[1918]: time="2025-09-16T05:06:14.255203472Z" level=warning msg="container event discarded" container=8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247 type=CONTAINER_STARTED_EVENT Sep 16 05:06:15.783437 containerd[1918]: time="2025-09-16T05:06:15.783291111Z" level=warning msg="container event discarded" container=69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e type=CONTAINER_CREATED_EVENT Sep 16 05:06:15.783437 containerd[1918]: time="2025-09-16T05:06:15.783380586Z" level=warning msg="container event discarded" container=69b3c2da13afefa26cc1835cf5c8fd0e27fe150eaa5282e0a3c240f4efa80c6e type=CONTAINER_STARTED_EVENT Sep 16 05:06:19.322021 containerd[1918]: time="2025-09-16T05:06:19.321887712Z" level=warning msg="container event discarded" container=b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0 type=CONTAINER_CREATED_EVENT Sep 16 05:06:19.322021 containerd[1918]: time="2025-09-16T05:06:19.321989870Z" level=warning msg="container event discarded" container=b6a3b57dba88875bf5cb5a0d4fe0849fa8c56e048514bed3ce44f4446492a7f0 type=CONTAINER_STARTED_EVENT Sep 16 05:06:20.375431 containerd[1918]: time="2025-09-16T05:06:20.375374557Z" level=warning msg="container event discarded" container=5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e type=CONTAINER_CREATED_EVENT Sep 16 05:06:20.375431 containerd[1918]: time="2025-09-16T05:06:20.375392307Z" level=warning msg="container event discarded" container=5922033db479c6435ee87190cf210c867ec940e3d98ee0b4180f4a7a6fe7312e type=CONTAINER_STARTED_EVENT Sep 16 05:06:20.375431 containerd[1918]: time="2025-09-16T05:06:20.375398477Z" level=warning 
msg="container event discarded" container=5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c type=CONTAINER_CREATED_EVENT Sep 16 05:06:20.418875 containerd[1918]: time="2025-09-16T05:06:20.418776463Z" level=warning msg="container event discarded" container=5189bb1a81c49c1cc36ff79fe89551677214e10600379dd20c79e1c4e65bc21c type=CONTAINER_STARTED_EVENT Sep 16 05:06:20.445283 containerd[1918]: time="2025-09-16T05:06:20.445161056Z" level=warning msg="container event discarded" container=bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7 type=CONTAINER_CREATED_EVENT Sep 16 05:06:20.445283 containerd[1918]: time="2025-09-16T05:06:20.445234265Z" level=warning msg="container event discarded" container=bc65a19d9b47e41b9b7262d00ec377ba6ac249f86dc4bc8a4d0bd817611e2da7 type=CONTAINER_STARTED_EVENT Sep 16 05:06:20.907210 containerd[1918]: time="2025-09-16T05:06:20.907104013Z" level=warning msg="container event discarded" container=6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9 type=CONTAINER_CREATED_EVENT Sep 16 05:06:20.958931 containerd[1918]: time="2025-09-16T05:06:20.958814840Z" level=warning msg="container event discarded" container=6891f4cb56ea45a67fa8e9c14d5ab132c341787d481573135f7f57c5261a14a9 type=CONTAINER_STARTED_EVENT Sep 16 05:06:21.349134 containerd[1918]: time="2025-09-16T05:06:21.348906631Z" level=warning msg="container event discarded" container=f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5 type=CONTAINER_CREATED_EVENT Sep 16 05:06:21.349134 containerd[1918]: time="2025-09-16T05:06:21.348969615Z" level=warning msg="container event discarded" container=f00610727cf82f84e025bf3179011a75dacd0f6109ccf2dc85d32642409843b5 type=CONTAINER_STARTED_EVENT Sep 16 05:06:21.432624 containerd[1918]: time="2025-09-16T05:06:21.432498786Z" level=warning msg="container event discarded" container=cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7 type=CONTAINER_CREATED_EVENT Sep 16 05:06:21.432624 containerd[1918]: time="2025-09-16T05:06:21.432569925Z" level=warning msg="container event discarded" container=cfd8e4b5bf0180b6431e7bb2403a3dbb5d79d943675d4b368cc8fc86f97555c7 type=CONTAINER_STARTED_EVENT Sep 16 05:06:22.347232 containerd[1918]: time="2025-09-16T05:06:22.347121309Z" level=warning msg="container event discarded" container=f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e type=CONTAINER_CREATED_EVENT Sep 16 05:06:22.347232 containerd[1918]: time="2025-09-16T05:06:22.347200721Z" level=warning msg="container event discarded" container=f0eae6254e869013fba664f22ed0fb2f8160e6e222fbbe61ebd54600ca0faf5e type=CONTAINER_STARTED_EVENT Sep 16 05:06:23.374922 containerd[1918]: time="2025-09-16T05:06:23.374891720Z" level=warning msg="container event discarded" container=f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6 type=CONTAINER_CREATED_EVENT Sep 16 05:06:23.374922 containerd[1918]: time="2025-09-16T05:06:23.374917445Z" level=warning msg="container event discarded" container=f1836f79837419c6f94aa92815ae64f385fd87e5bad1d5b0b7e0964ded1f2dc6 type=CONTAINER_STARTED_EVENT Sep 16 05:06:23.374922 containerd[1918]: time="2025-09-16T05:06:23.374923585Z" level=warning msg="container event discarded" container=be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6 type=CONTAINER_CREATED_EVENT Sep 16 05:06:23.414558 containerd[1918]: time="2025-09-16T05:06:23.414424908Z" level=warning msg="container event discarded" container=be234da1f48cdc2bb69d7fbe97eb5a177e402ea29b0bbf3f79cc4e2fa428e1c6 
type=CONTAINER_STARTED_EVENT Sep 16 05:06:28.570805 containerd[1918]: time="2025-09-16T05:06:28.570672559Z" level=warning msg="container event discarded" container=f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2 type=CONTAINER_CREATED_EVENT Sep 16 05:06:28.613289 containerd[1918]: time="2025-09-16T05:06:28.613161889Z" level=warning msg="container event discarded" container=f65e48564498a81cce582691e4caafd937c005f30e55d139ef1753c68db252c2 type=CONTAINER_STARTED_EVENT Sep 16 05:06:29.080372 containerd[1918]: time="2025-09-16T05:06:29.080244594Z" level=warning msg="container event discarded" container=9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5 type=CONTAINER_CREATED_EVENT Sep 16 05:06:29.127830 containerd[1918]: time="2025-09-16T05:06:29.127696733Z" level=warning msg="container event discarded" container=9c6c763cf6a00616dbb8139de7c597765cedbca9d7e7d7553a59f677c07c55e5 type=CONTAINER_STARTED_EVENT Sep 16 05:06:33.911151 containerd[1918]: time="2025-09-16T05:06:33.911055182Z" level=warning msg="container event discarded" container=8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694 type=CONTAINER_CREATED_EVENT Sep 16 05:06:33.956685 containerd[1918]: time="2025-09-16T05:06:33.956589476Z" level=warning msg="container event discarded" container=8968fc8b10f7695b2a97f39a72262c5c2fccad78b019991f15fd31b7490a4694 type=CONTAINER_STARTED_EVENT Sep 16 05:06:35.577979 containerd[1918]: time="2025-09-16T05:06:35.577953027Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"3923344b499894ba978dff5bdf5275b91938a295f2e0297ebce6df45051be1b5\" pid:7721 exited_at:{seconds:1757999195 nanos:577773701}" Sep 16 05:06:36.217326 containerd[1918]: time="2025-09-16T05:06:36.217281848Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"cba627bd8a24dee1df096e390ba581db32ba4b71a2ebfe732b4bbc6921d7a16a\" pid:7759 exited_at:{seconds:1757999196 nanos:217075579}" Sep 16 05:06:37.144344 containerd[1918]: time="2025-09-16T05:06:37.144176856Z" level=warning msg="container event discarded" container=0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3 type=CONTAINER_CREATED_EVENT Sep 16 05:06:37.188866 containerd[1918]: time="2025-09-16T05:06:37.188738857Z" level=warning msg="container event discarded" container=0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3 type=CONTAINER_STARTED_EVENT Sep 16 05:06:38.307626 containerd[1918]: time="2025-09-16T05:06:38.307569568Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"06dd145a84c9defdf9d739633d25192520da28afb4e84c6206a0eae21df18ba9\" pid:7805 exited_at:{seconds:1757999198 nanos:307427849}" Sep 16 05:06:38.325401 containerd[1918]: time="2025-09-16T05:06:38.325377159Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"a2588bb5b528e63d3c76108262f87c82ff698d227d3412ccdbcd14a56858ab99\" pid:7806 exited_at:{seconds:1757999198 nanos:325201219}" Sep 16 05:06:39.479872 containerd[1918]: time="2025-09-16T05:06:39.479848428Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"ccf4f64d58e401b14be471fdce1130b4abd4d6399dc50589e5df6579800086a6\" pid:7847 
exited_at:{seconds:1757999199 nanos:479743730}" Sep 16 05:06:40.696359 containerd[1918]: time="2025-09-16T05:06:40.696192204Z" level=warning msg="container event discarded" container=c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd type=CONTAINER_CREATED_EVENT Sep 16 05:06:40.744852 containerd[1918]: time="2025-09-16T05:06:40.744723007Z" level=warning msg="container event discarded" container=c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd type=CONTAINER_STARTED_EVENT Sep 16 05:06:43.794211 containerd[1918]: time="2025-09-16T05:06:43.794064288Z" level=warning msg="container event discarded" container=b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6 type=CONTAINER_CREATED_EVENT Sep 16 05:06:43.835522 containerd[1918]: time="2025-09-16T05:06:43.835395513Z" level=warning msg="container event discarded" container=b34304dfb4590783ac02a6ac4e8bb0b746912e89440cf87d3def56dda0f485e6 type=CONTAINER_STARTED_EVENT Sep 16 05:06:46.882434 containerd[1918]: time="2025-09-16T05:06:46.882262569Z" level=warning msg="container event discarded" container=60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4 type=CONTAINER_CREATED_EVENT Sep 16 05:06:46.921710 containerd[1918]: time="2025-09-16T05:06:46.921570032Z" level=warning msg="container event discarded" container=60491c43ec13acceab7b8ababe707cbdb9b84653f99eee5c0d15e4281fc5d4a4 type=CONTAINER_STARTED_EVENT Sep 16 05:06:57.974975 systemd[1]: Started sshd@9-139.178.94.33:22-139.178.89.65:51556.service - OpenSSH per-connection server daemon (139.178.89.65:51556). Sep 16 05:06:58.005104 sshd[7865]: Accepted publickey for core from 139.178.89.65 port 51556 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:06:58.005940 sshd-session[7865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:06:58.008882 systemd-logind[1906]: New session 12 of user core. Sep 16 05:06:58.019233 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 16 05:06:58.103053 sshd[7868]: Connection closed by 139.178.89.65 port 51556 Sep 16 05:06:58.103236 sshd-session[7865]: pam_unix(sshd:session): session closed for user core Sep 16 05:06:58.105049 systemd[1]: sshd@9-139.178.94.33:22-139.178.89.65:51556.service: Deactivated successfully. Sep 16 05:06:58.106055 systemd[1]: session-12.scope: Deactivated successfully. Sep 16 05:06:58.106790 systemd-logind[1906]: Session 12 logged out. Waiting for processes to exit. Sep 16 05:06:58.107416 systemd-logind[1906]: Removed session 12. Sep 16 05:07:03.128754 systemd[1]: Started sshd@10-139.178.94.33:22-139.178.89.65:55534.service - OpenSSH per-connection server daemon (139.178.89.65:55534). Sep 16 05:07:03.212206 sshd[7895]: Accepted publickey for core from 139.178.89.65 port 55534 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:03.213500 sshd-session[7895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:03.218813 systemd-logind[1906]: New session 13 of user core. Sep 16 05:07:03.232200 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 16 05:07:03.315639 sshd[7898]: Connection closed by 139.178.89.65 port 55534 Sep 16 05:07:03.315864 sshd-session[7895]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:03.317757 systemd[1]: sshd@10-139.178.94.33:22-139.178.89.65:55534.service: Deactivated successfully. Sep 16 05:07:03.318896 systemd[1]: session-13.scope: Deactivated successfully. 
Sep 16 05:07:03.319800 systemd-logind[1906]: Session 13 logged out. Waiting for processes to exit. Sep 16 05:07:03.320535 systemd-logind[1906]: Removed session 13. Sep 16 05:07:06.175878 containerd[1918]: time="2025-09-16T05:07:06.175846043Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"019a8076059f0091f8a8c0f3e7156cb9bfc3b1b9fbdd45f808411065f6b02e17\" pid:7937 exited_at:{seconds:1757999226 nanos:175543957}" Sep 16 05:07:08.324231 containerd[1918]: time="2025-09-16T05:07:08.324195548Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"51a0fcd322cb483c8f56b05fb3ee8ff4527de09fd4c3af5a412243f726615eef\" pid:7984 exited_at:{seconds:1757999228 nanos:324084410}" Sep 16 05:07:08.326745 systemd[1]: Started sshd@11-139.178.94.33:22-139.178.89.65:55550.service - OpenSSH per-connection server daemon (139.178.89.65:55550). Sep 16 05:07:08.343503 containerd[1918]: time="2025-09-16T05:07:08.343477236Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"4e33051ef149f6295e42302e0123bfda7c2b37aed66ed5485887238c642d96e6\" pid:7983 exited_at:{seconds:1757999228 nanos:343303931}" Sep 16 05:07:08.358502 sshd[8005]: Accepted publickey for core from 139.178.89.65 port 55550 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:08.359292 sshd-session[8005]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:08.362393 systemd-logind[1906]: New session 14 of user core. Sep 16 05:07:08.380305 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 16 05:07:08.466909 sshd[8021]: Connection closed by 139.178.89.65 port 55550 Sep 16 05:07:08.467147 sshd-session[8005]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:08.496767 systemd[1]: sshd@11-139.178.94.33:22-139.178.89.65:55550.service: Deactivated successfully. Sep 16 05:07:08.502117 systemd[1]: session-14.scope: Deactivated successfully. Sep 16 05:07:08.504899 systemd-logind[1906]: Session 14 logged out. Waiting for processes to exit. Sep 16 05:07:08.512745 systemd[1]: Started sshd@12-139.178.94.33:22-139.178.89.65:55556.service - OpenSSH per-connection server daemon (139.178.89.65:55556). Sep 16 05:07:08.514816 systemd-logind[1906]: Removed session 14. Sep 16 05:07:08.597216 sshd[8047]: Accepted publickey for core from 139.178.89.65 port 55556 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:08.598309 sshd-session[8047]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:08.603503 systemd-logind[1906]: New session 15 of user core. Sep 16 05:07:08.614289 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 16 05:07:08.774271 sshd[8050]: Connection closed by 139.178.89.65 port 55556 Sep 16 05:07:08.774515 sshd-session[8047]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:08.786390 systemd[1]: sshd@12-139.178.94.33:22-139.178.89.65:55556.service: Deactivated successfully. Sep 16 05:07:08.787858 systemd[1]: session-15.scope: Deactivated successfully. Sep 16 05:07:08.788402 systemd-logind[1906]: Session 15 logged out. Waiting for processes to exit. Sep 16 05:07:08.790156 systemd[1]: Started sshd@13-139.178.94.33:22-139.178.89.65:55560.service - OpenSSH per-connection server daemon (139.178.89.65:55560). 
Sep 16 05:07:08.790687 systemd-logind[1906]: Removed session 15. Sep 16 05:07:08.841968 sshd[8074]: Accepted publickey for core from 139.178.89.65 port 55560 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:08.843014 sshd-session[8074]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:08.847762 systemd-logind[1906]: New session 16 of user core. Sep 16 05:07:08.858304 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 16 05:07:08.999635 sshd[8077]: Connection closed by 139.178.89.65 port 55560 Sep 16 05:07:08.999953 sshd-session[8074]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:09.002882 systemd[1]: sshd@13-139.178.94.33:22-139.178.89.65:55560.service: Deactivated successfully. Sep 16 05:07:09.004499 systemd[1]: session-16.scope: Deactivated successfully. Sep 16 05:07:09.006169 systemd-logind[1906]: Session 16 logged out. Waiting for processes to exit. Sep 16 05:07:09.007253 systemd-logind[1906]: Removed session 16. Sep 16 05:07:10.048338 update_engine[1911]: I20250916 05:07:10.048234 1911 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 16 05:07:10.048338 update_engine[1911]: I20250916 05:07:10.048339 1911 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 16 05:07:10.049680 update_engine[1911]: I20250916 05:07:10.048846 1911 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 16 05:07:10.050291 update_engine[1911]: I20250916 05:07:10.050226 1911 omaha_request_params.cc:62] Current group set to developer Sep 16 05:07:10.050581 update_engine[1911]: I20250916 05:07:10.050518 1911 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 16 05:07:10.050581 update_engine[1911]: I20250916 05:07:10.050555 1911 update_attempter.cc:643] Scheduling an action processor start. Sep 16 05:07:10.050972 update_engine[1911]: I20250916 05:07:10.050608 1911 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 16 05:07:10.050972 update_engine[1911]: I20250916 05:07:10.050752 1911 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 16 05:07:10.051341 update_engine[1911]: I20250916 05:07:10.050998 1911 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 16 05:07:10.051341 update_engine[1911]: I20250916 05:07:10.051066 1911 omaha_request_action.cc:272] Request: Sep 16 05:07:10.051341 update_engine[1911]: Sep 16 05:07:10.051341 update_engine[1911]: Sep 16 05:07:10.051341 update_engine[1911]: Sep 16 05:07:10.051341 update_engine[1911]: Sep 16 05:07:10.051341 update_engine[1911]: Sep 16 05:07:10.051341 update_engine[1911]: Sep 16 05:07:10.051341 update_engine[1911]: Sep 16 05:07:10.051341 update_engine[1911]: Sep 16 05:07:10.051341 update_engine[1911]: I20250916 05:07:10.051105 1911 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:07:10.052869 locksmithd[1974]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 16 05:07:10.054020 update_engine[1911]: I20250916 05:07:10.054006 1911 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:07:10.054393 update_engine[1911]: I20250916 05:07:10.054378 1911 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 16 05:07:10.054743 update_engine[1911]: E20250916 05:07:10.054726 1911 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:07:10.054779 update_engine[1911]: I20250916 05:07:10.054770 1911 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 16 05:07:14.019989 systemd[1]: Started sshd@14-139.178.94.33:22-139.178.89.65:43302.service - OpenSSH per-connection server daemon (139.178.89.65:43302). Sep 16 05:07:14.049946 sshd[8109]: Accepted publickey for core from 139.178.89.65 port 43302 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:14.050704 sshd-session[8109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:14.053901 systemd-logind[1906]: New session 17 of user core. Sep 16 05:07:14.068240 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 16 05:07:14.156032 sshd[8112]: Connection closed by 139.178.89.65 port 43302 Sep 16 05:07:14.156264 sshd-session[8109]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:14.158193 systemd[1]: sshd@14-139.178.94.33:22-139.178.89.65:43302.service: Deactivated successfully. Sep 16 05:07:14.159272 systemd[1]: session-17.scope: Deactivated successfully. Sep 16 05:07:14.159992 systemd-logind[1906]: Session 17 logged out. Waiting for processes to exit. Sep 16 05:07:14.160663 systemd-logind[1906]: Removed session 17. Sep 16 05:07:19.169589 systemd[1]: Started sshd@15-139.178.94.33:22-139.178.89.65:43312.service - OpenSSH per-connection server daemon (139.178.89.65:43312). Sep 16 05:07:19.202481 sshd[8137]: Accepted publickey for core from 139.178.89.65 port 43312 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:19.203172 sshd-session[8137]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:19.206397 systemd-logind[1906]: New session 18 of user core. Sep 16 05:07:19.217209 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 16 05:07:19.306756 sshd[8140]: Connection closed by 139.178.89.65 port 43312 Sep 16 05:07:19.306954 sshd-session[8137]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:19.308842 systemd[1]: sshd@15-139.178.94.33:22-139.178.89.65:43312.service: Deactivated successfully. Sep 16 05:07:19.309865 systemd[1]: session-18.scope: Deactivated successfully. Sep 16 05:07:19.310659 systemd-logind[1906]: Session 18 logged out. Waiting for processes to exit. Sep 16 05:07:19.311509 systemd-logind[1906]: Removed session 18. Sep 16 05:07:20.047290 update_engine[1911]: I20250916 05:07:20.047147 1911 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:07:20.048170 update_engine[1911]: I20250916 05:07:20.047302 1911 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:07:20.048291 update_engine[1911]: I20250916 05:07:20.048190 1911 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 05:07:20.048585 update_engine[1911]: E20250916 05:07:20.048487 1911 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:07:20.048751 update_engine[1911]: I20250916 05:07:20.048668 1911 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 16 05:07:24.341340 systemd[1]: Started sshd@16-139.178.94.33:22-139.178.89.65:46838.service - OpenSSH per-connection server daemon (139.178.89.65:46838). 
Sep 16 05:07:24.383875 sshd[8165]: Accepted publickey for core from 139.178.89.65 port 46838 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:24.384597 sshd-session[8165]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:24.387865 systemd-logind[1906]: New session 19 of user core. Sep 16 05:07:24.408487 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 16 05:07:24.506350 sshd[8168]: Connection closed by 139.178.89.65 port 46838 Sep 16 05:07:24.506544 sshd-session[8165]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:24.508524 systemd[1]: sshd@16-139.178.94.33:22-139.178.89.65:46838.service: Deactivated successfully. Sep 16 05:07:24.509546 systemd[1]: session-19.scope: Deactivated successfully. Sep 16 05:07:24.510318 systemd-logind[1906]: Session 19 logged out. Waiting for processes to exit. Sep 16 05:07:24.510964 systemd-logind[1906]: Removed session 19. Sep 16 05:07:29.539535 systemd[1]: Started sshd@17-139.178.94.33:22-139.178.89.65:46844.service - OpenSSH per-connection server daemon (139.178.89.65:46844). Sep 16 05:07:29.627534 sshd[8209]: Accepted publickey for core from 139.178.89.65 port 46844 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:29.628764 sshd-session[8209]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:29.633696 systemd-logind[1906]: New session 20 of user core. Sep 16 05:07:29.651602 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 16 05:07:29.749024 sshd[8212]: Connection closed by 139.178.89.65 port 46844 Sep 16 05:07:29.749218 sshd-session[8209]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:29.762091 systemd[1]: sshd@17-139.178.94.33:22-139.178.89.65:46844.service: Deactivated successfully. Sep 16 05:07:29.763311 systemd[1]: session-20.scope: Deactivated successfully. Sep 16 05:07:29.763849 systemd-logind[1906]: Session 20 logged out. Waiting for processes to exit. Sep 16 05:07:29.765507 systemd[1]: Started sshd@18-139.178.94.33:22-139.178.89.65:46856.service - OpenSSH per-connection server daemon (139.178.89.65:46856). Sep 16 05:07:29.765877 systemd-logind[1906]: Removed session 20. Sep 16 05:07:29.795597 sshd[8237]: Accepted publickey for core from 139.178.89.65 port 46856 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:29.796331 sshd-session[8237]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:29.799464 systemd-logind[1906]: New session 21 of user core. Sep 16 05:07:29.812667 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 16 05:07:29.950374 sshd[8241]: Connection closed by 139.178.89.65 port 46856 Sep 16 05:07:29.950555 sshd-session[8237]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:29.978510 systemd[1]: sshd@18-139.178.94.33:22-139.178.89.65:46856.service: Deactivated successfully. Sep 16 05:07:29.979586 systemd[1]: session-21.scope: Deactivated successfully. Sep 16 05:07:29.980122 systemd-logind[1906]: Session 21 logged out. Waiting for processes to exit. Sep 16 05:07:29.981478 systemd[1]: Started sshd@19-139.178.94.33:22-139.178.89.65:42992.service - OpenSSH per-connection server daemon (139.178.89.65:42992). Sep 16 05:07:29.981957 systemd-logind[1906]: Removed session 21. 
Sep 16 05:07:30.013049 sshd[8264]: Accepted publickey for core from 139.178.89.65 port 42992 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:30.013920 sshd-session[8264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:30.021032 systemd-logind[1906]: New session 22 of user core. Sep 16 05:07:30.037250 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 16 05:07:30.047138 update_engine[1911]: I20250916 05:07:30.047056 1911 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:07:30.047138 update_engine[1911]: I20250916 05:07:30.047098 1911 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:07:30.047415 update_engine[1911]: I20250916 05:07:30.047305 1911 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 05:07:30.047602 update_engine[1911]: E20250916 05:07:30.047586 1911 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:07:30.047635 update_engine[1911]: I20250916 05:07:30.047622 1911 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 16 05:07:31.096543 sshd[8267]: Connection closed by 139.178.89.65 port 42992 Sep 16 05:07:31.096712 sshd-session[8264]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:31.108316 systemd[1]: sshd@19-139.178.94.33:22-139.178.89.65:42992.service: Deactivated successfully. Sep 16 05:07:31.109451 systemd[1]: session-22.scope: Deactivated successfully. Sep 16 05:07:31.109619 systemd[1]: session-22.scope: Consumed 468ms CPU time, 74M memory peak. Sep 16 05:07:31.109996 systemd-logind[1906]: Session 22 logged out. Waiting for processes to exit. Sep 16 05:07:31.111431 systemd[1]: Started sshd@20-139.178.94.33:22-139.178.89.65:43004.service - OpenSSH per-connection server daemon (139.178.89.65:43004). Sep 16 05:07:31.111838 systemd-logind[1906]: Removed session 22. Sep 16 05:07:31.143380 sshd[8296]: Accepted publickey for core from 139.178.89.65 port 43004 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:31.146679 sshd-session[8296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:31.159247 systemd-logind[1906]: New session 23 of user core. Sep 16 05:07:31.169450 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 16 05:07:31.334876 sshd[8300]: Connection closed by 139.178.89.65 port 43004 Sep 16 05:07:31.335112 sshd-session[8296]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:31.346562 systemd[1]: sshd@20-139.178.94.33:22-139.178.89.65:43004.service: Deactivated successfully. Sep 16 05:07:31.347613 systemd[1]: session-23.scope: Deactivated successfully. Sep 16 05:07:31.348196 systemd-logind[1906]: Session 23 logged out. Waiting for processes to exit. Sep 16 05:07:31.349403 systemd[1]: Started sshd@21-139.178.94.33:22-139.178.89.65:43006.service - OpenSSH per-connection server daemon (139.178.89.65:43006). Sep 16 05:07:31.350056 systemd-logind[1906]: Removed session 23. Sep 16 05:07:31.382789 sshd[8323]: Accepted publickey for core from 139.178.89.65 port 43006 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:31.386180 sshd-session[8323]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:31.399009 systemd-logind[1906]: New session 24 of user core. Sep 16 05:07:31.418468 systemd[1]: Started session-24.scope - Session 24 of User core. 
Sep 16 05:07:31.522429 sshd[8326]: Connection closed by 139.178.89.65 port 43006 Sep 16 05:07:31.522626 sshd-session[8323]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:31.524633 systemd[1]: sshd@21-139.178.94.33:22-139.178.89.65:43006.service: Deactivated successfully. Sep 16 05:07:31.525658 systemd[1]: session-24.scope: Deactivated successfully. Sep 16 05:07:31.526413 systemd-logind[1906]: Session 24 logged out. Waiting for processes to exit. Sep 16 05:07:31.527005 systemd-logind[1906]: Removed session 24. Sep 16 05:07:35.608752 containerd[1918]: time="2025-09-16T05:07:35.608700023Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"523d0748393463b93c4d8a5963e7265b03834b8b374dc4d715775880ce6551ed\" pid:8367 exited_at:{seconds:1757999255 nanos:608521877}" Sep 16 05:07:36.232901 containerd[1918]: time="2025-09-16T05:07:36.232873097Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8289cdc8bba241f9fb2450023accf4b43a4d224f81df089fbad5a3bf0be28247\" id:\"3477ba92609021e7bd467477b1dbfa74705f26f13c07d7afe97be8cf1088be16\" pid:8399 exited_at:{seconds:1757999256 nanos:232652901}" Sep 16 05:07:36.544660 systemd[1]: Started sshd@22-139.178.94.33:22-139.178.89.65:43022.service - OpenSSH per-connection server daemon (139.178.89.65:43022). Sep 16 05:07:36.591527 sshd[8425]: Accepted publickey for core from 139.178.89.65 port 43022 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:36.594791 sshd-session[8425]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:36.608496 systemd-logind[1906]: New session 25 of user core. Sep 16 05:07:36.620477 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 16 05:07:36.769713 sshd[8428]: Connection closed by 139.178.89.65 port 43022 Sep 16 05:07:36.769921 sshd-session[8425]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:36.771948 systemd[1]: sshd@22-139.178.94.33:22-139.178.89.65:43022.service: Deactivated successfully. Sep 16 05:07:36.772957 systemd[1]: session-25.scope: Deactivated successfully. Sep 16 05:07:36.773734 systemd-logind[1906]: Session 25 logged out. Waiting for processes to exit. Sep 16 05:07:36.774486 systemd-logind[1906]: Removed session 25. 
Sep 16 05:07:38.299142 containerd[1918]: time="2025-09-16T05:07:38.299120650Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"a4cb546586ff66b98e9ccc65943187d3f51a3fb0749e9ba7fc59ab87c967d178\" pid:8477 exited_at:{seconds:1757999258 nanos:299018170}" Sep 16 05:07:38.317018 containerd[1918]: time="2025-09-16T05:07:38.316993661Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c756fbf6918e9b8b81897c4c9a8b11bfbe415c1c62b7891cdd1c9637f409bbfd\" id:\"31ccdd6bdf2939c85da966d04050a5b892351a8348e98f68a4308cef571a11a2\" pid:8476 exited_at:{seconds:1757999258 nanos:316835653}" Sep 16 05:07:39.463720 containerd[1918]: time="2025-09-16T05:07:39.463696595Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ed448729dc16bb8ecba14c7a59f88a127108d26d645b4c601461b83231d78a3\" id:\"1d17c78ac4551b9d537abaf2e0adfc9138ad913221f6cafa53d42e5aee19edbd\" pid:8526 exited_at:{seconds:1757999259 nanos:463604749}" Sep 16 05:07:40.050312 update_engine[1911]: I20250916 05:07:40.050171 1911 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:07:40.050312 update_engine[1911]: I20250916 05:07:40.050309 1911 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:07:40.051344 update_engine[1911]: I20250916 05:07:40.051133 1911 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 05:07:40.051497 update_engine[1911]: E20250916 05:07:40.051437 1911 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:07:40.051630 update_engine[1911]: I20250916 05:07:40.051590 1911 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 16 05:07:40.051746 update_engine[1911]: I20250916 05:07:40.051623 1911 omaha_request_action.cc:617] Omaha request response: Sep 16 05:07:40.051851 update_engine[1911]: E20250916 05:07:40.051779 1911 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 16 05:07:40.051851 update_engine[1911]: I20250916 05:07:40.051826 1911 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 16 05:07:40.051851 update_engine[1911]: I20250916 05:07:40.051843 1911 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 05:07:40.052117 update_engine[1911]: I20250916 05:07:40.051859 1911 update_attempter.cc:306] Processing Done. Sep 16 05:07:40.052117 update_engine[1911]: E20250916 05:07:40.051890 1911 update_attempter.cc:619] Update failed. Sep 16 05:07:40.052117 update_engine[1911]: I20250916 05:07:40.051908 1911 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 16 05:07:40.052117 update_engine[1911]: I20250916 05:07:40.051923 1911 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 16 05:07:40.052117 update_engine[1911]: I20250916 05:07:40.051937 1911 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Sep 16 05:07:40.052474 update_engine[1911]: I20250916 05:07:40.052109 1911 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 16 05:07:40.052474 update_engine[1911]: I20250916 05:07:40.052175 1911 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 16 05:07:40.052474 update_engine[1911]: I20250916 05:07:40.052196 1911 omaha_request_action.cc:272] Request: Sep 16 05:07:40.052474 update_engine[1911]: Sep 16 05:07:40.052474 update_engine[1911]: Sep 16 05:07:40.052474 update_engine[1911]: Sep 16 05:07:40.052474 update_engine[1911]: Sep 16 05:07:40.052474 update_engine[1911]: Sep 16 05:07:40.052474 update_engine[1911]: Sep 16 05:07:40.052474 update_engine[1911]: I20250916 05:07:40.052213 1911 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:07:40.052474 update_engine[1911]: I20250916 05:07:40.052258 1911 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:07:40.053268 update_engine[1911]: I20250916 05:07:40.052932 1911 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 05:07:40.053358 locksmithd[1974]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 16 05:07:40.053943 update_engine[1911]: E20250916 05:07:40.053378 1911 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:07:40.053943 update_engine[1911]: I20250916 05:07:40.053530 1911 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 16 05:07:40.053943 update_engine[1911]: I20250916 05:07:40.053560 1911 omaha_request_action.cc:617] Omaha request response: Sep 16 05:07:40.053943 update_engine[1911]: I20250916 05:07:40.053581 1911 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 05:07:40.053943 update_engine[1911]: I20250916 05:07:40.053596 1911 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 05:07:40.053943 update_engine[1911]: I20250916 05:07:40.053610 1911 update_attempter.cc:306] Processing Done. Sep 16 05:07:40.053943 update_engine[1911]: I20250916 05:07:40.053627 1911 update_attempter.cc:310] Error event sent. Sep 16 05:07:40.053943 update_engine[1911]: I20250916 05:07:40.053648 1911 update_check_scheduler.cc:74] Next update check in 43m6s Sep 16 05:07:40.054542 locksmithd[1974]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 16 05:07:41.789458 systemd[1]: Started sshd@23-139.178.94.33:22-139.178.89.65:48510.service - OpenSSH per-connection server daemon (139.178.89.65:48510). Sep 16 05:07:41.822417 sshd[8539]: Accepted publickey for core from 139.178.89.65 port 48510 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:41.823119 sshd-session[8539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:41.826036 systemd-logind[1906]: New session 26 of user core. Sep 16 05:07:41.847257 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 16 05:07:41.977737 sshd[8542]: Connection closed by 139.178.89.65 port 48510 Sep 16 05:07:41.978023 sshd-session[8539]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:41.980659 systemd[1]: sshd@23-139.178.94.33:22-139.178.89.65:48510.service: Deactivated successfully. Sep 16 05:07:41.982103 systemd[1]: session-26.scope: Deactivated successfully. 
Sep 16 05:07:41.983093 systemd-logind[1906]: Session 26 logged out. Waiting for processes to exit. Sep 16 05:07:41.984046 systemd-logind[1906]: Removed session 26. Sep 16 05:07:46.995116 systemd[1]: Started sshd@24-139.178.94.33:22-139.178.89.65:48516.service - OpenSSH per-connection server daemon (139.178.89.65:48516). Sep 16 05:07:47.025870 sshd[8568]: Accepted publickey for core from 139.178.89.65 port 48516 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:07:47.029333 sshd-session[8568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:07:47.042421 systemd-logind[1906]: New session 27 of user core. Sep 16 05:07:47.062535 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 16 05:07:47.157906 sshd[8571]: Connection closed by 139.178.89.65 port 48516 Sep 16 05:07:47.158124 sshd-session[8568]: pam_unix(sshd:session): session closed for user core Sep 16 05:07:47.160277 systemd[1]: sshd@24-139.178.94.33:22-139.178.89.65:48516.service: Deactivated successfully. Sep 16 05:07:47.161444 systemd[1]: session-27.scope: Deactivated successfully. Sep 16 05:07:47.162266 systemd-logind[1906]: Session 27 logged out. Waiting for processes to exit. Sep 16 05:07:47.162929 systemd-logind[1906]: Removed session 27.